WorldWideScience

Sample records for a codes

  1. Code Generation = A* + BURS

    NARCIS (Netherlands)

    Nymeyer, Albert; Katoen, Joost P.; Westra, Ymte; Alblas, H.; Gyimóthy, Tibor

    1996-01-01

    A system called BURS, which is based on term rewrite systems, is combined with the search algorithm A* to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm uses a cost heuristic that is derived
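
    The search component can be illustrated in isolation. The following Python fragment is a minimal, hypothetical sketch of A* over an abstract state graph with an admissible cost heuristic; it is not the BURS-based code generator itself, and the neighbors and heuristic callbacks stand in for the term-rewrite machinery described in the paper.

        import heapq
        import itertools

        def a_star(start, is_goal, neighbors, heuristic):
            """Generic A*: neighbors(s) yields (next_state, step_cost) pairs,
            heuristic(s) must never overestimate the remaining cost."""
            counter = itertools.count()              # tie-breaker for the heap
            frontier = [(heuristic(start), next(counter), 0, start, [start])]
            best = {start: 0}
            while frontier:
                _, _, cost, state, path = heapq.heappop(frontier)
                if is_goal(state):
                    return cost, path                # optimal if the heuristic is admissible
                for nxt, step in neighbors(state):
                    new_cost = cost + step
                    if new_cost < best.get(nxt, float("inf")):
                        best[nxt] = new_cost
                        heapq.heappush(frontier, (new_cost + heuristic(nxt),
                                                  next(counter), new_cost, nxt, path + [nxt]))
            return None

        # Toy usage on a small weighted graph (illustrative only):
        graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 1)], "c": []}
        print(a_star("a", lambda s: s == "c", lambda s: graph[s], lambda s: 0))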

  2. A Message Without a Code?

    Directory of Open Access Journals (Sweden)

    Tom Conley

    1981-01-01

    The photographic paradox is said to be that of a message without a code, a communication lacking a relay or gap essential to the process of communication. Tracing the recurrence of Barthes's definition in the essays included in Image/Music/Text and in La Chambre claire, this paper argues that Barthes's definition is platonic in its will to dematerialize the troubling — graphic — immediacy of the photograph. He writes of the image in order to flee its signature. As a function of media, his categories are written in order to be insufficient and inadequate; to maintain an ineluctable difference between language heard and letters seen; to protect an idiom of loss which the photograph disallows. The article studies the strategies of his definition in «The Photographic Paradox» as instrument of abstraction, opposes the notion of code, in an aural sense, to audio-visual markers of closed relay in advertising, and critiques the layout and order of La Chambre claire in respect to Barthes's ideology of absence.

  3. Requirements of a Better Secure Program Coding

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2012-01-01

    Secure program coding refers to managing the risks posed by security breaches that originate in the program source code. The paper reviews the best practices that must be followed during the software development life cycle for secure software assurance, the methods and techniques used for secure coding assurance, the most common and well-known vulnerabilities caused by a poor coding process, and how the security risks are managed and mitigated. As a tool for better secure program coding, the code review process is presented, together with objective measures for code review assurance and estimation of the effort required for code improvement.

  4. A genetic scale of reading frame coding.

    Science.gov (United States)

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Furthermore, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code. Copyright © 2014 Elsevier
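
    As a concrete illustration of the comma-free property that yields the maximal RFC value of 1, the hypothetical Python sketch below tests whether a set of trinucleotides is comma-free, i.e. whether any concatenation of two codewords contains a frame-shifted trinucleotide that itself belongs to the code; the statistical parameter PrRFC defined in the paper is not reproduced here.

        def is_comma_free(code):
            """A trinucleotide code X is comma-free if, for every pair of codewords
            u and v, the two frame-shifted trinucleotides inside the concatenation
            uv are not themselves in X."""
            X = set(code)
            for u in X:
                for v in X:
                    w = u + v                        # the 6-letter word uv
                    if w[1:4] in X or w[2:5] in X:   # the two shifted reading frames
                        return False
            return True

        # Toy examples (small made-up codes, not the code X of the paper):
        print(is_comma_free({"AAC", "GTC"}))   # True
        print(is_comma_free({"AAA"}))          # False: AAAAAA reads AAA in shifted frames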

  5. HADES, A Radiographic Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M.B.; Slone, D.M.; Schach von Wittenau, A.E.

    2000-08-18

    We describe features of the HADES radiographic simulation code. We begin with a discussion of why it is useful to simulate transmission radiography. The capabilities of HADES are described, followed by an application of HADES to a dynamic experiment recently performed at the Los Alamos Neutron Science Center. We describe quantitative comparisons between experimental data and HADES simulations using a copper step wedge. We conclude with a short discussion of future work planned for HADES.

  6. A Connection between Network Coding and Convolutional Codes

    OpenAIRE

    Fragouli, C.; Soljanin, E.

    2004-01-01

    The min-cut, max-flow theorem states that a source node can send a commodity through a network to a sink node at the rate determined by the flow of the min-cut separating the source and the sink. Recently it has been shown that by linear re-encoding at nodes in communications networks, the min-cut rate can also be achieved in multicasting to several sinks. In this paper we discuss connections between such coding schemes and convolutional codes. We propose a method to simplify the convolutional...

  7. A case for a code of ethics.

    Science.gov (United States)

    Bayliss, P

    1994-03-01

    Ethical dilemmas in business and health have become a familiar topic over recent times. Doubts remain, however, as to whether a code should be produced and the recently issued IHSM consultation paper argues the case for "a statement of primary values" rather than a code of ethics. In a second article on the subject, Paul Bayliss examines the importance of having a code, looks at some of the contextual issues and suggests an approach to producing one.

  8. A Mobile Application Prototype using Network Coding

    DEFF Research Database (Denmark)

    Pedersen, Morten Videbæk; Heide, Janus; Fitzek, Frank

    2010-01-01

    This paper looks into implementation details of network coding for a mobile application running on commercial mobile phones. We describe the necessary coding operations and the algorithms that implement them. The coding algorithms form the basis for an implementation in C++ and Symbian C++. We report...... on practical measurement results of coding throughput and energy consumption for a single-source multiple-sinks network, with and without recoding at the sinks. These results confirm that network coding is practical even on computationally weak platforms, and that network coding potentially can be used...

  9. The chromatin regulatory code: Beyond a histone code

    Science.gov (United States)

    Lesne, A.

    2006-03-01

    In this commentary on the contribution by Arndt Benecke in this issue, I discuss why the notion of “chromatin code” introduced and elaborated in this paper is to be preferred to that of “histone code”. Speaking of a code as regards nucleosome conformation and histone tail post-translational modifications only makes sense within the chromatin fiber, where their physico-chemical features can be translated into regulatory programs at the genome level, by means of a complex, multi-level interplay with the fiber architecture and dynamics settled in the course of Evolution. In particular, this chromatin code presumably exploits allosteric transitions of the chromatin fiber. The chromatin structure dependence of its translation suggests two alternative modes of transcription initiation regulation, also proposed in the paper by A. Benecke in this issue for interpreting strikingly bimodal micro-array data.

  10. A class of Sudan-decodable codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2000-01-01

    In this article, Sudan's algorithm is modified into an efficient method to list-decode a class of codes which can be seen as a generalization of Reed-Solomon codes. The algorithm is specialized into a very efficient method for unique decoding. The code construction can be generalized based...

  11. Secrecy Gain: a Wiretap Lattice Code Design

    OpenAIRE

    Belfiore, Jean-Claude; Oggier, Frédérique

    2010-01-01

    We propose the notion of secrecy gain as a code design criterion for wiretap lattice codes to be used over an additive white Gaussian noise channel. Our analysis relies on the error probabilities of both the legitimate user and the eavesdropper. We focus on geometrical properties of lattices, described by their theta series, to characterize good wiretap codes.

  12. A (72, 36; 15) box code

    Science.gov (United States)

    Solomon, G.

    1993-01-01

    A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.

  13. New convolutional code constructions and a class of asymptotically good time-varying codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1973-01-01

    We show that the generator polynomials of certain cyclic codes define noncatastrophic fixed convolutional codes whose free distances are lowerbounded by the minimum distances of the cyclic codes. This result is used to construct convolutional codes with free distance equal to the constraint length...... and to derive convolutional codes with good free distances from the BCH codes. Finally, a class of time-varying codes is constructed for which the free distance increases linearly with the constraint length....

  14. Schrödinger's code-script: not a genetic cipher but a code of development.

    Science.gov (United States)

    Walsby, A E; Hodge, M J S

    2017-06-01

    In his book What is Life? Erwin Schrödinger coined the term 'code-script', thought by some to be the first published suggestion of a hereditary code and perhaps a forerunner of the genetic code. The etymology of 'code' suggests three meanings relevant to 'code-script', which we distinguish as 'cipher-code', 'word-code' and 'rule-code'. Cipher-codes and word-codes entail translation of one set of characters into another. The genetic code comprises not one but two cipher-codes: the first is the DNA 'base-pairing cipher'; the second is the 'nucleotide-amino-acid cipher', which involves the translation of DNA base sequences into amino-acid sequences. We suggest that Schrödinger's code-script is a form of 'rule-code', a set of rules that, like the 'highway code' or 'penal code', requires no translation of a message. Schrödinger first relates his code-script to chromosomal genes made of protein. Ignorant of its properties, however, he later abandons 'protein' and adopts in its place a hypothetical, isomeric 'aperiodic solid' whose atoms he imagines rearranged in countless different conformations, which together are responsible for the patterns of ontogenetic development. In an attempt to explain the large number of combinations required, Schrödinger referred to the Morse code (a cipher) but in doing so unwittingly misled readers into believing that he intended a cipher-code resembling the genetic code. We argue that the modern equivalent of Schrödinger's code-script is a rule-code of organismal development based largely on the synthesis, folding, properties and interactions of numerous proteins, each performing a specific task. Copyright © 2016. Published by Elsevier Ltd.

  15. Efficiency of a model human image code

    Science.gov (United States)

    Watson, Andrew B.

    1987-01-01

    Hypothetical schemes for neural representation of visual information can be expressed as explicit image codes. Here, a code modeled on the simple cells of the primate striate cortex is explored. The Cortex transform maps a digital image into a set of subimages (layers) that are bandpass in spatial frequency and orientation. The layers are sampled so as to minimize the number of samples and still avoid aliasing. Samples are quantized in a manner that exploits the bandpass contrast-masking properties of human vision. The entropy of the samples is computed to provide a lower bound on the code size. Finally, the image is reconstructed from the code. Psychophysical methods are derived for comparing the original and reconstructed images to evaluate the sufficiency of the code. When each resolution is coded at the threshold for detection of artifacts, the image-code size is about 1 bit/pixel.
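
    The entropy step of such a scheme is easy to make concrete. The hypothetical numpy sketch below uniformly quantizes an array of transform samples and computes the first-order Shannon entropy in bits per sample, which serves as a lower bound on code size in the spirit of the abstract; the Cortex transform itself is not implemented here and the Laplacian test data are an assumption.

        import numpy as np

        def entropy_bits_per_sample(samples, step):
            """Uniformly quantize samples with step size `step` and return the
            first-order entropy of the quantized values in bits per sample."""
            q = np.round(np.asarray(samples, dtype=float) / step).astype(int)
            _, counts = np.unique(q, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        # Illustrative band-pass-like samples (heavy-tailed, zero mean):
        rng = np.random.default_rng(0)
        layer = rng.laplace(scale=4.0, size=10_000)
        print(entropy_bits_per_sample(layer, step=1.0))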

  16. Blue code: Is it a real emergency?

    Science.gov (United States)

    Eroglu, Serkan E; Onur, Ozge; Urgan, Oğuz; Denizbasi, Arzu; Akoglu, Haldun

    2014-01-01

    Cardiac arrests in hospital areas are common, and hospitals have rapid response teams or "blue code teams" to reduce preventable in-hospital deaths. Education about the rapid response team has been provided in all hospitals in Turkey, but true "blue code" activation is rare, and it is abused by medical personnel in practice. This study aimed to determine the cases of wrong blue codes and the reasons for misuse. This retrospective study analyzed the blue code reports issued by our hospital between January 1 and June 1, 2012. A total of 89 "blue code" activations were recorded in 5 months. A "blue code" was defined as any patient with an unexpected cardiac or respiratory arrest requiring resuscitation and activation of a hospital alert. Adhering to this definition, each physician classified the collected activation forms as either a true or a wrong code. Patient data were then entered into a database (Microsoft Excel 2007), which was pooled for analysis. The data were analyzed using frequencies and the chi-square test in SPSS v16.0. The patients were diagnosed with cardiopulmonary arrest (8), change in mental status (18), presyncope (11), chest pain (12), conversive disorder (18), and worry of the staff for the patient (22). Code activation was done by physicians in 76% of the patients; the most common reason for a blue code was concern of the staff for the patient. The findings of this study show that more research is needed to establish the overall effectiveness and optimal implementation of blue code teams.
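
    For the frequency analysis mentioned above, a chi-square goodness-of-fit test on the six activation categories can be sketched in a few lines. The fragment below uses the category counts reported in the abstract together with scipy rather than the original SPSS workflow, and the uniform expected distribution is an illustrative assumption, not necessarily the authors' null hypothesis.

        from scipy.stats import chisquare

        # Blue code activations by reported reason (counts from the abstract):
        counts = [8, 18, 11, 12, 18, 22]   # arrest, mental status, presyncope,
                                           # chest pain, conversive disorder, staff concern
        assert sum(counts) == 89           # total activations in the 5-month period

        # Goodness-of-fit against a uniform distribution over the six categories.
        stat, p_value = chisquare(counts)
        print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")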

  17. Hierarchical Parallel Evaluation of a Hamming Code

    Directory of Open Access Journals (Sweden)

    Shmuel T. Klein

    2017-04-01

    The Hamming code is a well-known error correction code that can correct a single error in an input vector of size n bits by adding log n parity checks. A new parallel implementation of the code is presented, using a hierarchical structure of n processors in log n layers. All the processors perform similar simple tasks and need only a few bytes of internal memory.
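
    The single-error correction that the processors evaluate in parallel can be sketched sequentially in a few lines. The hypothetical Python fragment below encodes 4 data bits into a Hamming(7,4) codeword and corrects one flipped bit by recomputing the parity checks (the syndrome); the hierarchical n-processor layout of the paper is not modelled here.

        def hamming74_encode(d):
            """Encode data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword.
            Positions 1, 2 and 4 hold parity; positions 3, 5, 6 and 7 hold data."""
            d1, d2, d3, d4 = d
            p1 = d1 ^ d2 ^ d4           # covers positions 1, 3, 5, 7
            p2 = d1 ^ d3 ^ d4           # covers positions 2, 3, 6, 7
            p3 = d2 ^ d3 ^ d4           # covers positions 4, 5, 6, 7
            return [p1, p2, d1, p3, d2, d3, d4]

        def hamming74_correct(c):
            """Recompute the parity checks; the syndrome is the 1-based position
            of a single flipped bit, or 0 if no error is detected."""
            c = list(c)
            s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
            s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
            s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
            syndrome = s1 + 2 * s2 + 4 * s3
            if syndrome:
                c[syndrome - 1] ^= 1    # flip the erroneous bit
            return c

        codeword = hamming74_encode([1, 0, 1, 1])
        corrupted = list(codeword)
        corrupted[5] ^= 1               # inject a single-bit error
        assert hamming74_correct(corrupted) == codeword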

  18. P-code versus C/A-code GPS for range tracking applications

    Science.gov (United States)

    Hoefener, Carl E.; van Wechel, Bob

    This article compares the use of P-code and C/A-code GPS receivers on test and training ranges. The requirements on many ranges for operation under conditions of jamming preclude the use of C/A-code receivers because of their relatively low jamming immunity as compared with P-code receivers. Also, C/A-code receivers present some problems when used with pseudolites on ranges. The cost of P-code receivers is customarily much higher than that of C/A-code receivers. However, most of this difference is caused by factors other than P-code, particularly the parts screening specifications applied to military programs.

  19. Towards a testbed for malicious code detection

    Energy Technology Data Exchange (ETDEWEB)

    Lo, R.; Kerchen, P.; Crawford, R.; Ho, W.; Crossley, J.; Fink, G.; Levitt, K.; Olsson, R.; Archer, M. (California Univ., Davis, CA (USA). Div. of Computer Science)

    1991-01-01

    This paper proposes an environment for detecting many types of malicious code, including computer viruses, Trojan horses, and time/logic bombs. This malicious code testbed (MCT) is based upon both static and dynamic analysis tools developed at the University of California, Davis, which have been shown to be effective against certain types of malicious code. The testbed extends the usefulness of these tools by using them in a complementary fashion to detect more general cases of malicious code. Perhaps more importantly, the MCT allows administrators and security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict. 5 refs., 2 figs., 2 tabs.

  20. A Distributed Quaternary Turbo Coded Cooperative Scheme

    Directory of Open Access Journals (Sweden)

    BALDINI FILHO, R.

    2014-12-01

    Cooperative communications achieve MIMO-like diversity gains by introducing a relay that creates an independent faded path between the source and the destination. Coded cooperation integrates cooperation with channel coding in order to improve the bit error rate (BER) performance of cooperative communications. Turbo codewords can be built efficiently at the destination using encoded portions of the information sent by the source and the relay. This paper presents a distributed turbo cooperative coding scheme that utilizes convolutional codes defined over the finite ring of integers Z4 and that performs better than its equivalent binary counterparts.

  1. A Survey of Linear Network Coding and Network Error Correction Code Constructions and Algorithms

    Directory of Open Access Journals (Sweden)

    Michele Sanna

    2011-01-01

    Network coding was introduced by Ahlswede et al. in a pioneering work in 2000. This paradigm encompasses coding and retransmission of messages at the intermediate nodes of the network. In contrast with traditional store-and-forward networking, network coding increases the throughput and the robustness of the transmission. Linear network coding is a practical implementation of this new paradigm covered by several research works that include rate characterization, error-protection coding, and construction of codes. Determining the coding characteristics is especially important, since it provides the premise for an efficient transmission. In this paper, we review the recent breakthroughs in linear network coding for acyclic networks with a survey of the code construction literature. Deterministic construction algorithms and randomized procedures are presented for traditional network coding and for network-control network coding.
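
    To make the re-encoding step concrete, the hypothetical sketch below performs random linear network coding over GF(2): a node forwards random XOR combinations of its source packets together with their coefficient vectors, and a receiver recovers the originals by Gaussian elimination once it has collected enough linearly independent combinations. Practical systems usually work over larger fields such as GF(2^8); GF(2) is used here only to keep the sketch short.

        import random

        def rlnc_encode(packets, n_coded):
            """Emit n_coded random GF(2) combinations of the source packets.
            Each coded packet carries its coefficient vector (the coding header)."""
            k, length = len(packets), len(packets[0])
            coded = []
            for _ in range(n_coded):
                coeffs = [random.randint(0, 1) for _ in range(k)]
                payload = [0] * length
                for c, p in zip(coeffs, packets):
                    if c:
                        payload = [a ^ b for a, b in zip(payload, p)]
                coded.append((coeffs, payload))
            return coded

        def rlnc_decode(coded, k):
            """Gauss-Jordan elimination over GF(2); returns the k source packets
            or None if the received combinations do not have full rank."""
            rows = [list(c) + list(p) for c, p in coded]
            for col in range(k):
                pivot = next((r for r in range(col, len(rows)) if rows[r][col]), None)
                if pivot is None:
                    return None
                rows[col], rows[pivot] = rows[pivot], rows[col]
                for r in range(len(rows)):
                    if r != col and rows[r][col]:
                        rows[r] = [a ^ b for a, b in zip(rows[r], rows[col])]
            return [rows[i][k:] for i in range(k)]

        src = [[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
        rx = rlnc_encode(src, n_coded=6)       # redundancy against losses
        out = rlnc_decode(rx, k=3)
        print("recovered" if out == src else "rank deficient, collect more packets")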

  2. Code Calibration as a Decision Problem

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1993-01-01

    Calibration of partial coefficients for a class of structures where no code exists is considered. The partial coefficients are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibra...
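
    The underlying optimization can be illustrated with a deliberately simplified reliability model. In the hypothetical sketch below a single partial coefficient gamma scales the characteristic load in the design equation, the reliability index of each structure follows from a Gaussian resistance/load model, and gamma is chosen to minimize the squared deviation from a target index; the numbers and the model are illustrative assumptions, not the authors' formulation.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Illustrative class of structures: (characteristic load, load c.o.v., resistance c.o.v.)
        structures = [(1.0, 0.15, 0.10), (1.5, 0.20, 0.10), (2.0, 0.25, 0.12)]
        beta_target = 3.8                       # assumed target reliability index

        def beta(gamma, s_char, v_s, v_r):
            """Reliability index of a Gaussian margin R - S when the design rule
            sets the mean resistance to gamma times the characteristic load."""
            mu_r, mu_s = gamma * s_char, s_char
            sigma = np.hypot(v_r * mu_r, v_s * mu_s)
            return (mu_r - mu_s) / sigma

        def objective(gamma):
            return sum((beta(gamma, *s) - beta_target) ** 2 for s in structures)

        res = minimize_scalar(objective, bounds=(1.0, 5.0), method="bounded")
        print(f"calibrated partial coefficient: {res.x:.3f}")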

  3. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  4. Source Code Plagiarism--A Student Perspective

    Science.gov (United States)

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  5. A Simple Decoder for Topological Codes

    Directory of Open Access Journals (Sweden)

    James Wootton

    2015-04-01

    Here we study an efficient algorithm for decoding topological codes. It is a simple form of HDRG decoder, which could be straightforwardly generalized to complex decoding problems. Specific results are obtained for the planar code with both i.i.d. and spatially correlated errors. The method is shown to compare well with existing ones, despite its simplicity.

  6. The Nuremberg Code-A critique

    Directory of Open Access Journals (Sweden)

    Ravindra B Ghooi

    2011-01-01

    The Nuremberg Code drafted at the end of the Doctor's trial in Nuremberg 1947 has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of ten principles in Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside; since unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.

  7. The Nuremberg Code-A critique.

    Science.gov (United States)

    Ghooi, Ravindra B

    2011-04-01

    The Nuremberg Code drafted at the end of the Doctor's trial in Nuremberg 1947 has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of ten principles in Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside; since unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.

  8. A thesaurus for a neural population code.

    Science.gov (United States)

    Ganmor, Elad; Segev, Ronen; Schneidman, Elad

    2015-09-08

    Information is carried in the brain by the joint spiking patterns of large groups of noisy, unreliable neurons. This noise limits the capacity of the neural code and determines how information can be transmitted and read-out. To accurately decode, the brain must overcome this noise and identify which patterns are semantically similar. We use models of network encoding noise to learn a thesaurus for populations of neurons in the vertebrate retina responding to artificial and natural videos, measuring the similarity between population responses to visual stimuli based on the information they carry. This thesaurus reveals that the code is organized in clusters of synonymous activity patterns that are similar in meaning but may differ considerably in their structure. This organization is highly reminiscent of the design of engineered codes. We suggest that the brain may use this structure and show how it allows accurate decoding of novel stimuli from novel spiking patterns.

  9. Bonsai: A GPU Tree-Code

    Science.gov (United States)

    Bédorf, J.; Gaburov, E.; Portegies Zwart, S.

    2012-07-01

    We present a gravitational hierarchical N-body code that is designed to run efficiently on Graphics Processing Units (GPUs). All parts of the algorithm are executed on the GPU, which eliminates the need for data transfer between the Central Processing Unit (CPU) and the GPU. Our tests indicate that the gravitational tree-code outperforms tuned CPU code for all parts of the algorithm and shows an overall performance improvement of more than a factor of 20, resulting in a processing rate of more than 2.8 million particles per second.

  10. Report on a workshop concerning code validation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    The design of wind turbine components is becoming more critical as turbines become lighter and more dynamically active. Computer codes that will reliably predict turbine dynamic response are, therefore, more necessary than before. However, predicting the dynamic response of very slender rotating structures that operate in turbulent winds is not a simple matter. Even so, codes for this purpose have been developed and tested in North America and in Europe, and it is important to disseminate information on this subject. The purpose of this workshop was to allow those involved in the wind energy industry in the US to assess the progress in validation of the codes most commonly used for structural/aero-elastic wind turbine simulation. The theme of the workshop was, "How do we know it's right?" This was the question that participants were encouraged to ask themselves throughout the meeting in order to avoid the temptation of presenting information in a less-than-critical atmosphere. Other questions posed at the meeting are: What is the proof that the codes used can truthfully represent the field data? At what steps were the codes tested against known solutions, or against reliable field data? How should the designer or user validate results? What computer resources are needed? How do codes being used in Europe compare with those used in the US? How does the code used affect industry certification? What can be expected in the future?

  11. EMPIRE: A code for nuclear astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Palumbo, A. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2013-12-11

    The nuclear reaction code EMPIRE is presented as a useful tool for nuclear astrophysics. EMPIRE combines a variety of reaction models with a comprehensive library of input parameters, providing a diversity of options for the user. With the exception of direct-semidirect capture, all reaction mechanisms relevant to the nuclear astrophysics energy range of interest are implemented in the code. Comparison to experimental data shows consistent agreement for all relevant channels.

  12. The Construction and Performance of a Novel Intergroup Complementary Code

    Directory of Open Access Journals (Sweden)

    Huang Wenzhun

    2013-09-01

    On the basis of analyses of the intergroup complementary (IGC) code and the zero correlation zone complementary code, a novel IGC code is proposed to suit M-ary orthogonal code spread spectrum systems and quasi-synchronous CDMA systems. The definition and construction methods of the new IGC codes are presented and an applied example is given in this paper. Theoretical research and simulation results show the main advantages of the novel IGC code: the number of code sets of the novel IGC code is larger than that of the IGC code for the same code length; the zero correlation zone length is longer than that of the intragroup IGC code but shorter than that of the intergroup IGC code; and, for the same code length, the auto-correlation performance of the novel IGC code is better than that of the IGC code, while both have similar cross-correlation performance.
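
    The zero correlation zone property itself can be checked numerically. The hypothetical sketch below computes the periodic cross-correlation of a pair of ±1 sequences and reports the width of the zone of zero correlation around zero shift; the actual IGC construction of the paper is not reproduced, and the example sequences are arbitrary.

        import numpy as np

        def periodic_correlation(a, b):
            """Periodic (cyclic) cross-correlation for all shifts tau = 0 .. N-1."""
            a, b = np.asarray(a), np.asarray(b)
            return np.array([np.dot(a, np.roll(b, tau)) for tau in range(len(a))])

        def zero_correlation_zone(a, b):
            """Largest Z such that the correlation vanishes for all shifts
            1 <= |tau| <= Z (use a == b for the auto-correlation case)."""
            c = periodic_correlation(a, b)
            n, z = len(c), 0
            while z + 1 <= n // 2 and c[z + 1] == 0 and c[n - (z + 1)] == 0:
                z += 1
            return z

        # Arbitrary illustrative +/-1 sequences:
        s1 = [1, 1, 1, -1, 1, -1, -1, -1]
        s2 = [1, -1, 1, 1, 1, 1, -1, 1]
        print(zero_correlation_zone(s1, s1), zero_correlation_zone(s1, s2))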

  13. ARMANDO, a SPH code for CERN

    CERN Document Server

    Massidda, L

    2008-01-01

    Smoothed Particle Hydrodynamics may be a useful numerical tool for the simulation of particle beam interaction with liquid targets and obstacles. The ARMANDO code is a state-of-the-art SPH code, interfaced with FLUKA and capable of solving these problems. This report presents the basic theoretical elements behind the method, describes the most important aspects of the implementation and shows some simple examples.

  14. Code Parallelization with CAPO: A User Manual

    Science.gov (United States)

    Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. This is an interactive toolkit to transform a serial Fortran application code to an equivalent parallel version of the software - in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using the in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is then presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs as well as the good performance achievable on a large number of processors. The second part of the document gives references to the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.

  15. A Network Coding Approach to Loss Tomography

    DEFF Research Database (Denmark)

    Sattari, Pegah; Markopoulou, Athina; Fragouli, Christina

    2013-01-01

    multicast and/or unicast end-to-end probes. Independently, recent advances in network coding have shown that there are several advantages from allowing intermediate nodes to process and combine, in addition to just forward, packets. In this paper, we pose the problem of loss tomography in networks that have...... network coding capabilities. We design a framework for estimating link loss rates, which leverages network coding capabilities and we show that it improves several aspects of tomography, including the identifiability of links, the tradeoff between estimation accuracy and bandwidth efficiency...... and multiple paths between sources and receivers. This work was the first to make the connection between active network tomography and network coding, and thus opened a new research direction....

  16. A stromal address code defined by fibroblasts.

    Science.gov (United States)

    Parsonage, Greg; Filer, Andrew D; Haworth, Oliver; Nash, Gerard B; Rainger, G Ed; Salmon, Michael; Buckley, Christopher D

    2005-03-01

    To navigate into and within tissues, leukocytes require guidance cues that enable them to recognize which tissues to enter and which to avoid. Such cues are partly provided at the time of extravasation from blood by an endothelial address code on the luminal surface of the vascular endothelium. Here, we review the evidence that fibroblasts help define an additional stromal address code that directs leukocyte behaviour within tissues. We examine how this stromal code regulates site-specific leukocyte accumulation, differentiation and survival in a variety of physiological stromal niches, and how the aberrant expression of components of this code in the wrong tissue at the wrong time contributes to the persistence of chronic inflammatory diseases.

  17. Blockchain technology as a regulatory technology: From code is law to law is code

    OpenAIRE

    De Filippi, Primavera; Hassan, Samer

    2016-01-01

    “Code is law” refers to the idea that, with the advent of digital technology, code has progressively established itself as the predominant way to regulate the behavior of Internet users. Yet, while computer code can enforce rules more efficiently than legal code, it also comes with a series of limitations, mostly because it is difficult to transpose the ambiguity and flexibility of legal rules into a formalized language which can be interpreted by a machine. With the advent of blockchain tech...

  18. Code-Mixing and Code-Switching of Indonesian Celebrities: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Nana Yuliana

    2015-05-01

    Foreign language skills give rise to language varieties known as code-mixing and code-switching. The purpose of this study was to gather information to identify the types of code-mixing and code-switching frequently used by Indonesian celebrities. The subjects were divided into two groups: Group I comprised celebrities with native-speaker parents, and Group II comprised celebrities capable of speaking two or more languages. Qualitative and quantitative methods were used to analyze the code-mixing and code-switching and their frequency. It can be concluded that Group II uses code-mixing and code-switching with a different frequency and speaks foreign languages more actively.

  19. A concatenation scheme of LDPC codes and source codes for flash memories

    Science.gov (United States)

    Huang, Qin; Pan, Song; Zhang, Mu; Wang, Zulin

    2012-12-01

    Recently, low-density parity-check (LDPC) codes have been applied in flash memories to correct errors. However, as verified in this article, their performance degrades rapidly as the number of stuck cells increases. Thus, this paper presents a concatenated reliability scheme of LDPC codes and source codes, which aims to improve the performance of LDPC codes for flash memories with stuck cells. In this scheme, the locations of stuck cells are recorded by source codes in the write process, so that erasures, rather than wrong log-likelihood ratios, are given on these cells in the read process. Then, the LDPC codes correct these erasures and the soft errors caused by cell-to-cell interference. Analyses of the channel capacity and of the compression rates of source codes with side information show that the memory cost of the proposed scheme is moderately low. Simulation results verify that the proposed scheme outperforms the traditional scheme with only LDPC codes.
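
    The read-side bookkeeping can be sketched in a few lines: positions recorded as stuck during the write are handed to the soft decoder as erasures (log-likelihood ratio 0) instead of confidently wrong values. The fragment below is a hypothetical illustration of that step only; it implements neither the LDPC decoder nor the source code that compresses the stuck-cell map.

        import numpy as np

        def build_llrs(hard_reads, stuck_positions, llr_magnitude=4.0):
            """Map hard bit reads to LLRs for a soft-input decoder, forcing the
            recorded stuck-cell positions to LLR = 0 (erasures).  Positive LLR
            means bit 0 is more likely under the convention used here."""
            hard_reads = np.asarray(hard_reads)
            llr = np.where(hard_reads == 0, llr_magnitude, -llr_magnitude).astype(float)
            llr[list(stuck_positions)] = 0.0     # erasures instead of wrong LLRs
            return llr

        reads = [0, 1, 1, 0, 1, 0, 0, 1]
        stuck = {2, 5}                           # positions recorded at write time
        print(build_llrs(reads, stuck))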

  20. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
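
    The central numerical step, Gibbs free-energy minimization under elemental-abundance constraints, can be illustrated with a small toy system. The sketch below is a hypothetical two-species example (atomic and molecular hydrogen) solved with a generic scipy optimizer rather than TEA's own Lagrangian iteration, and the free-energy coefficients are made-up numbers chosen only to show the structure of the problem.

        import numpy as np
        from scipy.optimize import minimize

        g_over_RT = np.array([10.0, -5.0])      # assumed dimensionless free energies of H, H2
        stoich = np.array([[1.0, 2.0]])         # H atoms contributed by each species
        b = np.array([1.0])                     # total elemental H abundance
        pressure = 1.0                          # illustrative pressure in bar

        def gibbs(n):
            """Total dimensionless Gibbs energy of an ideal-gas mixture with moles n."""
            n_tot = n.sum()
            return float(np.sum(n * (g_over_RT + np.log(n * pressure / n_tot))))

        constraints = [{"type": "eq", "fun": lambda n: stoich @ n - b}]
        bounds = [(1e-12, None)] * len(g_over_RT)
        x0 = np.array([0.4, 0.3])               # feasible start: 0.4 + 2 * 0.3 = 1
        res = minimize(gibbs, x0, method="SLSQP", bounds=bounds, constraints=constraints)
        print(dict(zip(["H", "H2"], res.x)))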

  1. Theory of information and coding. A mathematical framework for communication

    Energy Technology Data Exchange (ETDEWEB)

    McEliece, R.J.

    1977-01-01

    This book is meant to be a self-contained introduction to the basic results in the theory of information and coding. The introduction gives an overview of the whole subject. Chapters in Part I (Information Theory) deal with entropy and mutual information, discrete memoryless channels and their capacity-cost functions, discrete memoryless sources and their rate-distortion functions, the Gaussian channel and source, the source-channel coding theorem, and advanced topics (the channel coding theorem, the source coding theorem). The chapters in Part II (Coding Theory) discuss linear codes; BCH, Goppa, and related codes; convolutional codes; variable-length source coding; and advanced topics (block codes, convolutional codes, a comparison of block and convolutional codes, source codes). 86 figures, 9 tables, 50 references. (RWR)

  2. DUNE - a granular flow code

    Energy Technology Data Exchange (ETDEWEB)

    Slone, D M; Cottom, T L; Bateson, W B

    2004-11-23

    DUNE was designed to accurately model the spectrum of granular flow. Granular flow encompasses the motions of discrete particles. The particles are macroscopic in that there is no Brownian motion. The flow can be thought of as a dispersed phase (the particles) interacting with a fluid phase (air or water). Validation of the physical models proceeds in tandem with simple experimental confirmation. The current development team is working toward the goal of building a flexible architecture in which existing technologies can easily be integrated to further the capability of the simulation. We describe the DUNE architecture in some detail using physics models appropriate for an imploding liner experiment.

  3. Code Blue: a family matter?

    Science.gov (United States)

    Goforth, Rhonda

    2013-01-01

    The focus of this article is to encourage nurses and other healthcare staff to allow family members to be present during a resuscitation event. The author offers rationale, history, and simple guidelines for supporting families during this excruciating experience.

  4. A Relation Between Quasi-Cyclic Codes and 2-D Cyclic Codes

    OpenAIRE

    Güneri, Cem; Özbudak, Ferruh

    2011-01-01

    We consider a q-ary quasi-cyclic code C of length mℓ and index ℓ, where both m and ℓ are relatively prime to q. If the constituents of C are cyclic codes, we show that C can also be viewed as a 2-D cyclic code of size m × ℓ over Fq. If we further assume that m and ℓ are also coprime to each other, then we easily observe that the code C must be equivalent to a cyclic code. The last fact was proved earlier by Lim using a different approach.

  5. Erasure Coded Storage on a Changing Network

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Venkat, Narayan; Oran, David

    2016-01-01

    a fixed repair mechanism or are constrained in the choice of repair strategies, therefore in theory benefit less from being network aware. We propose a general mechanism that explores the space of possible repairs and examine how much different types of erasure codes benefit by being network aware. We...... show significant gains for three erasure codes using both theoretical modeling and simulation results. We also consider the practical applicability of our proposed mechanism by limiting the search space to repairs that have the potential to be minimal cost and present a case study for RLNC, a class...

  6. Should managers have a code of conduct?

    Science.gov (United States)

    Bayliss, P

    1994-02-01

    Much attention is currently being given to values and ethics in the NHS. Issues of accountability are being explored as a consequence of the Cadbury report. The Institute of Health Services Management (IHSM) is considering whether managers should have a code of ethics. Central to this issue is what managers themselves think; the application of such a code may well stand or fall by whether managers are prepared to have ownership of it, and are prepared to make it work. Paul Bayliss reports on a survey of managers' views.

  7. Water cycle algorithm: A detailed standard code

    Science.gov (United States)

    Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    Inspired by the observation of the water cycle process and movements of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA) has recently been proposed. Lately, an increasing number of WCA applications have appeared and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, of which the performance and efficiency has been demonstrated for solving optimization problems. The WCA has an interesting and simple concept and this paper aims to use its source code to provide a step-by-step explanation of the process it follows.
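
    As a complement to the full standard code supplied with the paper, the fragment below is a heavily simplified, hypothetical sketch of the water cycle idea for minimizing a function: the best solution acts as the sea, the next-best solutions act as rivers, the remaining streams move toward their assigned river, rivers move toward the sea, and a river that comes very close to the sea is evaporated and re-initialized (raining). The parameter choices and simplifications are assumptions; the paper's own source code is the reference implementation.

        import numpy as np

        def simplified_wca(f, dim, lb, ub, n_pop=30, n_rivers=4, d_max=1e-5,
                           iters=500, seed=0):
            rng = np.random.default_rng(seed)
            pop = rng.uniform(lb, ub, size=(n_pop, dim))
            for _ in range(iters):
                cost = np.apply_along_axis(f, 1, pop)
                pop = pop[np.argsort(cost)]            # row 0 = sea, rows 1..n_rivers = rivers
                for i in range(n_rivers + 1, n_pop):   # streams flow toward their river
                    river = 1 + (i % n_rivers)
                    pop[i] += rng.random(dim) * 2.0 * (pop[river] - pop[i])
                for i in range(1, n_rivers + 1):       # rivers flow toward the sea
                    pop[i] += rng.random(dim) * 2.0 * (pop[0] - pop[i])
                    if np.linalg.norm(pop[0] - pop[i]) < d_max:   # evaporation condition
                        pop[i] = rng.uniform(lb, ub, size=dim)    # raining: re-initialize
                pop = np.clip(pop, lb, ub)
            cost = np.apply_along_axis(f, 1, pop)
            best = int(np.argmin(cost))
            return pop[best], float(cost[best])

        sphere = lambda x: float(np.sum(x ** 2))       # toy objective
        x_best, f_best = simplified_wca(sphere, dim=5, lb=-10.0, ub=10.0)
        print(x_best, f_best)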

  8. The politics of a European civil code

    NARCIS (Netherlands)

    Hesselink, M.W.

    2004-01-01

    Last year the European Commission published its Action Plan on European contract law. That plan forms an important step towards a European Civil Code. In its Plan, the Commission tries to depoliticise the codification process by asking a group of academic experts to prepare what it calls a 'common

  9. A Code of Ethics for Democratic Leadership

    Science.gov (United States)

    Molina, Ricardo; Klinker, JoAnn Franklin

    2012-01-01

    Democratic leadership rests on sacred values, awareness, judgement, motivation and courage. Four turning points in a 38-year school administrator's career revealed decision-making in problematic moments stemmed from values in a personal and professional code of ethics. Reflection on practice and theory added vocabulary and understanding to make…

  10. Coding Military Command as a Promiscuous Practice

    DEFF Research Database (Denmark)

    Ashcraft, Karen Lee; Muhr, Sara Louise

    2018-01-01

    metaphor as a consequential practice of leadership unto itself. Drawing on queer theory, the article develops a mode of analysis, called ‘promiscuous coding’, conducive to disrupting the gender divisions that currently anchor most leadership metaphors. Promiscuous coding can assist leadership scholars...

  11. Iterative Decoding of Concatenated Codes: A Tutorial

    Directory of Open Access Journals (Sweden)

    Phillip A. Regalia

    2005-05-01

    The turbo decoding algorithm of a decade ago constituted a milestone in error-correction coding for digital communications, and has inspired extensions to generalized receiver topologies, including turbo equalization, turbo synchronization, and turbo CDMA, among others. Despite an accrued understanding of iterative decoding over the years, the “turbo principle” remains elusive to master analytically, thereby inciting interest from researchers outside the communications domain. In this spirit, we develop a tutorial presentation of iterative decoding for parallel and serial concatenated codes, in terms hopefully accessible to a broader audience. We motivate iterative decoding as a computationally tractable attempt to approach maximum-likelihood decoding, and characterize fixed points in terms of a “consensus” property between constituent decoders. We review how the decoding algorithm for both parallel and serial concatenated codes coincides with an alternating projection algorithm, which allows one to identify conditions under which the algorithm indeed converges to a maximum-likelihood solution, in terms of particular likelihood functions factoring into the product of their marginals. The presentation emphasizes a common framework applicable to both parallel and serial concatenated codes.

  12. A Code Blue Answer to Training

    Science.gov (United States)

    Huneycutt, Richy; Callahan, Barbara; Welch, Alexis

    2008-01-01

    Code Blue addresses the capacity challenges in healthcare training. This pilot, grant funded project, focuses on a holistic approach to selecting and educating career ready and capable students and training them to be confident and competent healthcare workers. Lessons learned from this project will be assessed and reviewed for replication.

  13. FLUKA: A Multi-Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  14. CHEETAH: A next generation thermochemical code

    Energy Technology Data Exchange (ETDEWEB)

    Fried, L.; Souers, P.

    1994-11-01

    CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0. We have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. We find that CHEETAH solves a wider variety of problems with no user intervention (e.g. no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. CHEETAH will make the use of thermochemical codes more attractive to practical explosive formulators. We have also made an extensive effort to improve over the results of TIGER. CHEETAH's version of the BKW equation of state (BKWC) is able to accurately reproduce energies from cylinder tests; something that other BKW parameter sets have been unable to do. Calculations performed with BKWC execute very quickly; typical run times are under 10 seconds on a workstation. In the future we plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code that will predict reaction zone thickness. Further ease of use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.

  15. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.

  16. TOCAR: a code to interface FOURACES - CARNAVAL

    Energy Technology Data Exchange (ETDEWEB)

    Panini, G.C.; Vaccari, M.

    1981-08-01

    The TOCAR code, written in FORTRAN-IV for IBM-370 computers, is an interface between the output of the FOURACES code and the CARNAVAL binary format for the multigroup neutron cross-sections, scattering matrices and related quantities. Besides a description of the code and how to use it, the report contains the code listing.

  17. When are Erasure Correcting Block Codes Better than Convolutional Codes in a Multi-hop Network?

    DEFF Research Database (Denmark)

    Hansen, Jonas; Østergaard, Jan; Kudahl, Johnny

    2017-01-01

    In this paper we investigate the effect of imposing a maximum allowed delay on the symbol loss probability for a set of rate 1/2 erasure correcting codes. Given some maximum allowable delay, we define the effective symbol loss probability to be the probability that a symbol is received too late...... block codes, and systematic convolutional codes. For a wide range of packet loss probabilities and allowable symbol delays, our results show that the systematic triangular block codes are superior. Our results also show that the field size does not affect the gain in effective symbol loss probability....

  18. Proposal for a minimal surface code experiment

    Science.gov (United States)

    Wootton, James R.; Peter, Andreas; Winkler, János R.; Loss, Daniel

    2017-09-01

    Current quantum technology is approaching the system sizes and fidelities required for quantum error correction. It is therefore important to determine exactly what is needed for proof-of-principle experiments, which will be a major step towards fault-tolerant quantum computation. Here we propose a surface code based experiment that is the smallest, both in terms of code size and circuit depth, that would allow errors to be detected and corrected for both the X and Z bases of a qubit. This requires 17 physical qubits initially prepared in a product state, on which 16 two-qubit entangling gates are applied before a final measurement of all qubits. A platform agnostic error model is applied to give some idea of the noise levels required for success. It is found that a true demonstration of quantum error correction will require fidelities for the preparation and measurement of qubits and the entangling gates to be above 99%.

  19. Code White: A Signed Code Protection Mechanism for Smartphones

    Science.gov (United States)

    2010-09-01


  20. CAFE: A New Relativistic MHD Code

    Science.gov (United States)

    Lora-Clavijo, F. D.; Cruz-Osorio, A.; Guzmán, F. S.

    2015-06-01

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin-Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin-Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.

  1. CAFE: A NEW RELATIVISTIC MHD CODE

    Energy Technology Data Exchange (ETDEWEB)

    Lora-Clavijo, F. D.; Cruz-Osorio, A. [Instituto de Astronomía, Universidad Nacional Autónoma de México, AP 70-264, Distrito Federal 04510, México (Mexico); Guzmán, F. S., E-mail: fdlora@astro.unam.mx, E-mail: aosorio@astro.unam.mx, E-mail: guzman@ifm.umich.mx [Instituto de Física y Matemáticas, Universidad Michoacana de San Nicolás de Hidalgo. Edificio C-3, Cd. Universitaria, 58040 Morelia, Michoacán, México (Mexico)

    2015-06-22

    We introduce CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in three dimensions. We present the standard tests for an RMHD code and for the relativistic hydrodynamics regime because we have not reported them before. The tests include the one-dimensional Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the two-dimensional (2D) and 3D tests without magnetic field, we include the 2D Riemann problem, a one-dimensional shock tube along a diagonal, the high-speed Emery wind tunnel, the Kelvin–Helmholtz (KH) instability, a set of jets, and a 3D spherical blast wave, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, a case of Kelvin–Helmholtz instability, and a 3D magnetic field advection loop. The code uses high-resolution shock-capturing methods, and we present the error analysis for a combination that uses the Harten, Lax, van Leer, and Einfeldt (HLLE) flux formula combined with a linear, piecewise parabolic method and fifth-order weighted essentially nonoscillatory reconstructors. We use the flux-constrained transport and the divergence cleaning methods to control the divergence-free magnetic field constraint.

  2. OSCAR a Matlab based optical FFT code

    Science.gov (United States)

    Degallaix, Jérôme

    2010-05-01

    Optical simulation software is an essential tool for designing and commissioning laser interferometers. This article aims to introduce OSCAR, a Matlab based FFT code, to the experimentalist community. OSCAR (Optical Simulation Containing Ansys Results) is used to simulate the steady state electric fields in optical cavities with realistic mirrors. The main advantage of OSCAR over other similar packages is the simplicity of its code, which requires only a short time to master. As a result, even for a beginner, it is relatively easy to modify OSCAR to suit other specific purposes. OSCAR includes an extensive manual and numerous detailed examples such as simulating thermal aberration, calculating cavity eigenmodes and diffraction loss, simulating flat beam cavities and three mirror ring cavities. An example is also provided about how to run OSCAR on the GPU of modern graphic cards instead of the CPU, making the simulation up to 20 times faster.
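
    The FFT machinery at the heart of such codes is the angular-spectrum propagation of a sampled field. The hypothetical Python sketch below propagates a Gaussian beam over a fixed distance with numpy FFTs; it illustrates the numerical principle only and is unrelated to OSCAR's actual Matlab implementation or its cavity iteration.

        import numpy as np

        def angular_spectrum_propagate(field, dx, wavelength, distance):
            """Propagate a sampled 2-D complex field by `distance` using the
            angular spectrum (FFT) method on a grid with spacing `dx`."""
            n = field.shape[0]
            fx = np.fft.fftfreq(n, d=dx)
            fxx, fyy = np.meshgrid(fx, fx)
            k = 2 * np.pi / wavelength
            kz = np.sqrt((k**2 - (2 * np.pi * fxx)**2 - (2 * np.pi * fyy)**2).astype(complex))
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * distance))

        # Illustrative Gaussian beam on a 256 x 256 grid with 0.1 mm sampling:
        n, dx = 256, 1e-4
        x = (np.arange(n) - n / 2) * dx
        xx, yy = np.meshgrid(x, x)
        beam = np.exp(-(xx**2 + yy**2) / (2e-3) ** 2)          # 2 mm beam waist
        out = angular_spectrum_propagate(beam, dx, wavelength=1064e-9, distance=10.0)
        print(float(np.abs(out).max()))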

  3. CAFE: A New Relativistic MHD Code

    CERN Document Server

    Lora-Clavijo, F D; Guzman, F S

    2014-01-01

    We present CAFE, a new independent code designed to solve the equations of relativistic ideal magnetohydrodynamics (RMHD) in 3D. We present the standard tests for an RMHD code and for the relativistic hydrodynamics (RHD) regime, since we have not reported them before. The tests include the 1D Riemann problems related to blast waves, head-on collisions of streams, and states with transverse velocities, with and without magnetic field, which is aligned or transverse, constant or discontinuous across the initial discontinuity. Among the 2D tests without magnetic field, we include the 2D Riemann problem, the high-speed Emery wind tunnel, the Kelvin-Helmholtz instability test, and a set of jets, whereas in the presence of a magnetic field we show the magnetic rotor, the cylindrical explosion, and the Kelvin-Helmholtz instability. The code uses high-resolution shock-capturing methods, and as a standard setup we present the error analysis with a simple combination that uses the HLLE flux formula combined with linear, PPM ...

  4. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology, and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries, which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows the direction of an X-ray beam to be deviated, which can considerably increase the implementation costs. Hence, this paper describes a low-cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of the block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the peak signal-to-noise ratio (PSNR). Results show that the performance of the block-unblock phase coded aperture approximation decreases by at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is at most 2.5 dB of PSNR lower than that of the phase coded aperture reconstructions.
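
    Since the comparison above is reported in terms of PSNR, the short sketch below shows the standard definition of that metric; the random test images are placeholders, and this is not the authors' simulation pipeline.

    import numpy as np

    def psnr(reference, reconstruction, peak=1.0):
        """Peak signal-to-noise ratio in dB between two same-shaped images."""
        mse = np.mean((np.asarray(reference, float) - np.asarray(reconstruction, float)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

    # Example: a reconstruction that deviates slightly from a random reference image.
    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    rec = np.clip(ref + 0.01 * rng.standard_normal(ref.shape), 0.0, 1.0)
    print(f"PSNR: {psnr(ref, rec):.1f} dB")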

  5. A mean field theory of coded CDMA systems

    Energy Technology Data Exchange (ETDEWEB)

    Yano, Toru [Graduate School of Science and Technology, Keio University, Hiyoshi, Kohoku-ku, Yokohama-shi, Kanagawa 223-8522 (Japan); Tanaka, Toshiyuki [Graduate School of Informatics, Kyoto University, Yoshida Hon-machi, Sakyo-ku, Kyoto-shi, Kyoto 606-8501 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)], E-mail: yano@thx.appi.keio.ac.jp

    2008-08-15

    We present a mean field theory of code-division multiple-access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean-field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.

  6. FCG: a code generator for lazy functional languages

    NARCIS (Netherlands)

    Kastens, U.; Langendoen, K.G.; Hartel, Pieter H.; Pfahler, P.

    1992-01-01

    The FCG code generator produces portable code that supports efficient two-space copying garbage collection. The code generator transforms the output of the FAST compiler front end into an abstract machine code. This code explicitly uses a call stack, which is accessible to the garbage collector. In

  7. A novel orientation code for face recognition

    Science.gov (United States)

    Zheng, Yufeng

    2011-06-01

    A novel orientation code is proposed for face recognition applications in this paper. Gabor wavelet transform is a common tool for orientation analysis in a 2D image; whereas Hamming distance is an efficient distance measurement for multiple classifications such as face identification. Specifically, at each frequency band, an index number representing the strongest orientational response is selected, and then encoded in binary format to favor the Hamming distance calculation. Multiple-band orientation codes are then organized into a face pattern byte (FPB) by using order statistics. With the FPB, Hamming distances are calculated and compared to achieve face identification. The FPB has the dimensionality of 8 bits per pixel and its performance will be compared to that of FPW (face pattern word, 32 bits per pixel). The dimensionality of FPB can be further reduced down to 4 bits per pixel, called face pattern nibble (FPN). Experimental results with visible and thermal face databases show that the proposed orientation code for face recognition is very promising in contrast with classical methods such as PCA.
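
    To make the encoding step concrete, the sketch below packs, for each pixel, the index of the strongest orientation response in each band into a single byte and compares two such code images with a Hamming distance. It is a simplified reading of the scheme described above (the order-statistics step of the actual FPB is omitted), and random arrays stand in for Gabor filter responses.

    import numpy as np

    def orientation_code(responses, bits_per_band=2):
        """Pack per-pixel indices of the strongest orientation of each band into one byte.

        responses: array of shape (bands, orientations, H, W) of filter magnitudes.
        """
        bands, orientations, H, W = responses.shape
        assert orientations <= 2 ** bits_per_band and bands * bits_per_band <= 8
        code = np.zeros((H, W), dtype=np.uint8)
        for b in range(bands):
            idx = np.argmax(responses[b], axis=0).astype(np.uint8)  # strongest orientation
            code |= (idx << (b * bits_per_band)).astype(np.uint8)   # place in its bit field
        return code

    def hamming_distance(code_a, code_b):
        """Total number of differing bits between two code images."""
        return int(np.unpackbits(code_a ^ code_b).sum())

    # Example: 4 bands x 4 orientations of random "responses" give an 8-bit code per pixel.
    rng = np.random.default_rng(1)
    a = orientation_code(rng.random((4, 4, 32, 32)))
    b = orientation_code(rng.random((4, 4, 32, 32)))
    print(hamming_distance(a, b))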

  8. A Construction of MDS Quantum Convolutional Codes

    Science.gov (United States)

    Zhang, Guanghui; Chen, Bocong; Li, Liangchen

    2015-09-01

    In this paper, two new families of MDS quantum convolutional codes are constructed. The first one can be regarded as a generalization of [36, Theorem 6.5], in the sense that we do not assume that q ≡ 1 (mod 4). More specifically, we obtain two classes of MDS quantum convolutional codes with parameters: (i) [(q^2+1, q^2-4i+3, 1; 2, 2i+2)]_q, where q ≥ 5 is an odd prime power and 2 ≤ i ≤ (q-1)/2; (ii) , where q is an odd prime power of the form q = 10m+3 or 10m+7 (m ≥ 2), and 2 ≤ i ≤ 2m-1.

  9. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  10. A Contribution Towards A Grammar of Code

    Directory of Open Access Journals (Sweden)

    David M. Berry

    2008-01-01

    Full Text Available Over the past thirty years there has been an increasing interest in the social and cultural implications of digital technologies and ‘informationalism’ from the social sciences and humanities. Generally this has concentrated on the implications of the “convergence” of digital devices and services, understood as linked to the discrete processing capabilities of computers, which rely on logical operations, binary processing and symbolic representation. In this paper, I wish to suggest that a ‘grammar of code’ might provide a useful way of thinking about the way in which digital technologies operate as a medium and can contribute usefully to this wider debate. I am interested in the way in which the dynamic properties of code can be understood as operating according to a grammar reflected in its materialisation and operation in the lifeworld – the discretisation of the phenomenal world. As part of that contribution in this paper I develop some tentative Weberian ‘ideal-types’. These ideal-types are then applied to the work of the Japanese composer, Masahiro Miwa, whose innovative ‘Reverse-Simulation music’ models the operation of basic low-level digital circuitry for the performance and generation of musical pieces.

  11. Improved decoding for a concatenated coding system

    DEFF Research Database (Denmark)

    Paaske, Erik

    1990-01-01

    The concatenated coding system recommended by CCSDS (Consultative Committee for Space Data Systems) uses an outer (255,223) Reed-Solomon (RS) code based on 8-bit symbols, followed by the block interleaver and an inner rate 1/2 convolutional code with memory 6. Viterbi decoding is assumed. Two new

  12. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    Science.gov (United States)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with the analysis of a Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) system. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. It is perceived that the number of users and the type of codes used in the optical system directly determine the performance of the system. MAI can be restricted by efficient design of optical codes and by implementing them with a unique architecture to accommodate a larger number of users. Hence, it is necessary to design a technique like the spectral direct detection (SDD) technique with a modified double weight code, which can provide better cardinality and good correlation properties.

  13. Interface requirements for coupling a containment code to a reactor system thermal hydraulic codes

    Energy Technology Data Exchange (ETDEWEB)

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but also the containment response must be accounted for. Such transients and accidents as a loss-of-coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long-term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.
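
    The time-step coupling alternative mentioned at the end of this record can be pictured with the schematic loop below. The two step functions and all numbers are hypothetical stand-ins, not the RELAP/RETRAN or CONTAIN/CONTEMPT interfaces; the point is only that data cross the interface every step, so containment back-pressure feeds back into the primary-system solution.

    def primary_system_step(t, dt, containment_pressure):
        """Advance the primary system one step; return break mass flow and enthalpy."""
        # A real code would solve the primary-system thermal hydraulics here.
        mass_flow = max(0.0, 50.0 - 0.5 * t) * (1.0 - containment_pressure / 1.0e6)
        enthalpy = 2.0e6  # J/kg, assumed constant for the sketch
        return mass_flow, enthalpy

    def containment_step(dt, mass_flow, enthalpy, pressure):
        """Advance the containment one step; return the updated back-pressure (Pa)."""
        # A real code would solve containment thermal hydraulics; here the pressure
        # simply rises with the energy carried by the break flow.
        return pressure + mass_flow * enthalpy * dt / 5.0e4

    t, dt, t_end = 0.0, 0.1, 10.0
    p_containment = 1.0e5  # Pa, initial containment pressure

    while t < t_end:
        # Data cross the interface every time step, so the rising back-pressure is
        # felt by the primary system instead of being a fixed boundary condition.
        m_dot, h = primary_system_step(t, dt, p_containment)
        p_containment = containment_step(dt, m_dot, h, p_containment)
        t += dt

    print(f"containment pressure after {t_end:.0f} s: {p_containment:.3e} Pa")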

  14. On a Mathematical Theory of Coded Exposure

    Science.gov (United States)

    2014-08-01

    ... Hence, it follows from (61) and the Fubini theorem (see, e.g., [44, p. 196]) that ... We prove a theorem that gives closed formulae for the MSE and SNR of coded exposure cameras (see Theorem 3.8, page 23). To the best of our knowledge, the ... If the mean photon emission µ doubles, then the SNR is multiplied by a factor of √2. (And we retrieve the fundamental theorem of photography.) Note that if we have no

  15. A surface code quantum computer in silicon

    Science.gov (United States)

    Hill, Charles D.; Peretz, Eldad; Hile, Samuel J.; House, Matthew G.; Fuechsle, Martin; Rogge, Sven; Simmons, Michelle Y.; Hollenberg, Lloyd C. L.

    2015-01-01

    The exceptionally long quantum coherence times of phosphorus donor nuclear spin qubits in silicon, coupled with the proven scalability of silicon-based nano-electronics, make them attractive candidates for large-scale quantum computing. However, the high threshold of topological quantum error correction can only be captured in a two-dimensional array of qubits operating synchronously and in parallel—posing formidable fabrication and control challenges. We present an architecture that addresses these problems through a novel shared-control paradigm that is particularly suited to the natural uniformity of the phosphorus donor nuclear spin qubit states and electronic confinement. The architecture comprises a two-dimensional lattice of donor qubits sandwiched between two vertically separated control layers forming a mutually perpendicular crisscross gate array. Shared-control lines facilitate loading/unloading of single electrons to specific donors, thereby activating multiple qubits in parallel across the array on which the required operations for surface code quantum error correction are carried out by global spin control. The complexities of independent qubit control, wave function engineering, and ad hoc quantum interconnects are explicitly avoided. With many of the basic elements of fabrication and control based on demonstrated techniques and with simulated quantum operation below the surface code error threshold, the architecture represents a new pathway for large-scale quantum information processing in silicon and potentially in other qubit systems where uniformity can be exploited. PMID:26601310

  16. A surface code quantum computer in silicon.

    Science.gov (United States)

    Hill, Charles D; Peretz, Eldad; Hile, Samuel J; House, Matthew G; Fuechsle, Martin; Rogge, Sven; Simmons, Michelle Y; Hollenberg, Lloyd C L

    2015-10-01

    The exceptionally long quantum coherence times of phosphorus donor nuclear spin qubits in silicon, coupled with the proven scalability of silicon-based nano-electronics, make them attractive candidates for large-scale quantum computing. However, the high threshold of topological quantum error correction can only be captured in a two-dimensional array of qubits operating synchronously and in parallel, posing formidable fabrication and control challenges. We present an architecture that addresses these problems through a novel shared-control paradigm that is particularly suited to the natural uniformity of the phosphorus donor nuclear spin qubit states and electronic confinement. The architecture comprises a two-dimensional lattice of donor qubits sandwiched between two vertically separated control layers forming a mutually perpendicular crisscross gate array. Shared-control lines facilitate loading/unloading of single electrons to specific donors, thereby activating multiple qubits in parallel across the array on which the required operations for surface code quantum error correction are carried out by global spin control. The complexities of independent qubit control, wave function engineering, and ad hoc quantum interconnects are explicitly avoided. With many of the basic elements of fabrication and control based on demonstrated techniques and with simulated quantum operation below the surface code error threshold, the architecture represents a new pathway for large-scale quantum information processing in silicon and potentially in other qubit systems where uniformity can be exploited.

  17. A construction of quantum turbo product codes based on CSS-type quantum convolutional codes

    Science.gov (United States)

    Xiao, Hailin; Ni, Ju; Xie, Wu; Ouyang, Shan

    As in classical coding theory, turbo product codes (TPCs), formed by serially concatenating block codes, can approach the Shannon capacity limit and have low decoding complexity. However, special requirements in the quantum setting severely limit the structures of quantum turbo product codes (QTPCs). To design a good structure for QTPCs, we present a new construction of QTPCs with the interleaved serial concatenation of CSS(L1,L2)-type quantum convolutional codes (QCCs). First, CSS(L1,L2)-type QCCs are proposed by exploiting the theory of CSS-type quantum stabilizer codes and QCCs, and the description and analysis of the encoder circuit are greatly simplified in terms of Hadamard gates and C-NOT gates. Second, the interleaved coding matrix of QTPCs is derived from the definition of the quantum permutation SWAP gate. Finally, we prove the corresponding relation between the minimum Hamming distance of QTPCs and that of the associated classical TPCs, and describe the state diagrams of the QTPC encoder and decoder, which have a highly regular structure and a simple design.

  18. A Construction of Lossy Source Code Using LDPC Matrices

    Science.gov (United States)

    Miyake, Shigeki; Muramatsu, Jun

    Research into applying LDPC code theory, which is used for channel coding, to source coding has received a lot of attention in several research fields such as distributed source coding. In this paper, a source coding problem with a fidelity criterion is considered. Matsunaga et al. and Martinian et al. constructed a lossy code under the conditions of a binary alphabet, a uniform distribution, and a Hamming measure as the fidelity criterion. We extend their results and construct a lossy code under the extended conditions of a binary alphabet, a distribution that is not necessarily uniform, and a fidelity measure that is bounded and additive, and we show that the code can achieve the optimal rate, the rate-distortion function. By applying a formula for the random walk on a lattice to the analysis of LDPC matrices on Zq, where q is a prime number, we show that results similar to those for the binary alphabet condition hold for Zq, the multiple alphabet condition.
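
    For the special case the record mentions first (a binary alphabet with Hamming distortion), the target such a lossy code must achieve is the classical rate-distortion function R(D) = h(p) - h(D); the sketch below evaluates it. This is a textbook formula included here for orientation, not the paper's construction.

    import numpy as np

    def binary_entropy(x):
        """Binary entropy h(x) in bits, with h(0) = h(1) = 0."""
        x = np.clip(x, 1e-12, 1 - 1e-12)
        return float(-(x * np.log2(x) + (1 - x) * np.log2(1 - x)))

    def rate_distortion_binary(p, D):
        """R(D) = h(p) - h(D) for a Bernoulli(p) source under Hamming distortion."""
        if D >= min(p, 1 - p):
            return 0.0
        return binary_entropy(p) - binary_entropy(D)

    # Example: a biased binary source (p = 0.2) at average distortion D = 0.05
    # needs about 0.44 bits per source symbol.
    print(rate_distortion_binary(0.2, 0.05))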

  19. A Fast and Efficient Topological Coding Algorithm for Compound Images

    Directory of Open Access Journals (Sweden)

    Xin Li

    2003-11-01

    Full Text Available We present a fast and efficient coding algorithm for compound images. Unlike popular mixed raster content (MRC)-based approaches, we propose to attack the compound image coding problem from the perspective of modeling the location uncertainty of image singularities. We suggest that a computationally simple two-class segmentation strategy is sufficient for the coding of compound images. We argue that jointly exploiting topological properties of the image source in the classification and coding stages is beneficial to the robustness of compound image coding systems. Experimental results have justified the effectiveness and robustness of the proposed topological coding algorithm.

  20. A graph model for opportunistic network coding

    KAUST Repository

    Sorour, Sameh

    2015-08-12

    © 2015 IEEE. Recent advancements in the graph-based analysis and solution of instantly decodable network coding (IDNC) trigger interest in extending them to more complicated opportunistic network coding (ONC) scenarios with a limited increase in complexity. In this paper, we design a simple IDNC-like graph model for a specific subclass of ONC by introducing a more generalized definition of its vertices and the notion of vertex aggregation in order to represent the storage of non-instantly-decodable packets in ONC. Based on this representation, we determine the set of pairwise vertex adjacency conditions that can populate this graph with edges so as to guarantee decodability or aggregation for the vertices of each clique in this graph. We then develop the algorithmic procedures that can be applied to the designed graph model to optimize any performance metric for this ONC subclass. A case study on reducing the completion time shows that the proposed framework improves on the performance of IDNC and gets very close to the optimal performance.
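
    A stripped-down version of the graph construction described above is sketched below: vertices are (receiver, wanted packet) pairs, an edge means the two packets can be served by a single instantly decodable coded packet, and a maximum clique gives one transmission. The side-information sets are made up, and the paper's generalized vertex aggregation is not reproduced.

    from itertools import combinations

    # Toy side information: what each receiver still wants and what it already has.
    wants = {0: {"A"}, 1: {"B"}, 2: {"A", "C"}}
    has   = {0: {"B", "C"}, 1: {"A", "C"}, 2: {"B"}}

    vertices = [(r, p) for r, missing in wants.items() for p in missing]

    def adjacent(v1, v2):
        """True if one coded packet can serve both vertices instantly."""
        (r1, p1), (r2, p2) = v1, v2
        if r1 == r2:
            return False      # a receiver cannot decode two packets from one transmission
        return p1 == p2 or (p2 in has[r1] and p1 in has[r2])

    edges = {frozenset(e) for e in combinations(vertices, 2) if adjacent(*e)}

    def is_clique(vs):
        return all(frozenset(pair) in edges for pair in combinations(vs, 2))

    # Brute-force maximum clique (fine at toy sizes): XOR of its packets is one coded packet.
    best = max((vs for k in range(len(vertices), 0, -1)
                for vs in combinations(vertices, k) if is_clique(vs)), key=len)
    print("serve together:", best)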

  1. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  2. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  3. A Realistic Model Under Which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, Harry; van der Gulik, Peter T. S.; Klau, Gunnar W.; Schaffner, Christian; Speijer, Dave; Stougie, Leen

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  4. A matrix ring description for cyclic convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, Heide.; Tsang, Fai. -Lung.

    In this paper, we study convolutional codes with a specific cyclic structure. By definition, these codes are left ideals in a certain skew polynomial ring. Using that the skew polynomial ring is isomorphic to a matrix ring we can describe the algebraic parameters of the codes in a more accessible

  5. A decoding method of an n length binary BCH code through (n + 1)n length binary cyclic code

    Directory of Open Access Journals (Sweden)

    TARIQ SHAH

    2013-09-01

    Full Text Available For a given binary BCH code Cn of length n = 2^s - 1 generated by a polynomial of degree r, there is no binary BCH code of length (n + 1)n generated by a generalized polynomial of degree 2r. However, there does exist a binary cyclic code C(n+1)n of length (n + 1)n such that the binary BCH code Cn is embedded in C(n+1)n. Accordingly, a high code rate is attained through a binary cyclic code C(n+1)n for a binary BCH code Cn. Furthermore, a proposed algorithm facilitates the decoding of a binary BCH code Cn through the decoding of a binary cyclic code C(n+1)n, while the codes Cn and C(n+1)n have the same minimum Hamming distance.

  6. Encoding of line drawings with a multiple grid chain code.

    Science.gov (United States)

    Minami, T; Shinohara, K

    1986-02-01

    The multiple grid (MG) chain code, which uses four different square grids, is proposed to encode line drawings. The main processes adopted in the code are: 1) a grid selection algorithm which allocates quantization points only to the vicinity of the course of a line drawing, 2) a labeling rule on quantization points which makes the frequency of some codes larger than that of other codes, and 3) allocation of quantization points not to the corners but to the sides of a square, which makes the straight line segments longer without increasing quantization error. A performance comparison of various chain codes is made from the viewpoints of encoding efficiency, naturalness of the encoded lines, and the rate-distortion measure. The superiority of the MG chain code to other codes is also shown. Finally, the application of the MG chain code to an electronic blackboard system is explained.
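
    For readers who have not met chain coding before, the sketch below implements the classical 8-direction (Freeman) chain code on a single square grid, which is the baseline that the multiple-grid scheme above refines; it is not the MG chain code itself, and the example polyline is arbitrary.

    # Classical 8-direction (Freeman) chain code on a single square grid.
    DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                  (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

    def sign(v):
        return (v > 0) - (v < 0)

    def chain_code(points):
        """Encode consecutive steps between neighbouring grid points as direction symbols."""
        return [DIRECTIONS[(sign(x1 - x0), sign(y1 - y0))]
                for (x0, y0), (x1, y1) in zip(points, points[1:])]

    # Example: a short staircase-like line drawing.
    line = [(0, 0), (1, 0), (2, 1), (3, 1), (3, 2)]
    print(chain_code(line))   # [0, 1, 0, 2]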

  7. A variant of list plus CRC concatenated polar code

    OpenAIRE

    Bonik, Gregory; Goreinov, Sergei; Zamarashkin, Nickolai

    2012-01-01

    A new family of codes based on polar codes, soft concatenation, and list+CRC decoding is proposed. Numerical experiments show performance competitive with industry standards and with the approach of Tal and Vardy.

  8. Coupling a Basin Modeling and a Seismic Code using MOAB

    KAUST Repository

    Yan, Mi

    2012-06-02

    We report on a demonstration of loose multiphysics coupling between a basin modeling code and a seismic code running on a large parallel machine. Multiphysics coupling, which is one critical capability for a high performance computing (HPC) framework, was implemented using the MOAB open-source mesh and field database. MOAB provides for code coupling by storing mesh data and input and output field data for the coupled analysis codes and interpolating the field values between different meshes used by the coupled codes. We found it straightforward to use MOAB to couple the PBSM basin modeling code and the FWI3D seismic code on an IBM Blue Gene/P system. We describe how the coupling was implemented and present benchmarking results for up to 8 racks of Blue Gene/P with 8192 nodes and MPI processes. The coupling code is fast compared to the analysis codes and it scales well up to at least 8192 nodes, indicating that a mesh and field database is an efficient way to implement loose multiphysics coupling for large parallel machines.

  9. A virtual retina for studying population coding.

    Directory of Open Access Journals (Sweden)

    Illya Bomash

    Full Text Available At every level of the visual system - from retina to cortex - information is encoded in the activity of large populations of cells. The populations are not uniform, but contain many different types of cells, each with its own sensitivities to visual stimuli. Understanding the roles of the cell types and how they work together to form collective representations has been a long-standing goal. This goal, though, has been difficult to advance, and, to a large extent, the reason is data limitation. Large numbers of stimulus/response relationships need to be explored, and obtaining enough data to examine even a fraction of them requires a great number of experiments and animals. Here we describe a tool for addressing this, specifically, at the level of the retina. The tool is a data-driven model of retinal input/output relationships that is effective on a broad range of stimuli - essentially, a virtual retina. The results show that it is highly reliable: (1) the model cells carry the same amount of information as their real cell counterparts, (2) the quality of the information is the same - that is, the posterior stimulus distributions produced by the model cells closely match those of their real cell counterparts, and (3) the model cells are able to make very reliable predictions about the functions of the different retinal output cell types, as measured using Bayesian decoding (electrophysiology) and optomotor performance (behavior). In sum, we present a new tool for studying population coding and test it experimentally. It provides a way to rapidly probe the actions of different cell classes and develop testable predictions. The overall aim is to build constrained theories about population coding and keep the number of experiments and animals to a minimum.

  10. A simple numerical coding system for clinical electrocardiography

    NARCIS (Netherlands)

    Robles de Medina, E.O.; Meijler, F.L.

    1974-01-01

    A simple numerical coding system for clinical electrocardiography has been developed. This system enables the storage in coded form of the ECG analysis. The code stored on a digital magnetic tape can be used for a computer print-out of the analysis, while the information can be retrieved at any time

  11. Studying genetic code by a matrix approach.

    Science.gov (United States)

    Crowder, Tanner; Li, Chi-Kwong

    2010-05-01

    Following Petoukhov and his collaborators, we use two length n zero-one sequences, alpha and beta, to represent a length n genetic sequence (alpha/beta) so that the columns of (alpha/beta) have the following correspondence with the nucleotides: C ~ (0/0), U ~ (1/0), G ~ (1/1), A ~ (0/1). Using the Gray code ordering to arrange alpha and beta, we build a 2^n x 2^n matrix C(n) including all the 4^n length n genetic sequences. Furthermore, we use the Hamming distance of alpha and beta to construct a 2^n x 2^n matrix D(n). We explore structures of these matrices, refine the results in earlier papers, and propose new directions for further research.
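
    The construction quoted above can be reproduced in a few lines: map each nucleotide to its (alpha, beta) bit pair, list the length-n binary strings in Gray-code order, and fill the 2^n x 2^n matrix. The sketch below does this for n = 2; the choice of rows versus columns for alpha and beta is an assumption of the sketch.

    # Nucleotide -> (alpha, beta) bits, following the mapping quoted above.
    BITS = {"C": (0, 0), "U": (1, 0), "G": (1, 1), "A": (0, 1)}
    INV = {v: k for k, v in BITS.items()}

    def gray(n):
        """Length-n bit tuples in standard reflected Gray-code order."""
        return [tuple((i ^ (i >> 1)) >> k & 1 for k in reversed(range(n)))
                for i in range(2 ** n)]

    def genetic_matrix(n):
        """2^n x 2^n matrix whose (alpha, beta) entry is the corresponding sequence."""
        order = gray(n)
        return [["".join(INV[(a, b)] for a, b in zip(alpha, beta)) for beta in order]
                for alpha in order]

    for row in genetic_matrix(2):      # C(2): all 16 dinucleotides
        print(" ".join(row))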

  12. Developing coding skills: technology for a flipped classroom

    OpenAIRE

    Martins, Ciro; Marques, Fábio; Balula, Ana

    2016-01-01

    Knowing how to code is critical nowadays, and coding skills are treasured in a growing range of diversified areas. This has led to the emergence of several learning platforms that allow students with different kinds of knowledge to develop their coding skills. These platforms provide active and fun ways to learn how to code, using technology to create controlled, practical learning and teaching experiences both for students and teachers. The whole idea of this...

  13. Toward a Code of Conduct for Graduate Education

    Science.gov (United States)

    Proper, Eve

    2012-01-01

    Most academic disciplines promulgate codes of ethics that serve as public statements of professional norms of their membership. These codes serve both symbolic and practical purposes, stating to both members and the larger public what a discipline's highest ethics are. This article explores what scholarly society codes of ethics could say about…

  14. A Proposed Code Of Ethics For Infrared Thermographic Professionals

    Science.gov (United States)

    Roberts, Charles C.

    1987-05-01

    The American Heritage Dictionary defines ethics as "The general study of morals and of specific moral choices to be made by the individual in his relationship with others". A code of ethics defines these moral relationships to encourage integrity throughout a profession. A defined code of ethics often yields credibility to an organization or association of professionals. This paper outlines a proposed code of ethics for practitioners in the infrared thermographic field. The proposed code covers relationships with the public, clients, other professionals and employers. The proposed code covers credentials, capabilities, thermograms, compensation and safety.

  15. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  16. An improved canine genome and a comprehensive catalogue of coding genes and non-coding transcripts.

    Directory of Open Access Journals (Sweden)

    Marc P Hoeppner

    Full Text Available The domestic dog, Canis familiaris, is a well-established model system for mapping trait and disease loci. While the original draft sequence was of good quality, gaps were abundant, particularly in promoter regions of the genome, negatively impacting the annotation and study of candidate genes. Here, we present an improved genome build, canFam3.1, which includes 85 MB of novel sequence and now covers 99.8% of the euchromatic portion of the genome. We also present multiple RNA-Sequencing data sets from 10 different canine tissues to catalog ∼175,000 expressed loci. While about 90% of the coding genes previously annotated by EnsEMBL have measurable expression in at least one sample, the number of transcript isoforms detected by our data expands the EnsEMBL annotations by a factor of four. Syntenic comparison with the human genome revealed an additional ∼3,000 loci that are characterized as protein coding in human and were also expressed in the dog, suggesting that those were previously not annotated in the EnsEMBL canine gene set. In addition to ∼20,700 high-confidence protein coding loci, we found ∼4,600 antisense transcripts overlapping exons of protein coding genes, ∼7,200 intergenic multi-exon transcripts without coding potential, likely candidates for long intergenic non-coding RNAs (lincRNAs), and ∼11,000 transcripts that were reported by two different library construction methods but did not fit any of the above categories. Of the lincRNAs, about 6,000 have no annotated orthologs in human or mouse. Functional analysis of two novel transcripts with shRNA in a mouse kidney cell line altered cell morphology and motility. All in all, we provide a much-improved annotation of the canine genome and suggest regulatory functions for several of the novel non-coding transcripts.

  17. Codes and morals: is there a missing link? (The Nuremberg Code revisited).

    Science.gov (United States)

    Hick, C

    1998-01-01

    Codes are a well known and popular but weak form of ethical regulation in medical practice. There is, however, a lack of research on the relations between moral judgments and ethical Codes, or on the possibility of morally justifying these Codes. Our analysis begins by showing, given the Nuremberg Code, how a typical reference to natural law has historically served as moral justification. We then indicate, following the analyses of H. T. Engelhardt, Jr., and A. MacIntyre, why such general moral justifications of codes must necessarily fail in a society of "moral strangers." Going beyond Engelhardt we argue, that after the genealogical suspicion in morals raised by Nietzsche, not even Engelhardt's "principle of permission" can be rationally justified in a strong sense--a problem of transcendental argumentation in morals already realized by I. Kant. Therefore, we propose to abandon the project of providing general justifications for moral judgements and to replace it with a hermeneutical analysis of ethical meanings in real-world situations, starting with the archetypal ethical situation, the encounter with the Other (E. Levinas).

  18. A proposal for a code of ethics for nurse practitioners.

    Science.gov (United States)

    Peterson, Moya; Potter, Robert Lyman

    2004-03-01

    To review established codes for health care professionals and standards of practice for the nurse practitioner (NP) and to utilize these codes and standards, general ethical themes, and a new ethical triangle to propose an ethical code for NPs. Reviews of three generally accepted ethical themes (deontological, teleological, and areteological), the ethical triangle by Potter, the American Academy of Nurse Practitioners (AANP) standards of practice for NPs, and codes of ethics from the American Nurses Association (ANA) and the American Medical Association (AMA). A proposal for a code of ethics for NPs is presented. This code was determined by basic ethical themes and established codes for nursing, formulated by the ANA, and for physicians, formulated by the AMA. The proposal was also developed in consideration of the AANP standards of practice for NPs. The role of the NP is unique in its ethical demands. The authors believe that the expanded practice of NPs presents ethical concerns that are not addressed by the ANA code and yet are relevant to nursing and therefore different than the ethical concerns of physicians. This proposal attempts to broaden NPs' perspective of the role that ethics should hold in their professional lives.

  19. A reflexive exploration of two qualitative data coding techniques

    Directory of Open Access Journals (Sweden)

    Erik Blair

    2016-01-01

    Full Text Available In an attempt to help find meaning within qualitative data, researchers commonly start by coding their data. There are a number of coding systems available to researchers and this reflexive account explores my reflections on the use of two such techniques. As part of a larger investigation, two pilot studies were undertaken as a means to examine the relative merits of open coding and template coding for examining transcripts. This article does not describe the research project per se but attempts to step back and offer a reflexive account of the development of data coding tools. Here I reflect upon and evaluate the two data coding techniques that were piloted, and discuss how using appropriate aspects of both led to the development of my final data coding approach. My exploration found there was no clear-cut ‘best’ option but that the data coding techniques needed to be reflexively-aligned to meet the specific needs of my project. This reflection suggests that, when coding qualitative data, researchers should be methodologically thoughtful when they attempt to apply any data coding technique; that they do not assume pre-established tools are aligned to their particular paradigm; and that they consider combining and refining established techniques as a means to define their own specific codes. DOI: 10.2458/azu_jmmss.v6i1.18772

  20. CONSTRUCTION OF REGULAR LDPC LIKE CODES BASED ON FULL RANK CODES AND THEIR ITERATIVE DECODING USING A PARITY CHECK TREE

    Directory of Open Access Journals (Sweden)

    H. Prashantha Kumar

    2011-09-01

    Full Text Available Low density parity check (LDPC) codes are capacity-approaching codes, which means that practical constructions exist that allow the noise threshold to be set very close to the theoretical Shannon limit for a memoryless channel. LDPC codes are finding increasing use in applications like LTE networks, digital television, high-density data storage systems, deep space communication systems, etc. Several algebraic and combinatorial methods are available for constructing LDPC codes. In this paper we discuss a novel low-complexity algebraic method for constructing regular LDPC-like codes derived from full rank codes. We demonstrate that by employing these codes over AWGN channels, coding gains in excess of 2 dB over un-coded systems can be realized when soft iterative decoding using a parity check tree is employed.
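
    To illustrate what parity-check-driven iterative decoding is doing at the bit level, the sketch below runs a simple hard-decision bit-flipping decoder on a small parity-check matrix. The matrix is a toy example, not one of the full-rank-code constructions of the paper, and bit flipping is only a simplified stand-in for the soft decoding the authors use.

    import numpy as np

    # Toy parity-check matrix (4 checks on 6 bits).
    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1, 1],
                  [0, 0, 1, 1, 0, 1]])

    def bit_flip_decode(H, r, max_iter=20):
        r = r.copy()
        for _ in range(max_iter):
            syndrome = H @ r % 2
            if not syndrome.any():
                return r, True               # all parity checks satisfied
            votes = syndrome @ H             # unsatisfied checks touching each bit
            worst = np.flatnonzero(votes == votes.max())
            r[worst] ^= 1                    # flip the most "suspicious" bits
        return r, False

    codeword = np.zeros(6, dtype=int)        # the all-zero word is always a codeword
    received = codeword.copy()
    received[2] ^= 1                         # introduce a single bit error
    print(bit_flip_decode(H, received))      # recovers the all-zero codeword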

  1. A code inspection process for security reviews

    Science.gov (United States)

    Garzoglio, Gabriele

    2010-04-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  2. A new type of alternating code for incoherent scatter measurements

    Science.gov (United States)

    Sulzer, Michael P.

    1993-11-01

    An alternating code set is employed as one of several possible techniques used in incoherent scatter radar transmissions to obtain ambiguity-free measurements of autocorrelation functions or spectra with good range resolution. An alternating code set consists of several codes; typically, each successive radar pulse is modulated with a different code in the sequence. This technique is useful in other types of radar transmissions when the target is overspread, assuming the targets have certain statistical properties. Code sets for a new type of alternating code are presented for code lengths 8-12. This new type of alternating code differs from the first kind in two ways: it is subject to a slightly different condition for the elimination of ambiguity, and it is not restricted to lengths that are powers of 2. The new lengths are useful because they allow greater freedom in designing a multipurpose radar waveform best utilizing the available duty cycle of the radar. The alternating code technique is described in detail sufficient to allow an understanding of the two types and to show that the new condition for ambiguity-free measurements is a useful one. A search program was used to find the new sets; the aspects of the program important for decreasing the size of the search space are described. The code sets are presented, and their significance and uses are discussed.

  3. RAYS: a geometrical optics code for EBT

    Energy Technology Data Exchange (ETDEWEB)

    Batchelor, D.B.; Goldfinger, R.C.

    1982-04-01

    The theory, structure, and operation of the code are described. Mathematical details of the equilibrium subroutines for slab, bumpy torus, and tokamak plasma geometry are presented. Wave dispersion and absorption subroutines are presented for frequencies ranging from the ion cyclotron frequency to the electron cyclotron frequency. Graphics postprocessors for RAYS output data are also described.

  4. A continuous curriculum for building code blue competency.

    Science.gov (United States)

    Jankouskas, T S

    2001-01-01

    Staff response to a code blue can be cultivated when a continuous curriculum for code blue competency is used. This curriculum details ongoing code blue education yet requires only small increments of time for the inservice classes. The curriculum, adaptable to any unit, consists of three components: a unit-specific orientation to emergency equipment; exercises in critical thinking and doing; and exercises in documentation.

  5. Continuous Materiality: Through a Hierarchy of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jichen Zhu

    2008-01-01

    Full Text Available The legacy of Cartesian dualism inherent in linguistic theory deeply influences current views on the relation between natural language, computer code, and the physical world. However, the oversimplified distinction between mind and body falls short of capturing the complex interaction between the material and the immaterial. In this paper, we posit a hierarchy of codes to delineate a wide spectrum of continuous materiality. Our research suggests that diagrams in architecture provide a valuable analog for approaching computer code in emergent digital systems. After commenting on ways that Cartesian dualism continues to haunt discussions of code, we turn our attention to diagrams and design morphology. Finally we notice the implications a material understanding of code bears for further research on the relation between human cognition and digital code. Our discussion concludes by noticing several areas that we have projected for ongoing research.

  6. Optix: A Monte Carlo scintillation light transport code

    Energy Technology Data Exchange (ETDEWEB)

    Safari, M.J., E-mail: mjsafari@aut.ac.ir [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Afarideh, H. [Department of Energy Engineering and Physics, Amir Kabir University of Technology, PO Box 15875-4413, Tehran (Iran, Islamic Republic of); Ghal-Eh, N. [School of Physics, Damghan University, PO Box 36716-41167, Damghan (Iran, Islamic Republic of); Davani, F. Abbasi [Nuclear Engineering Department, Shahid Beheshti University, PO Box 1983963113, Tehran (Iran, Islamic Republic of)

    2014-02-11

    The paper reports on the capabilities of the Monte Carlo scintillation light transport code Optix, which is an extended version of the previously introduced code Optics. Optix provides the user with a variety of both numerical and graphical outputs with a very simple and user-friendly input structure. A benchmarking strategy has been adopted based on comparison with experimental results, semi-analytical solutions, and other Monte Carlo simulation codes to verify various aspects of the developed code. In addition, some extensive comparisons have been made against the tracking abilities of the general-purpose MCNPX and FLUKA codes. The presented benchmark results for the Optix code exhibit promising agreement. -- Highlights: • Monte Carlo simulation of scintillation light transport in 3D geometry. • Evaluation of angular distribution of detected photons. • Benchmark studies to check the accuracy of Monte Carlo simulations.
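
    The core of any Monte Carlo light-transport code is the repeated sampling of free paths and interaction outcomes. The toy one-dimensional slab random walk below illustrates that loop with made-up attenuation and absorption parameters; it is not Optix and ignores optical boundaries, wavelength, and real scintillator geometry.

    import numpy as np

    rng = np.random.default_rng(42)

    def transport_photons(n_photons, mu_att=1.0, absorb_prob=0.1, slab_thickness=2.0):
        """Toy slab random walk: exponential free paths, 1D isotropic scattering,
        fixed absorption probability per interaction. Returns the escape fraction."""
        escaped = 0
        for _ in range(n_photons):
            z, direction = 0.0, 1.0                 # start at the entrance face, moving inward
            while True:
                z += direction * rng.exponential(1.0 / mu_att)   # sample a free path
                if z < 0.0 or z > slab_thickness:
                    escaped += 1                    # photon leaves the slab
                    break
                if rng.random() < absorb_prob:
                    break                           # photon absorbed
                direction = rng.choice([-1.0, 1.0]) # scatter forward or backward
        return escaped / n_photons

    print(f"escape fraction: {transport_photons(5000):.3f}")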

  7. ETFOD: a point model physics code with arbitrary input

    Energy Technology Data Exchange (ETDEWEB)

    Rothe, K.E.; Attenberger, S.E.

    1980-06-01

    ETFOD is a zero-dimensional code which solves a set of physics equations by minimization. The solution technique differs from the one normally used in that the input is arbitrary. The user is supplied with a set of variables from which he specifies which variables are input (unchanging). The remaining variables become the output. Presently the code is being used for ETF reactor design studies. The code was written in a manner that allows easy modification of equations, variables, and physics calculations. The solution technique is presented along with hints for using the code.
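
    The "arbitrary input" idea described above, where the user freezes any subset of variables and the rest are found by driving the residuals of the equations to zero, can be mimicked as below. The two equations and the variable names are invented for illustration; they are not the ETF physics equations.

    import numpy as np
    from scipy.optimize import minimize

    def residuals(free, frozen):
        v = {**frozen, **free}
        f1 = v["a"] * v["b"] - 2.0 * v["c"]        # toy equation 1
        f2 = v["a"] + v["b"] ** 2 - 5.0            # toy equation 2
        return np.array([f1, f2])

    def solve(frozen, free_names, guess):
        """Freeze the given variables and minimise the squared residuals over the rest."""
        cost = lambda x: float(np.sum(residuals(dict(zip(free_names, x)), frozen) ** 2))
        result = minimize(cost, guess, method="Nelder-Mead")
        return dict(zip(free_names, result.x)), result.fun

    # Treat c as the (unchanging) input and recover a and b as outputs.
    solution, final_cost = solve({"c": 2.0}, ["a", "b"], guess=[1.0, 1.0])
    print(solution, final_cost)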

  8. How Moral Codes Evolve in a Trust Game

    Directory of Open Access Journals (Sweden)

    Jean Paul Rabanal

    2015-06-01

    Full Text Available This paper analyzes the dynamic stability of moral codes in a two-population trust game. Guided by a moral code, members of one population, the Trustors, are willing to punish members of the other population, the Trustees, who defect. Under replicator dynamics, adherence to the moral code exhibits unstable oscillations around an interior Nash equilibrium (NE), but under smoothed best response dynamics we obtain convergence to a quantal response equilibrium (QRE).
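
    A stylised version of the dynamics discussed above is easy to simulate: let x be the share of Trustors adhering to the code and y the share of Trustees who cooperate, and iterate the two-population replicator equations. All payoff numbers below are illustrative assumptions, not the parameters of the paper.

    def step(x, y, dt=0.01):
        """One Euler step of two-population replicator dynamics."""
        # Trustors: punishing recovers part of the loss against defectors (+0.5)
        # but carries a small standing cost (-0.1).
        u_code = 3.0 * y + 0.5 * (1.0 - y) - 0.1
        u_no_code = 3.0 * y
        # Trustees: defection pays well only when code-followers are rare.
        u_coop = 2.0
        u_defect = 4.0 * (1.0 - x) + 0.5 * x
        dx = x * (1.0 - x) * (u_code - u_no_code)
        dy = y * (1.0 - y) * (u_coop - u_defect)
        return x + dt * dx, y + dt * dy

    x, y = 0.3, 0.3
    for _ in range(5000):
        x, y = step(x, y)
    print(f"code adherence: {x:.2f}, cooperation: {y:.2f}")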

  9. LOLA SYSTEM: A code block for nodal PWR simulation. Part. I - Simula-3 Code

    Energy Technology Data Exchange (ETDEWEB)

    Aragones, J. M.; Ahnert, C.; Gomez Santamaria, J.; Rodriguez Olabarria, I.

    1985-07-01

    Description of the theory and user's manual of the SIMULA-3 code, which is part of the one-group nodal-theory core calculation system called LOLA SYSTEM. SIMULA-3 is the main module of the system; it uses a modified nodal theory, with interface leakages equivalent to those of diffusion theory. (Author) 4 refs.

  10. CALMAR: A New Versatile Code Library for Adjustment from Measurements

    Directory of Open Access Journals (Sweden)

    Grégoire G.

    2016-01-01

    Full Text Available CALMAR, a new library for adjustment, has been developed. This code performs simultaneous shape and level adjustment of an initial prior spectrum from measured reaction rates of activation foils. It is written in C++ using the ROOT data analysis framework, with all its linear algebra classes. The STAYSL code has also been reimplemented in this library. Use of the code is very flexible: stand-alone, inside a C++ code, or driven by scripts. Validation and test cases are in progress. These cases will be included in the code package that will be made available to the community. Future developments are discussed. The code should support the new Generalized Nuclear Data (GND) format. This new format has many advantages compared to ENDF.

  11. A standardised language code for rail freight operations

    Directory of Open Access Journals (Sweden)

    Marin MARINOV

    2012-01-01

    Full Text Available In this paper a standardised language code for rail freight operations is developed that aims to facilitate communication between dispatchers, train drivers, and traffic managers when running freight trains abroad. In developing the language code, the existing situation in the bilingual Belgian railways has been taken as a starting point. Communication codes from public services such as the police, fire brigade, ambulance, and military services have been studied. It is believed that, if implemented, the code will significantly improve integrated rail freight services in Europe.

  12. Toward a unified theory of efficient, predictive, and sparse coding.

    Science.gov (United States)

    Chalk, Matthew; Marre, Olivier; Tkačik, Gašper

    2018-01-02

    A central goal in theoretical neuroscience is to predict the response properties of sensory neurons from first principles. To this end, "efficient coding" posits that sensory neurons encode maximal information about their inputs given internal constraints. There exist, however, many variants of efficient coding (e.g., redundancy reduction, different formulations of predictive coding, robust coding, sparse coding, etc.), differing in their regimes of applicability, in the relevance of signals to be encoded, and in the choice of constraints. It is unclear how these types of efficient coding relate or what is expected when different coding objectives are combined. Here we present a unified framework that encompasses previously proposed efficient coding models and extends to unique regimes. We show that optimizing neural responses to encode predictive information can lead them to either correlate or decorrelate their inputs, depending on the stimulus statistics; in contrast, at low noise, efficiently encoding the past always predicts decorrelation. Later, we investigate coding of naturalistic movies and show that qualitatively different types of visual motion tuning and levels of response sparsity are predicted, depending on whether the objective is to recover the past or predict the future. Our approach promises a way to explain the observed diversity of sensory neural responses, as due to multiple functional goals and constraints fulfilled by different cell types and/or circuits.

  13. What's in a code? Towards a formal account of the relation of ontologies and coding systems.

    Science.gov (United States)

    Rector, Alan L

    2007-01-01

    Terminologies are increasingly based on "ontologies" developed in description logics and related languages such as the new Web Ontology Language, OWL. The use of description logic has been expected to reduce ambiguity and to make it easier to determine logical equivalence, deal with negation, and specify EHRs. However, this promise has not been fully realised: in part because early description logics were relatively inexpressive, and in part because the relation between coding systems, EHRs, and ontologies expressed in description logics has not been fully understood. This paper presents a unifying approach using the expressive formalisms available in the latest version of OWL, OWL 1.1.

  14. CODE SWITCHING: A VARIATION IN LANGUAGE USE. Jane ...

    African Journals Online (AJOL)

    MISS NWOBU

    “conversational implicature” which sees verbal exchanges or communication as a cooperative enterprise in ... use on any occasion is a code, a system used for communication between two or more parties”. Code switching is a ... to define their similarities and differences from other types of language variation that are part of.

  15. A Framework for Retargetable Code Generation using Simulated Annealing

    NARCIS (Netherlands)

    Visser, B.S.

    2000-01-01

    embedded systems. Retargetable code generation is a co-designing method to map a high-level software description onto a variety of hardware architectures without the need to rewrite a compiler. Highly efficient code generation is required to meet, for example, timing, area and low-power constraints.
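
    Since the framework above searches the mapping space with simulated annealing, the generic annealing loop is sketched below on a toy ordering problem. The cost function and neighbourhood are placeholders, not the framework's actual code-generation model.

    import math
    import random

    random.seed(0)

    def anneal(state, cost, neighbour, t0=10.0, cooling=0.995, steps=20000):
        """Generic simulated annealing: accept worse moves with Boltzmann probability."""
        best = current = state
        best_c = current_c = cost(state)
        t = t0
        for _ in range(steps):
            cand = neighbour(current)
            c = cost(cand)
            if c < current_c or random.random() < math.exp((current_c - c) / t):
                current, current_c = cand, c
                if c < best_c:
                    best, best_c = cand, c
            t *= cooling
        return best, best_c

    # Toy problem: order 8 "operations" so that adjacent cost weights change smoothly.
    weights = [3, 1, 4, 1, 5, 9, 2, 6]
    cost = lambda order: sum(abs(weights[a] - weights[b]) for a, b in zip(order, order[1:]))

    def neighbour(order):
        i, j = random.sample(range(len(order)), 2)
        order = list(order)
        order[i], order[j] = order[j], order[i]
        return tuple(order)

    print(anneal(tuple(range(8)), cost, neighbour))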

  16. IGB grid: User's manual (A turbomachinery grid generation code)

    Science.gov (United States)

    Beach, T. A.; Hoffman, G.

    1992-01-01

    A grid generation code called IGB is presented for use in computational investigations of turbomachinery flowfields. It contains a combination of algebraic and elliptic techniques coded for use on an interactive graphics workstation. The instructions for use and a test case are included.

  17. Mock Code: A Code Blue Scenario Requested by and Developed for Registered Nurses.

    Science.gov (United States)

    Williams, Kerry-Lynn; Rideout, Janice; Pritchett-Kelly, Sherry; McDonald, Melissa; Mullins-Richards, Paula; Dubrowski, Adam

    2016-12-23

    The use of simulation in medical training is quickly becoming more common, with applications in emergency, surgical, and nursing education. Recently, registered nurses working in surgical inpatient units requested a mock code simulation to practice skills, improve knowledge, and build self-confidence in a safe and controlled environment. A simulation scenario using a high-fidelity mannequin was developed and will be discussed herein.

  18. A coding solution for supernumerary teeth.

    Science.gov (United States)

    van der Westhuijzen, A J; Morkel, J A

    2011-08-01

    In South Africa payments for treatment rendered are routinely delayed because of the medical fund industry's apparent inability to capture codes denoting supernumerary teeth. The suggested protocol allows for up to 13 supernumerary teeth to be identified by two digits. Meetings planned between SADA and key funding stakeholders to "ensure that protocols related to tooth numbering are acceptable", provide the ideal opportunity to introduce the suggested two-digit protocol for numbering supernumerary teeth. If this proposal is implemented, it could alleviate the frustration associated with the rejection of accounts where supernumerary teeth are appropriately identified.

  19. A Critical Reflection on Codes of Conduct in Vocational Education

    Science.gov (United States)

    Bagnall, Richard G.; Nakar, Sonal

    2018-01-01

    The contemporary cultural context may be seen as presenting a moral void in vocational education, sanctioning the ascendency of instrumental epistemology and a proliferation of codes of conduct, to which workplace actions are expected to conform. Important among the purposes of such codes is that of encouraging ethical conduct, but, true to their…

  20. Rationale for Student Dress Codes: A Review of School Handbooks

    Science.gov (United States)

    Freeburg, Elizabeth W.; Workman, Jane E.; Lentz-Hees, Elizabeth S.

    2004-01-01

    Through dress codes, schools establish rules governing student appearance. This study examined stated rationales for dress and appearance codes in secondary school handbooks; 182 handbooks were received. Of 150 handbooks containing a rationale, 117 related dress and appearance regulations to students' right to a non-disruptive educational…

  1. Code-Switching in a Turkish Secondary School.

    Science.gov (United States)

    Eldridge, John

    1996-01-01

    Analyzes English-as-a-Second-Language students' code-switching in a Turkish school. The article shows that no empirical evidence exists supporting the notion that restricting mother tongue use would improve learning efficiency and that most classroom code-switching is intentional. (seven references) (Author/CK)

  2. Motivations For Code-Switching Among Igboenglish Bilinguals: A ...

    African Journals Online (AJOL)

    Studies have shown that code-switching is not a manifestation of mental confusion but a rule-governed behaviour among bilinguals which is motivated by various socio-psychological as well as linguistic factors. It has been observed that code-switching is more predominant among Igbo-English bilinguals compared to any ...

  3. A Case for Dynamic Reverse-code Generation

    DEFF Research Database (Denmark)

    Lee, Jooyong

    2007-01-01

    ... These implementations, however, inherently do not scale. As has often been said, the ultimate solution for backtracking is to use reverse code: executing the reverse code restores the previous states of a program. In our earlier work, we presented a method to generate reverse code on the fly while running a debugger. ... This article presents a case study of dynamic reverse-code generation. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation can outperform the existing backtracking methods in terms of memory efficiency ...

  4. The motivational interviewing skill code : Reliability and a critical appraisal

    NARCIS (Netherlands)

    de Jonge, JM; Schippers, GM; Schaap, CPDR

    The Motivational Interviewing Skill Code (MISC) is a coding system developed to measure adherence to motivational interviewing (MI). MI is an effective clinical style used in different treatment situations. Counsellors practising MI have to follow general principles and avoid certain traps. In the

  5. A Method for the Construction of Minimum-Redundancy Codes*

    Indian Academy of Sciences (India)

    agreement between the transmitter and the receiver about the meaning of the code for each message of the ensemble will be called the "ensemble code". Probably ... [1] C E Shannon, A mathematical theory of communication, Bell Sys. Tech. J., Vol. 27, pp. 398-403, July 1948. [2] R M Fano, The transmission of information, ...

  6. Toward a Code of Conduct for the Presidency

    Science.gov (United States)

    Fleming, J. Christopher

    2012-01-01

    A presidential code of conduct is needed more today than ever before. College and university presidents are being required to do more without the proper training to succeed. Presidents from outside the academy enter academia with normative patterns and codes of conduct that served them well in their previous occupations but now have the potential…

  7. Ethical conduct for research : a code of scientific ethics

    Science.gov (United States)

    Marcia Patton-Mallory; Kathleen Franzreb; Charles Carll; Richard Cline

    2000-01-01

    The USDA Forest Service recently developed and adopted a code of ethical conduct for scientific research and development. The code addresses issues related to research misconduct, such as fabrication, falsification, or plagiarism in proposing, performing, or reviewing research or in reporting research results, as well as issues related to professional misconduct, such...

  8. Coding as a Trojan Horse for Mathematics Education Reform

    Science.gov (United States)

    Gadanidis, George

    2015-01-01

    The history of mathematics educational reform is replete with innovations taken up enthusiastically by early adopters without significant transfer to other classrooms. This paper explores the coupling of coding and mathematics education to create the possibility that coding may serve as a Trojan Horse for mathematics education reform. That is,…

  9. Framework of a Contour Based Depth Map Coding Method

    Science.gov (United States)

    Wang, Minghui; He, Xun; Jin, Xin; Goto, Satoshi

    Stereo-view and multi-view video formats are heavily investigated topics given their vast application potential. The Depth Image Based Rendering (DIBR) system has been developed to improve Multiview Video Coding (MVC); in this system, a depth image is introduced to synthesize virtual views on the decoder side. A depth image is piecewise smooth: it is composed of sharp contours and smooth interior regions, and the contours matter more than the interior in the view-synthesis process. In order to improve the quality of the synthesized views and reduce the bitrate of the depth image, a contour-based coding strategy is proposed. First, the depth image is divided into layers by depth-value intervals. Then regions, which are defined as the basic coding unit in this work, are segmented from each layer. Each region is further divided into its contour and its interior, and two different procedures are employed to code them. A vector-based strategy is applied to code the contour lines: straight segments cost few bits since they are represented as vectors, while pixels that lie off straight segments are coded one by one. Depth values in the interior of a region are modeled by a linear or nonlinear formula whose coefficients are obtained by regression; this process is called interior painting. Unlike conventional block-based coding methods, the residue between the original frame and the reconstructed frame (obtained by contour rebuilding and interior painting) is not sent to the decoder. In this proposal, the contour is coded losslessly whereas the interior is coded lossily. Experimental results show that the proposed Contour Based Depth map Coding (CBDC) achieves better performance than JMVC (the reference software of MVC) in high-quality scenarios.
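
    A minimal sketch of the "interior painting" idea described above: depth values inside a region are approximated by a low-order model fitted by regression. Here a plane d(x, y) ~ a*x + b*y + c is fitted by least squares; the function name, the plane model and the toy data are illustrative assumptions, not the paper's implementation.

    import numpy as np

    def paint_interior(depth, mask):
        """Fit a plane to the masked interior pixels and return the repainted depth map."""
        ys, xs = np.nonzero(mask)
        A = np.column_stack([xs, ys, np.ones_like(xs)])           # [x, y, 1] design matrix
        coeffs, *_ = np.linalg.lstsq(A, depth[ys, xs], rcond=None)
        painted = depth.copy()
        painted[ys, xs] = A @ coeffs                              # replace interior by the fitted model
        return painted, coeffs

    # toy example: an 8x8 depth map that ramps linearly along x
    depth = np.tile(np.arange(8, dtype=float), (8, 1))
    mask = np.zeros_like(depth, dtype=bool)
    mask[2:6, 2:6] = True
    painted, coeffs = paint_interior(depth, mask)
    print(coeffs)                                                 # approximately [1, 0, 0]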

  10. Code generation: a strategy for neural network simulators.

    Science.gov (United States)

    Goodman, Dan F M

    2010-10-01

    We demonstrate a technique for the design of neural network simulation software, runtime code generation. This technique can be used to give the user complete flexibility in specifying the mathematical model for their simulation in a high level way, along with the speed of code written in a low level language such as C++. It can also be used to write code only once but target different hardware platforms, including inexpensive high performance graphics processing units (GPUs). Code generation can be naturally combined with computer algebra systems to provide further simplification and optimisation of the generated code. The technique is quite general and could be applied to any simulation package. We demonstrate it with the 'Brian' simulator ( http://www.briansimulator.org ).
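
    A minimal sketch of runtime code generation in the spirit described above (this is not Brian's actual machinery; the model equation, the names and the simple Euler update are assumptions): the model is supplied as a string, and an update function is generated, compiled and executed at runtime.

    import numpy as np

    def make_state_updater(expr, dt):
        """Generate and compile an Euler-step updater for dv/dt = expr ('expr' may use 'v' and 't')."""
        src = f"def _update(v, t):\n    return v + ({expr}) * {dt}\n"
        namespace = {"np": np}
        exec(compile(src, "<generated>", "exec"), namespace)      # runtime code generation
        return namespace["_update"]

    update = make_state_updater("-v / 0.02 + 1.0", dt=1e-4)       # assumed leaky-integrator model
    v = np.zeros(1000)                                            # state of 1000 model units
    for step in range(100):
        v = update(v, step * 1e-4)
    print(v[:3])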

  11. A novel unified coding analytical method for Internet of Things

    Science.gov (United States)

    Sun, Hong; Zhang, JianHong

    2013-08-01

    This paper presents a novel unified coding analytical method for Internet of Things, which abstracts out the `displacement goods' and `physical objects', and expounds the relationship thereof. It details the item coding principles, establishes a one-to-one relationship between three-dimensional spatial coordinates of points and global manufacturers, can infinitely expand, solves the problem of unified coding in production phase and circulation phase with a novel unified coding method, and further explains how to update the item information corresponding to the coding in stages of sale and use, so as to meet the requirement that the Internet of Things can carry out real-time monitoring and intelligentized management to each item.

  12. A solution for automatic parallelization of sequential assembly code

    Directory of Open Access Journals (Sweden)

    Kovačević Đorđe

    2013-01-01

    Full Text Available Since modern multicore processors can execute existing sequential programs only on a single core, there is a strong need for automatic parallelization of program code. Relying on existing algorithms, this paper describes a new software tool for the parallelization of sequential assembly code. The main goal is to develop a parallelizer which reads sequential assembly code and outputs parallelized code for a MIPS processor with multiple cores. The idea is the following: the parser translates the assembly input file into program objects suitable for further processing, after which static single assignment form is constructed. Based on the data-flow graph, the parallelization algorithm distributes instructions across the cores. Once the sequential code has been parallelized, registers are allocated with a linear allocation algorithm, and the final result is distributed assembly code for each of the cores. In the paper we evaluate the speedup of a matrix multiplication example processed by the assembly-code parallelizer. The result is an almost linear speedup of code execution, which increases with the number of cores: the speedup on two cores is 1.99, while on 16 cores it is 13.88.
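
    As a hedged illustration of the partitioning idea (not the parallelizer's actual algorithm), the sketch below assigns instructions to cores greedily, level by level, once their data-flow dependencies are satisfied; all names and the toy dependency graph are assumptions.

    from collections import defaultdict

    def schedule(instrs, deps, n_cores):
        """instrs: list of instruction ids; deps: dict mapping id -> set of ids it depends on."""
        placed, level_of, core_of = set(), {}, {}
        while len(placed) < len(instrs):
            ready = [i for i in instrs if i not in placed and deps[i] <= placed]
            for slot, i in enumerate(ready):
                level_of[i] = max([level_of[d] for d in deps[i]], default=-1) + 1
                core_of[i] = slot % n_cores                       # spread independent work across cores
                placed.add(i)
        return core_of, level_of

    deps = defaultdict(set, {"c": {"a", "b"}, "d": {"c"}})        # a and b are independent; c needs both
    print(schedule(["a", "b", "c", "d"], deps, n_cores=2))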

  13. On DNA codes from a family of chain rings

    Directory of Open Access Journals (Sweden)

    Elif Segah Oztas

    2017-01-01

    Full Text Available In this work, we focus on reversible cyclic codes which correspond to reversible DNA codes or reversible-complement DNA codes over a family of finite chain rings, in an effort to extend what was done by Yildiz and Siap in [20]. The ring family that we have considered is of size $2^{2^k}$, $k=1,2, \cdots$, and we match each ring element with a DNA $2^{k-1}$-mer. We use the so-called $u^2$-adic digit system to solve the reversibility problem and we characterize cyclic codes that correspond to reversible-complement DNA codes. We then conclude our study with some examples.

  14. Porting a Hall MHD Code to a Graphic Processing Unit

    Science.gov (United States)

    Dorelli, John C.

    2011-01-01

    We present our experience porting a Hall MHD code to a Graphics Processing Unit (GPU). The code is a 2nd order accurate MUSCL-Hancock scheme which makes use of an HLL Riemann solver to compute numerical fluxes and second-order finite differences to compute the Hall contribution to the electric field. The divergence of the magnetic field is controlled with Dedner's hyperbolic divergence cleaning method. Preliminary benchmark tests indicate a speedup (relative to a single Nehalem core) of 58x for a double precision calculation. We discuss scaling issues which arise when distributing work across multiple GPUs in a CPU-GPU cluster.

  15. Coded excitation for diverging wave cardiac imaging: a feasibility study

    Science.gov (United States)

    Zhao, Feifei; Tong, Ling; He, Qiong; Luo, Jianwen

    2017-02-01

    Diverging wave (DW) based cardiac imaging has gained increasing interest in recent years given its capacity to achieve ultrahigh frame rates. However, the signal-to-noise ratio (SNR), contrast, and penetration depth of the resulting B-mode images are typically low, as DWs spread energy over a large region. Coded excitation is known to be capable of increasing the SNR and penetration of ultrasound imaging. The aim of this study was therefore to test the feasibility of applying coded excitation in DW imaging to improve the corresponding SNR, contrast and penetration depth. To this end, two types of codes, i.e. a linear frequency modulated chirp code and a set of complementary Golay codes, were tested in three different DW imaging schemes: a 1-angle DW transmit without compounding, and 3- and 5-angle DW transmits with coherent compounding. The performance (SNR, contrast ratio (CR), contrast-to-noise ratio (CNR), and penetration) of the different imaging schemes was investigated by means of simulations and in vitro experiments. As benchmarks, the corresponding DW imaging schemes with regular pulsed excitation as well as the conventional focused imaging scheme were also included. The results showed that the SNR was improved by about 10 dB using coded excitation, while the penetration depth was increased by 2.5 cm and 1.8 cm using the chirp code and the Golay codes, respectively. The CNR and CR gains varied with depth for the different DW schemes using coded excitation. Specifically, for the non-compounded DW imaging scheme, the gain in CR was about 5 dB and 3 dB while the gain in CNR was about 4.5 dB and 3.5 dB at larger depths using the chirp code and the Golay codes, respectively. For the compounded imaging schemes, the gains in penetration and contrast with coded excitation were relatively smaller compared with the non-compounded one. Overall, these findings indicate the feasibility of coded excitation for improving the image quality of DW imaging. Preliminary in vivo cardiac images
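
    A small sketch (not the study's actual excitation sequences) of the key property that makes complementary Golay codes attractive here: the sum of the autocorrelations of the two sequences is an impulse, so ideal pulse compression adds no range sidelobes. The recursive construction and the sequence length are illustrative.

    import numpy as np

    def golay_pair(n_steps):
        """Build a complementary Golay pair of length 2**n_steps by the standard recursion."""
        a, b = np.array([1.0]), np.array([1.0])
        for _ in range(n_steps):
            a, b = np.concatenate([a, b]), np.concatenate([a, -b])
        return a, b

    def autocorr(x):
        return np.correlate(x, x, mode="full")

    a, b = golay_pair(3)                                          # length-8 pair
    combined = autocorr(a) + autocorr(b)
    print(combined)                                               # 2*len(a) at zero lag, 0 elsewhere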

  16. Gasoline2: a modern smoothed particle hydrodynamics code

    Science.gov (United States)

    Wadsley, James W.; Keller, Benjamin W.; Quinn, Thomas R.

    2017-10-01

    The methods in the Gasoline2 smoothed particle hydrodynamics (SPH) code are described and tested. Gasoline2 is the most recent version of the Gasoline code for parallel hydrodynamics and gravity with identical hydrodynamics to the Changa code. As with other Modern SPH codes, we prevent sharp jumps in time-steps, use upgraded kernels and larger neighbour numbers and employ local viscosity limiters. Unique features in Gasoline2 include its Geometric Density Average Force expression, explicit Turbulent Diffusion terms and Gradient-Based shock detection to limit artificial viscosity. This last feature allows Gasoline2 to completely avoid artificial viscosity in non-shocking compressive flows. We present a suite of tests demonstrating the value of these features with the same code configuration and parameter choices used for production simulations.

  17. A multidisciplinary approach to vascular surgery procedure coding improves coding accuracy, work relative value unit assignment, and reimbursement.

    Science.gov (United States)

    Aiello, Francesco A; Judelson, Dejah R; Messina, Louis M; Indes, Jeffrey; FitzGerald, Gordon; Doucet, Danielle R; Simons, Jessica P; Schanzer, Andres

    2016-08-01

    Vascular surgery procedural reimbursement depends on accurate procedural coding and documentation. Despite the critical importance of correct coding, there has been a paucity of research focused on the effect of direct physician involvement. We hypothesize that direct physician involvement in procedural coding will lead to improved coding accuracy, increased work relative value unit (wRVU) assignment, and increased physician reimbursement. This prospective observational cohort study evaluated procedural coding accuracy of fistulograms at an academic medical institution (January-June 2014). All fistulograms were coded by institutional coders (traditional coding) and by a single vascular surgeon whose codes were verified by two institution coders (multidisciplinary coding). The coding methods were compared, and differences were translated into revenue and wRVUs using the Medicare Physician Fee Schedule. Comparison between traditional and multidisciplinary coding was performed for three discrete study periods: baseline (period 1), after a coding education session for physicians and coders (period 2), and after a coding education session with implementation of an operative dictation template (period 3). The accuracy of surgeon operative dictations during each study period was also assessed. An external validation at a second academic institution was performed during period 1 to assess and compare coding accuracy. During period 1, traditional coding resulted in a 4.4% (P = .004) loss in reimbursement and a 5.4% (P = .01) loss in wRVUs compared with multidisciplinary coding. During period 2, no significant difference was found between traditional and multidisciplinary coding in reimbursement (1.3% loss; P = .24) or wRVUs (1.8% loss; P = .20). During period 3, traditional coding yielded a higher overall reimbursement (1.3% gain; P = .26) than multidisciplinary coding. This increase, however, was due to errors by institution coders, with six inappropriately used codes

  18. Software exorcism a handbook for debugging and optimizing legacy code

    CERN Document Server

    Blunden, Bill

    2013-01-01

    Software Exorcism: A Handbook for Debugging and Optimizing Legacy Code takes an unflinching, no-bull look at behavioral problems in the software engineering industry, shedding much-needed light on the social forces that make it difficult for programmers to do their job. Do you have a co-worker who perpetually writes bad code that you are forced to clean up? This is your book. While there are plenty of books on the market that cover debugging and short-term workarounds for bad code, Reverend Bill Blunden takes a revolutionary step beyond them by bringing our atten

  19. ALEPH2 - A general purpose Monte Carlo depletion code

    Energy Technology Data Exchange (ETDEWEB)

    Stankovskiy, A.; Van Den Eynde, G.; Baeten, P. [SCK CEN, Boeretang 200, B-2400 Mol (Belgium); Trakas, C.; Demy, P. M.; Villatte, L. [AREVA NP, Tour AREVA, Pl. J. Millier, 92084 Paris La Defense (France)

    2012-07-01

    The Monte-Carlo burn-up code ALEPH has been developed at SCK-CEN since 2004. A previous version of the code implemented the coupling between the Monte Carlo transport (any version of MCNP or MCNPX) and the 'deterministic' depletion code ORIGEN-2.2, but had important deficiencies in nuclear data treatment and limitations inherent to ORIGEN-2.2. A new version of the code, ALEPH2, has several unique features making it outstanding among other depletion codes. The most important feature is full data consistency between steady-state Monte Carlo and time-dependent depletion calculations. The latest generation of general-purpose nuclear data libraries (JEFF-3.1.1, ENDF/B-VII and JENDL-4) is fully implemented, including special purpose activation, spontaneous fission, fission product yield and radioactive decay data. The built-in depletion algorithm makes it possible to eliminate the uncertainties associated with obtaining the time-dependent nuclide concentrations. A predictor-corrector mechanism and the calculation of nuclear heating, decay heat and decay neutron sources are available as well. The code has been validated against the results of the REBUS experimental programme, and ALEPH2 has shown better agreement with measured data than other depletion codes. (authors)

  20. EMdeCODE: a novel algorithm capable of reading words of epigenetic code to predict enhancers and retroviral integration sites and to identify H3R2me1 as a distinctive mark of coding versus non-coding genes.

    Science.gov (United States)

    Santoni, Federico Andrea

    2013-02-01

    The existence of extra-genetic (epigenetic) codes has been postulated since the discovery of the primary genetic code. Evident effects of histone post-translational modifications or DNA methylation on the efficiency and regulation of DNA processes support this postulate. EMdeCODE is an original algorithm that approximates the genomic distribution of given DNA features (e.g. promoter, enhancer, viral integration) by identifying relevant ChIPSeq profiles of post-translational histone marks or DNA-binding proteins and combining them into a supermark. The EMdeCODE kernel is essentially a two-step procedure: (i) an expectation-maximization process calculates the mixture of epigenetic factors that maximizes the sensitivity (recall) of the association with the feature under study; (ii) the approximated density is then recursively trimmed with respect to a control dataset to increase precision by reducing the number of false positives. EMdeCODE densities improve significantly the prediction of enhancer loci and retroviral integration sites with respect to previous methods. Importantly, it can also be used to extract distinctive factors between two arbitrary conditions. Indeed, EMdeCODE identifies unexpected epigenetic profiles specific to coding versus non-coding RNA, pointing towards a new role for H3R2me1 in coding regions.

  1. GPU Optimizations for a Production Molecular Docking Code*

    OpenAIRE

    Landaverde, Raphael; Herbordt, Martin C.

    2014-01-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes a...

  2. A program evaluation of classroom data collection with bar codes.

    Science.gov (United States)

    Saunders, M D; Saunders, J L; Saunders, R R

    1993-01-01

    A technology incorporating bar code symbols and hand-held optical scanners was evaluated for its utility for routine data collection in a special education classroom. A different bar code symbol was created for each Individualized Educational Plan objective, each type of response occurrence, and each student in the first author's classroom. These symbols were organized by activity and printed as data sheets. The teacher and paraprofessionals scanned relevant codes with scanners when the students emitted targeted behaviors. The codes, dates, and approximate times of the scans were retained in the scanner's electronic memory until they could be transferred by communication software to a computer file. The data from the computer file were organized weekly into a printed report of student performance using a program written with commercially available database software. Advantages, disadvantages, and costs of using the system are discussed.

  3. A new tree code method for simulation of planetesimal dynamics

    Science.gov (United States)

    Richardson, D. C.

    1993-03-01

    A new tree code method for simulation of planetesimal dynamics is presented. A self-similarity argument is used to restrict the problem to a small patch of a ring of planetesimals at 1 AU from the sun. The code incorporates a sliding box model with periodic boundary conditions and surrounding ghost particles. The tree is self-repairing and exploits the flattened nature of Keplerian disks to maximize efficiency. The code uses a fourth-order force polynomial integration algorithm with individual particle time-steps. Collisions and mergers, which play an important role in planetesimal evolution, are treated in a comprehensive manner. In typical runs with a few hundred central particles, the tree code is approximately 2-3 times faster than a recent direct summation method and requires about 1 CPU day on a Sparc IPX workstation to simulate 100 yr of evolution. The average relative force error incurred in such runs is less than 0.2 per cent in magnitude. In general, the CPU time as a function of particle number varies in a way consistent with an O(N log N) algorithm. In order to take advantage of facilities available, the code was written in C in a Unix workstation environment. The unique aspects of the code are discussed in detail and the results of a number of performance tests - including a comparison with previous work - are presented.

  4. A new QMD code for heavy-ion collisions

    Science.gov (United States)

    Kim, Kyungil; Kim, Youngman; Lee, Kang Seog

    2017-11-01

    We develop a new quantum molecular dynamics (QMD) type nuclear transport code to simulate heavy-ion collisions for RAON, a new accelerator complex under construction in Korea. At RAON, rare isotope beams with energies from a few MeV/n to a few hundred MeV/n will be utilized. QMD is one of the widely used theoretical methods and is useful for both theoretical and experimental purposes. We describe our QMD model with its numerical realization. The validity of the code is tested by comparing our simulation results with experimental data and also with results from other transport codes for 197Au+197Au collisions at Elab = 90 - 120 MeV/n. Finally, we present a brief discussion on the applicability and outlook of our code.

  5. Design of a VLSI Decoder for Partially Structured LDPC Codes

    Directory of Open Access Journals (Sweden)

    Fabrizio Vacca

    2008-01-01

    of their parity matrix can be partitioned into two disjoint sets, namely the structured and the random ones. For the proposed class of codes a constructive design method is provided. To assess the value of this method, the performance of the constructed codes is presented. From these results, a novel decoding method called split decoding is introduced. Finally, to prove the effectiveness of the proposed approach, a whole VLSI decoder is designed and characterized.

  6. FIFPC, a fast ion Fokker--Planck code

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, R.H.; Callen, J.D.; Rome, J.A.; Smith, J.

    1976-07-01

    A computer code is described which solves the Fokker--Planck equation for the velocity space distribution of fast ions injected into a tokamak plasma. The numerical techniques are described and use of the code is outlined. The program is written in FORTRAN IV and is modularized in order to provide greater flexibility to the user. A program listing is provided and the results of sample cases are presented.

  7. GPEC, a real-time capable Tokamak equilibrium code

    CERN Document Server

    Rampp, Markus; Fischer, Rainer

    2015-01-01

    A new parallel equilibrium reconstruction code for tokamak plasmas is presented. GPEC makes it possible to compute equilibrium flux distributions sufficiently accurate to derive parameters for plasma control within 1 ms of runtime, which enables real-time applications at the ASDEX Upgrade experiment (AUG) and other machines with a control cycle of at least this size. The underlying algorithms are based on the well-established offline-analysis code CLISTE, following the classical concept of iteratively solving the Grad-Shafranov equation and feeding in diagnostic signals from the experiment. The new code adopts a hybrid parallelization scheme for computing the equilibrium flux distribution and extends the fast, shared-memory-parallel Poisson solver which we have described previously by a distributed computation of the individual Poisson problems corresponding to different basis functions. The code is based entirely on open-source software components and runs on standard server hardware and software environments. The real-...

  8. POPCORN: A comparison of binary population synthesis codes

    Science.gov (United States)

    Claeys, J. S. W.; Toonen, S.; Mennekens, N.

    2013-01-01

    We compare the results of three binary population synthesis codes to understand the differences in their results. As a first result we find that when equalizing the assumptions the results are similar. The main differences arise from deviating physical input.

  9. Becoming Inclusive: A Code of Conduct for Inclusion and Diversity.

    Science.gov (United States)

    Schmidt, Bonnie J; MacWilliams, Brent R; Neal-Boylan, Leslie

    There are increasing concerns about exclusionary behaviors and lack of diversity in the nursing profession. Exclusionary behaviors, which may include incivility, bullying, and workplace violence, discriminate and isolate individuals and groups who are different, whereas inclusive behaviors encourage diversity. To address inclusion and diversity in nursing, this article offers a code of conduct. This code of conduct builds on existing nursing codes of ethics and applies to nursing students and nurses in both educational and practice settings. Inclusive behaviors that are demonstrated in nurses' relationships with patients, colleagues, the profession, and society are described. This code of conduct provides a basis for measurable change, empowerment, and unification of the profession. Recommendations, implications, and a pledge to action are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. A Monte Carlo code for ion beam therapy

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    Initially developed for applications in detector and accelerator physics, the modern Fluka Monte Carlo code is now used in many different areas of nuclear science. Over the last 25 years, the code has evolved to include new features, such as ion beam simulations. Given the growing use of these beams in cancer treatment, Fluka simulations are being used to design treatment plans in several hadron-therapy centres in Europe.   Fluka calculates the dose distribution for a patient treated at CNAO with proton beams. The colour-bar displays the normalized dose values. Fluka is a Monte Carlo code that very accurately simulates electromagnetic and nuclear interactions in matter. In the 1990s, in collaboration with NASA, the code was developed to predict potential radiation hazards received by space crews during possible future trips to Mars. Over the years, it has become the standard tool to investigate beam-machine interactions, radiation damage and radioprotection issues in the CERN accelerator com...

  11. Code Recognition Device for Automobile, a Panacea for Automobiles Theft

    Directory of Open Access Journals (Sweden)

    Ozomata David AHMED

    2011-06-01

    Full Text Available Code Recognition Device is a security device for automobiles. It responds only to the right sequence of codes keyed in from the keypad; this closes the electrical circuitry of the automobile and enables it to start. If a wrong key is touched, the device resets, which disengages the electrical circuit of the automobile from the power supply. The device operates only when all the doors of the automobile are closed; otherwise the engine cannot start. Also, once the automobile is in operation, opening any door will disengage the device and the engine will stop. To restart the engine, the doors must be closed and the codes entered sequentially - in this case the codes are 1974.
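
    A hedged sketch of the keypad logic described above: only the exact sequence 1-9-7-4 enables the circuit, any wrong key resets the device, and opening a door disengages it. Class and method names are illustrative, not the device's firmware.

    class CodeRecognitionDevice:
        SEQUENCE = "1974"

        def __init__(self):
            self.position = 0
            self.enabled = False

        def press(self, key):
            if self.enabled:
                return                                            # already enabled, ignore further keys
            if key == self.SEQUENCE[self.position]:
                self.position += 1
                if self.position == len(self.SEQUENCE):
                    self.enabled = True                           # full code entered
            else:
                self.position = 0                                 # wrong key: reset the device
                self.enabled = False

        def engine_may_run(self, all_doors_closed):
            if not all_doors_closed:                              # opening a door disengages the device
                self.position = 0
                self.enabled = False
            return self.enabled

    device = CodeRecognitionDevice()
    for key in "1974":
        device.press(key)
    print(device.engine_may_run(all_doors_closed=True))           # True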

  12. FBCOT: a fast block coding option for JPEG 2000

    Science.gov (United States)

    Taubman, David; Naman, Aous; Mathew, Reji

    2017-09-01

    Based on the EBCOT algorithm, JPEG 2000 finds application in many fields, including high performance scientific, geospatial and video coding applications. Beyond digital cinema, JPEG 2000 is also attractive for low-latency video communications. The main obstacle for some of these applications is the relatively high computational complexity of the block coder, especially at high bit-rates. This paper proposes a drop-in replacement for the JPEG 2000 block coding algorithm, achieving much higher encoding and decoding throughputs, with only modest loss in coding efficiency. The proposed FAST block coder can be used with EBCOT's post-compression RD-optimization methodology, allowing a target compressed bit-rate to be achieved even at low latencies, leading to the name FBCOT (Fast Block Coding with Optimized Truncation).

  13. The Nuremberg Code and the Nuremberg Trial. A reappraisal.

    Science.gov (United States)

    Katz, J

    1996-11-27

    The Nuremberg Code includes 10 principles to guide physician-investigators in experiments involving human subjects. These principles, particularly the first principle on "voluntary consent," primarily were based on legal concepts because medical codes of ethics existent at the time of the Nazi atrocities did not address consent and other safeguards for human subjects. The US judges who presided over the proceedings did not intend the Code to apply only to the case before them, to be a response to the atrocities committed by the Nazi physicians, or to be inapplicable to research as it is customarily carried on in medical institutions. Instead, a careful reading of the judgment suggests that they wrote the Code for the practice of human experimentation whenever it is being conducted.

  14. Review and Evaluation of a Turbomachinery Throughflow Finite Element Code

    Science.gov (United States)

    1989-06-01

    Rotor Tip Section ... Figure 27. Computational Mesh for the Blade-to-Blade Solution ... Figure 28. Iso-pressure ... computation, the results are presented here for Case 5 at a flow coefficient of 0.61. The computed iso-pressure lines on the axisymmetric stream surface at

  15. Evolutionary analysis of DNA-protein-coding regions based on a genetic code cube metric.

    Science.gov (United States)

    Sanchez, Robersy

    2014-01-01

    The right estimation of the evolutionary distance between DNA or protein sequences is the cornerstone of current phylogenetic analysis based on distance methods. Herein, it is demonstrated that the Manhattan distance (dw), weighted by the evolutionary importance of the nucleotide bases in the codon, is a naturally derived metric in the standard genetic code cube inserted into three-dimensional Euclidean space. Based on the application of the distance dw, a novel evolutionary model is proposed. This model includes insertion/deletion mutations that are very important for cancer studies but usually discarded in classical evolutionary models. In this study, the new evolutionary model was applied to the phylogenetic analysis of the DNA protein-coding regions of 13 mammal mitochondrial genomes and of four cancer genetic-susceptibility genes (ATM, BRCA1, BRCA2 and p53) from nine mammals. The opossum (a marsupial) was used as an out-group species for both sets of sequences. The new evolutionary model yielded the correct topology, while the current models failed to separate the evolutionarily distant species of mouse and opossum.
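
    A minimal sketch of a weighted Manhattan distance of the kind described above, for two codons represented as points x = (x_1, x_2, x_3) and y = (y_1, y_2, y_3) of the genetic-code cube; the per-position weights w_i, standing for the evolutionary importance of each base position, are left generic here and the exact weighting used in the paper may differ:

    \[
      d_{w}(x, y) \;=\; \sum_{i=1}^{3} w_{i}\,\lvert x_{i} - y_{i} \rvert .
    \]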

  16. Code Saturne: A Finite Volume Code for the computation of turbulent incompressible flows - Industrial Applications

    OpenAIRE

    Archambeau, Frédéric; Méchitoua, Namane; Sakiz, Marc

    2004-01-01

    International audience; This paper describes the finite volume method implemented in Code Saturne, Électricité de France's general-purpose computational fluid dynamics code for laminar and turbulent flows in complex two- and three-dimensional geometries. The code is used for industrial applications and research activities in several fields related to energy production (nuclear power thermal-hydraulics, gas and coal combustion, turbomachinery, heating, ventilation and air conditioning...). The se...

  17. Code Blue Emergencies: A Team Task Analysis and Educational Initiative.

    Science.gov (United States)

    Price, James W; Applegarth, Oliver; Vu, Mark; Price, John R

    2012-01-01

    The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR) and post-operative recovery unit (PAR) at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH) were invited to complete an anonymous, 10-minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operating room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists and their feedback was integrated into the final version of the survey. Both nursing staff (n = 49) and anesthesiologists (n = 19) supported code blue training and believed that team training would improve patient outcomes. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.

  18. Cyclone Codes

    OpenAIRE

    Schindelhauer, Christian; Jakoby, Andreas; Köhler, Sven

    2016-01-01

    We introduce Cyclone codes which are rateless erasure resilient codes. They combine Pair codes with Luby Transform (LT) codes by computing a code symbol from a random set of data symbols using bitwise XOR and cyclic shift operations. The number of data symbols is chosen according to the Robust Soliton distribution. XOR and cyclic shift operations establish a unitary commutative ring if data symbols have a length of $p-1$ bits, for some prime number $p$. We consider the graph given by code sym...
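
    A hedged sketch of the encoding step described above: a code symbol is the bitwise XOR of cyclically shifted data symbols, with the number of data symbols drawn from a soliton distribution. For brevity the ideal soliton is used here in place of the robust soliton, symbols are short toy bit lists, and decoding is omitted; all names are illustrative.

    import random

    def cyclic_shift(bits, k):
        """Rotate a list of bits right by k positions."""
        k %= len(bits)
        return bits[-k:] + bits[:-k]

    def xor(a, b):
        return [x ^ y for x, y in zip(a, b)]

    def soliton_degree(n):
        """Sample a degree from the ideal soliton distribution (stand-in for the robust soliton)."""
        u, cum = random.random(), 1.0 / n                         # P(1) = 1/n
        if u < cum:
            return 1
        for d in range(2, n + 1):
            cum += 1.0 / (d * (d - 1))                            # P(d) = 1/(d(d-1))
            if u < cum:
                return d
        return n

    def encode_symbol(data):
        """data: equal-length bit lists (length p-1 in the paper's setting)."""
        degree = soliton_degree(len(data))
        chosen = random.sample(range(len(data)), degree)
        symbol = [0] * len(data[0])
        for i in chosen:
            shift = random.randrange(len(data[i]))
            symbol = xor(symbol, cyclic_shift(data[i], shift))    # XOR of cyclically shifted symbols
        return chosen, symbol

    data = [[random.randint(0, 1) for _ in range(6)] for _ in range(8)]   # p = 7, so 6-bit symbols
    print(encode_symbol(data))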

  19. FLASH: A finite element computer code for variably saturated flow

    Energy Technology Data Exchange (ETDEWEB)

    Baca, R.G.; Magnuson, S.O.

    1992-05-01

    A numerical model was developed for use in performance assessment studies at the INEL. The numerical model, referred to as the FLASH computer code, is designed to simulate two-dimensional fluid flow in fractured-porous media. The code is specifically designed to model variably saturated flow in an arid site vadose zone and saturated flow in an unconfined aquifer. In addition, the code also has the capability to simulate heat conduction in the vadose zone. This report presents the following: description of the conceptual framework and mathematical theory; derivations of the finite element techniques and algorithms; computational examples that illustrate the capability of the code; and input instructions for the general use of the code. The FLASH computer code is aimed at providing environmental scientists at the INEL with a predictive tool for the subsurface water pathway. This numerical model is expected to be widely used in performance assessments for: (1) the Remedial Investigation/Feasibility Study process and (2) compliance studies required by the US Department of Energy Order 5820.2A.

  20. Conjugate heat transfer study of a wire spacer SFR fuel assembly thanks to the thermal code SYRTHES and the CFD code Code_Saturne

    Science.gov (United States)

    Péniguel, C.; Rupp, I.; Rolfo, S.; Hermouet, D.

    2014-06-01

    The paper presents an HPC calculation of a conjugate heat transfer simulation in a fuel assembly such as those found in liquid-metal-cooled fast reactors. The wire spacers, helically wound along each pin axis, generate a strong secondary flow pattern in contrast to smooth pins. Assemblies with a range of pin counts from 7 to 271 have been simulated, 271 pins corresponding to the industrial case. Both the fluid domain and the solid part are detailed, leading to large meshes. The fluid is handled by the CFD code Code_Saturne using 98 million cells, while the solid domain is handled by the thermal code SYRTHES on meshes of up to 240 million cells. Both codes are fully parallel and run on clusters with hundreds of processors. The simulations give access to the temperature field in nominal conditions and degraded situations.

  1. PROSA-1: a probabilistic response-surface analysis code. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, J. K.; Mueller, C.

    1978-06-01

    Techniques for probabilistic response-surface analysis have been developed to obtain the probability distributions of the consequences of postulated nuclear-reactor accidents. The uncertainties of the consequences are caused by the variability of the system and model input parameters used in the accident analysis. Probability distributions are assigned to the input parameters, and parameter values are systematically chosen from these distributions. These input parameters are then used in deterministic consequence analyses performed by mechanistic accident-analysis codes. The results of these deterministic consequence analyses are used to generate the coefficients for analytical functions that approximate the consequences in terms of the selected input parameters. These approximating functions are used to generate the probability distributions of the consequences with random sampling being used to obtain values for the accident parameters from their distributions. A computer code PROSA has been developed for implementing the probabilistic response-surface technique. Special features of the code generate or treat sensitivities, statistical moments of the input and output variables, regionwise response surfaces, correlated input parameters, and conditional distributions. The code can also be used for calculating important distributions of the input parameters. The use of the code is illustrated in conjunction with the fast-running accident-analysis code SACO to provide probability studies of LMFBR hypothetical core-disruptive accidents. However, the methods and the programming are general and not limited to such applications.
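
    A hedged sketch of the response-surface workflow described above, with a cheap analytic placeholder standing in for the mechanistic accident-analysis code; the input distributions, the linear-plus-interaction surface and all names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def deterministic_model(x1, x2):                              # placeholder for a code such as SACO
        return 3.0 + 2.0 * x1 - 1.5 * x2 + 0.5 * x1 * x2

    # 1) sample the uncertain input parameters from their assigned distributions
    x1 = rng.normal(1.0, 0.2, 50)
    x2 = rng.uniform(0.5, 1.5, 50)
    y = deterministic_model(x1, x2)

    # 2) fit an analytical response surface (here: linear terms plus one interaction term)
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

    # 3) propagate the input uncertainty cheaply by random sampling of the fitted surface
    s1 = rng.normal(1.0, 0.2, 100000)
    s2 = rng.uniform(0.5, 1.5, 100000)
    surface = np.column_stack([np.ones_like(s1), s1, s2, s1 * s2]) @ coeffs
    print(surface.mean(), np.percentile(surface, 95))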

  2. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we consider some relationships between coding partitions and varieties of codes.

  3. HADES, A Code for Simulating a Variety of Radiographic Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M B; Henderson, G; von Wittenau, A; Slone, D M; Barty, A; Martz, Jr., H E

    2004-10-28

    It is often useful to simulate radiographic images in order to optimize imaging trade-offs and to test tomographic techniques. HADES is a code that simulates radiography using ray tracing techniques. Although originally developed to simulate X-Ray transmission radiography, HADES has grown to simulate neutron radiography over a wide range of energy, proton radiography in the 1 MeV to 100 GeV range, and recently phase contrast radiography using X-Rays in the keV energy range. HADES can simulate parallel-ray or cone-beam radiography through a variety of mesh types, as well as through collections of geometric objects. HADES was originally developed for nondestructive evaluation (NDE) applications, but could be a useful tool for simulation of portal imaging, proton therapy imaging, and synchrotron studies of tissue. In this paper we describe HADES' current capabilities and discuss plans for a major revision of the code.
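
    A generic, hedged illustration (not HADES itself) of how transmission radiography can be simulated by ray tracing: the detected intensity along a ray follows the Beer-Lambert law, I = I0 * exp(-sum_i mu_i * L_i), where L_i is the path length through object i with attenuation coefficient mu_i. The sphere geometry and the value of mu are assumptions.

    import numpy as np

    def chord_through_sphere(origin, direction, center, radius):
        """Length of the ray segment inside a sphere (0.0 if the ray misses it)."""
        d = direction / np.linalg.norm(direction)
        oc = origin - center
        b = np.dot(oc, d)
        disc = b * b - (np.dot(oc, oc) - radius ** 2)
        return 0.0 if disc <= 0 else 2.0 * np.sqrt(disc)

    mu = 0.15                                                     # assumed attenuation coefficient, 1/cm
    L = chord_through_sphere(np.array([0.0, 0.0, -10.0]),
                             np.array([0.0, 0.0, 1.0]),
                             np.array([0.0, 0.0, 0.0]), radius=2.0)
    print(L, np.exp(-mu * L))                                     # chord of 4 cm through the centre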

  4. A new approach to codeword stabilized quantum codes using the algebraic structure of modules

    OpenAIRE

    Santiago, Douglas Frederico Guimarães; Otoni, Geraldo Samuel Sena

    2015-01-01

    In this work, we study codeword stabilized quantum codes (CWS codes), a generalization of stabilizer quantum codes, using a new approach: the algebraic structure of modules, a generalization of linear spaces. We then show a new result that relates CWS codes to stabilizer codes, generalizing results in the literature.

  5. A colorful origin for the genetic code: information theory, statistical mechanics and the emergence of molecular codes.

    Science.gov (United States)

    Tlusty, Tsvi

    2010-09-01

    The genetic code maps the sixty-four nucleotide triplets (codons) to twenty amino-acids. While the biochemical details of this code were unraveled long ago, its origin is still obscure. We review information-theoretic approaches to the problem of the code's origin and discuss the results of a recent work that treats the code in terms of an evolving, error-prone information channel. Our model - which utilizes the rate-distortion theory of noisy communication channels - suggests that the genetic code originated as a result of the interplay of the three conflicting evolutionary forces: the needs for diverse amino-acids, for error-tolerance and for minimal cost of resources. The description of the code as an information channel allows us to mathematically identify the fitness of the code and locate its emergence at a second-order phase transition when the mapping of codons to amino-acids becomes nonrandom. The noise in the channel brings about an error-graph, in which edges connect codons that are likely to be confused. The emergence of the code is governed by the topology of the error-graph, which determines the lowest modes of the graph-Laplacian and is related to the map coloring problem. (c) 2010 Elsevier B.V. All rights reserved.
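
    As a hedged reminder of the rate-distortion language used above (a generic Lagrangian, not necessarily the exact functional of the model), the stochastic reading map from codons c to amino acids is chosen to balance channel rate against the average cost of misreading:

    \[
      \min_{p(\hat{a}\mid c)} \; I(C; \hat{A}) \;+\; \beta\, \big\langle d(c, \hat{a}) \big\rangle ,
    \]

    where d(c, \hat{a}) is the cost of reading codon c as amino acid \hat{a} and \beta sets the trade-off between the rate of the noisy channel and the distortion it tolerates.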

  6. A need for a code of ethics in science communication?

    Science.gov (United States)

    Benestad, R. E.

    2009-09-01

    The modern western civilization and high standard of living are to a large extent the 'fruits' of scientific endeavor over generations. Some examples include the longer life expectancy due to progress in the medical sciences, and changes in infrastructure associated with the utilization of electromagnetism. Modern meteorology is not possible without state-of-the-art digital computers, satellites, remote sensing, and communications. Science is also of relevance for policy making, e.g. the present hot topic of climate change. Climate scientists have recently become much exposed to media focus and mass communication, a task for which many are not trained. Furthermore, science, communication, and politics have different objectives, and do not necessarily mix. Scientists have an obligation to provide unbiased information, and a code of ethics is needed to give guidance on acceptable and unacceptable conduct. Some examples of questionable conduct in Norway include using the title 'Ph.D' to imply scientific authority when the person never obtained such an academic degree, or writing biased and one-sided articles in a Norwegian encyclopedia that do not reflect the scientific consensus. It is proposed here that a set of guidelines (for scientists and journalists) and a code of conduct could provide recommendations on how to act in the media - similar to a code of conduct for carrying out research - to which everyone could agree, even when disagreeing on specific scientific questions.

  7. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), the Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium-cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly removing unnecessary routines, eliminating obsolete statements, introducing new ones and also enabling extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  8. HipGISAXS: A Massively Parallel Code for GISAXS Simulation

    Science.gov (United States)

    Chourou, Slim; Sarje, Abhinav; Li, Xiaoye; Chan, Elaine; Hexemer, Alexander; Hipgisaxs Team

    2013-03-01

    Grazing Incidence Small-Angle Scattering (GISAXS) is a valuable experimental technique for probing nanostructures of relevance to polymer science. New high-performance computing algorithms, codes, and software tools have been implemented to analyze GISAXS images generated at synchrotron light sources. We have developed flexible massively parallel GISAXS simulation software "HipGISAXS" based on the Distorted Wave Born Approximation (DWBA). The software computes the diffraction pattern for any given superposition of custom shapes or morphologies in a user-defined region of the reciprocal space for all possible grazing incidence angles and sample rotations. This flexibility allows a straightforward study of a wide variety of possible polymer topologies and assemblies whether embedded in a thin film or a multilayered structure. Hence, this code enables guided investigations of the morphological and dynamical properties of relevance in various applications. The current parallel code is capable of computing GISAXS images for highly complex structures and with high resolutions, attaining speedups of 200x on a single-node GPU compared to the sequential code. Moreover, the multi-GPU (CPU) code achieved an additional 900x (4000x) speedup on 930 GPU (6000 CPU) nodes. This work was supported by the Director, Office of Science, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  9. A neutron spectrum unfolding code based on iterative procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Apdo. Postal 336, 98000 Zacatecas (Mexico)

    2012-10-15

    In this work, version 3.0 of the neutron spectrum unfolding code called Neutron Spectrometry and Dosimetry from Universidad Autonoma de Zacatecas (NSDUAZ) is presented. This code was designed with a graphical interface in the LabVIEW programming environment and is based on the iterative SPUNIT algorithm, using as input data only the count rates obtained with 7 Bonner spheres based on a 6LiI(Eu) neutron detector. The main features of the code are: it is intuitive and friendly to the user; it has a programming routine which automatically selects the initial guess spectrum from a set of neutron spectra compiled by the International Atomic Energy Agency. Besides the neutron spectrum, this code calculates the total flux, the mean energy, H(10), h(10), 15 dosimetric quantities for radiation protection purposes and 7 survey meter responses, in four energy grids, based on the International Atomic Energy Agency compilation. The code generates a full report in html format with all relevant information. In this work, the neutron spectrum of a 241AmBe neutron source in air, located at 150 cm from the detector, is unfolded. (Author)
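
    As a hedged illustration of what an iterative, multiplicative unfolding step can look like (an MLEM-style update shown for concreteness, not the actual SPUNIT algorithm), each energy bin of the guess spectrum is rescaled by the response-weighted ratio of measured to predicted sphere count rates; all names, the toy response matrix and the fixed iteration count are assumptions.

    import numpy as np

    def iterative_unfold(R, counts, guess, n_iter=200):
        """R: (n_spheres, n_bins) response matrix; counts: measured rates; guess: initial spectrum."""
        phi = guess.astype(float).copy()
        col_sum = np.where(R.sum(axis=0) > 0, R.sum(axis=0), 1.0)
        for _ in range(n_iter):
            predicted = R @ phi                                   # expected sphere count rates
            ratio = counts / np.where(predicted > 0, predicted, 1.0)
            phi *= (R / col_sum).T @ ratio                        # multiplicative update per energy bin
        return phi

    # toy check: unfold count rates generated by the response matrix itself
    R = np.array([[1.0, 0.5, 0.1], [0.2, 1.0, 0.5], [0.1, 0.3, 1.0]])
    true_phi = np.array([2.0, 1.0, 0.5])
    print(iterative_unfold(R, R @ true_phi, np.ones(3)))          # should approach true_phi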

  10. code {poems}

    Directory of Open Access Journals (Sweden)

    Ishac Bertran

    2012-08-01

    Full Text Available "Exploring the potential of code to communicate at the level of poetry," the code­ {poems} project solicited submissions from code­writers in response to the notion of a poem, written in a software language which is semantically valid. These selections reveal the inner workings, constitutive elements, and styles of both a particular software and its authors.

  11. GPU Optimizations for a Production Molecular Docking Code.

    Science.gov (United States)

    Landaverde, Raphael; Herbordt, Martin C

    2014-09-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users.

  12. GPU Optimizations for a Production Molecular Docking Code*

    Science.gov (United States)

    Landaverde, Raphael; Herbordt, Martin C.

    2015-01-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4-core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users. PMID:26594667

  13. RODMOD: a code for control rod positioning. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.

    1978-11-01

    The report documents a computer code which has been implemented to position control rods according to a prescribed schedule during the calculation of a reactor history. Control rods may be represented explicitly with or without internal black absorber conditions in selected energy groups, or fractional insertion may be done, or both, in a problem. There is provision for control rod follower, movement of materials through a series of zones in a closed loop, and shutdown rod insertion and subsequent removal to allow the reactor history calculation to be continued. This code is incorporated in the system containing the VENTURE diffusion theory neutronics and the BURNER exposure codes for routine use. The implemented automated procedures cause the prescribed control rod insertion schedule to be applied without the access of additional user input data during the calculation of a reactor operating history.

  14. A Coach's Code of Conduct. Position Statement

    Science.gov (United States)

    Lyman, Linda; Ewing, Marty; Martino, Nan

    2009-01-01

    Coaches exert a profound impact on our youths; therefore, society sets high expectations for them. As such, whether coaches are compensated or work solely as volunteers, they are responsible for executing coaching as a professional. If we are to continue to enhance the cultural perceptions of coaching, we must strive to develop and master the…

  15. A neural coding scheme reproducing foraging trajectories

    Science.gov (United States)

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-12-01

    The movement of many animals may follow Lévy patterns. The underlying generating neuronal dynamics of such a behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two-dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance-based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open field foraging. Further insights are gained by analyzing mouse motor cortex neurons and non-motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates a previously unknown way to encode information in neuronal temporal series.

  16. A TDM link with channel coding and digital voice.

    Science.gov (United States)

    Jones, M. W.; Tu, K.; Harton, P. L.

    1972-01-01

    The features of a TDM (time-division multiplexed) link model are described. A PCM telemetry sequence was coded for error correction and multiplexed with a digitized voice channel. An all-digital implementation of a variable-slope delta modulation algorithm was used to digitize the voice channel. The results of extensive testing are reported. The measured coding gain and the system performance over a Gaussian channel are compared with theoretical predictions and computer simulations. Word intelligibility scores are reported as a measure of voice channel performance.
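
    As an illustration of the voice-channel digitization mentioned above, the sketch below implements a generic variable-slope (CVSD-style) delta modulator and demodulator. The step-size bounds and adaptation constants are arbitrary placeholders and are not taken from the paper's all-digital implementation.

      # Illustrative variable-slope delta modulation (CVSD-style) encoder/decoder.
      # Parameters below are arbitrary; the paper's coder is an all-digital hardware design.
      import numpy as np

      def cvsd_encode(x, step_min=0.01, step_max=1.0, grow=1.5, decay=0.9):
          bits, est, step, history = [], 0.0, step_min, []
          for sample in x:
              bit = 1 if sample >= est else 0
              bits.append(bit)
              history = (history + [bit])[-3:]
              # slope adaptation: grow the step on runs of identical bits, otherwise decay it
              if len(history) == 3 and len(set(history)) == 1:
                  step = min(step * grow, step_max)
              else:
                  step = max(step * decay, step_min)
              est += step if bit else -step
          return bits

      def cvsd_decode(bits, step_min=0.01, step_max=1.0, grow=1.5, decay=0.9):
          out, est, step, history = [], 0.0, step_min, []
          for bit in bits:
              history = (history + [bit])[-3:]
              if len(history) == 3 and len(set(history)) == 1:
                  step = min(step * grow, step_max)
              else:
                  step = max(step * decay, step_min)
              est += step if bit else -step
              out.append(est)
          return np.array(out)

      t = np.linspace(0, 1, 8000)
      speech_like = 0.5 * np.sin(2 * np.pi * 300 * t)    # toy stand-in for a voice signal
      decoded = cvsd_decode(cvsd_encode(speech_like))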

  17. A STUDY OF CODE SWITCHING AND CODE MIXING IN EFL CLASSROOM: A SURVEY OF CLASSROOM INTERACTION AT ENGLISH EDUCATION STUDY PROGRAM OF UIN RADEN FATAH PALEMBANG

    Directory of Open Access Journals (Sweden)

    Annisa Astrid

    2017-04-01

    Full Text Available The study was conducted in order to investigate the phenomena of code switching and code mixing in the EFL classroom. The writer collected the data from the English Education Study Program of UIN Raden Fatah Palembang. Four classes were observed for the phenomena of code switching and code mixing. A questionnaire was given to 120 students and 15 lecturers of English in order to assess their attitudes toward and feedback on the use of code switching and code mixing. The results of the study showed that the lecturers and the students employed code switching and code mixing, with various patterns and considerations, in the interactions that occurred during teaching and learning activities. Finally, the questionnaire data reflect a positive attitude toward the use of code switching and code mixing during teaching and learning activities in the classroom.

  18. A unified model of the standard genetic code.

    Science.gov (United States)

    José, Marco V; Zamudio, Gabriel S; Morgado, Eberto R

    2017-03-01

    The Rodin-Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but they are biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N any of them). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether these results cannot be attained, neither in two nor in three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model.

  19. Development of a subchannel analysis code MATRA (Ver. {alpha})

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Y. J.; Hwang, D. H

    1998-04-01

    A subchannel analysis code MATRA-{alpha}, an interim version of MATRA, has been developed to be run on an IBM PC or HP WS, based on the existing CDC CYBER mainframe version of COBRA-IV-I. This MATRA code is a thermal-hydraulic analysis code based on the subchannel approach for calculating the enthalpy and flow distribution in fuel assemblies and reactor cores for both steady-state and transient conditions. MATRA-{alpha} has been provided with an improved structure and with various functions and models to give a more convenient user environment and to increase the code accuracy. Among them, the pressure drop model has been improved to be applied to non-square-lattice rod arrays, and the lateral transport models between adjacent subchannels have been improved to increase the accuracy in predicting two-phase flow phenomena. Also included in this report are the detailed instructions for input data preparation and for auxiliary pre-processors to serve as a guide to those who want to use MATRA-{alpha}. In addition, we compared the predictions of MATRA-{alpha} with the experimental data on the flow and enthalpy distribution in three sample rod-bundle cases to evaluate the performance of MATRA-{alpha}. All the results revealed that the predictions of MATRA-{alpha} were better than those of COBRA-IV-I. (author). 16 refs., 1 tab., 13 figs.

  20. Communicating pictures a course in image and video coding

    CERN Document Server

    Bull, David R

    2014-01-01

    Communicating Pictures starts with a unique historical perspective of the role of images in communications and then builds on this to explain the applications and requirements of a modern video coding system. It draws on the author's extensive academic and professional experience of signal processing and video coding to deliver a text that is algorithmically rigorous, yet accessible, relevant to modern standards, and practical. It offers a thorough grounding in visual perception, and demonstrates how modern image and video compression methods can be designed in order to meet the rate-quality performance levels demanded by today's applications, networks and users. With this book you will learn: Practical issues when implementing a codec, such as picture boundary extension and complexity reduction, with particular emphasis on efficient algorithms for transforms, motion estimators and error resilience Conflicts between conventional video compression, based on variable length coding and spatiotemporal prediction,...

  1. A code for optimising triplet layout

    CERN Document Server

    AUTHOR|(CDS)2141109; Seryi, Andrei; Abelleira, Jose; Cruz Alaniz, Emilia

    2017-01-01

    One of the main challenges when designing final focus systems of particle accelerators is maximising the beam stay clear in the strong quadrupole magnets of the inner triplet. Moreover, it is desirable to keep the quadrupoles in the inner triplet as short as possible, for space and cost reasons but also to reduce chromaticity and simplify correction schemes. An algorithm that explores the triplet parameter space to optimise both these aspects was written. It uses thin lenses as a first approximation for a broad parameter scan and MADX for more precise calculations. The thin lens algorithm is significantly faster than a full scan using MADX and relatively precise at indicating the approximate area where the optimum solution lies.

  2. Requirements for a multifunctional code architecture

    Energy Technology Data Exchange (ETDEWEB)

    Tiihonen, O. [VTT Energy (Finland); Juslin, K. [VTT Automation (Finland)

    1997-07-01

    The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experience gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered to him/her ten years ago. At the same time, the features of everyday office software tend to set standards for the way the input data and calculational results are managed.

  3. Grounded Theorising Applied to IS Research - Developing a Coding Strategy

    Directory of Open Access Journals (Sweden)

    Bruce Rowlands

    2005-05-01

    Full Text Available This paper provides an example of developing a coding strategy to build theory of the roles of methods in IS development. The research seeks to identify and understand how system development methods are used in an IS department within a large Australian bank. The paper details a theoretical framework, particulars of data collection, and documents an early phase of analysis – data reduction and the generation of an initial coding scheme. Guided by a framework to study the use of methods, the analysis demonstrates the framework’s plausibility in order to develop theoretical relationships with which to develop a grounded theory.

  4. Start App: a coding experience between primary and secondary school

    Directory of Open Access Journals (Sweden)

    Filippo Bruni

    2016-04-01

    Full Text Available The paper presents a coding experience in a primary school (the “Colozza” school in Campobasso). Within the theoretical framework offered by computational thinking, and using App Inventor, a calculator for smartphones in the Android environment was created. High school students (from a technical secondary school) guided the pupils of the primary school, creating an interesting form of cooperation between primary and secondary schools. Start App: a coding experience between primary and secondary school. The paper presents a coding experience in the primary school of the Istituto Comprensivo statale “Colozza” of Campobasso. Within the theoretical framework offered by computational thinking, and using App Inventor, a calculator for smartphones in the Android environment was created. The primary school pupils were guided by the students of the Istituto Tecnico Industriale “Marconi” of Campobasso, creating an interesting form of collaboration between schools of different levels.

  5. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  6. SCAMPI: A code package for cross-section processing

    Energy Technology Data Exchange (ETDEWEB)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-04-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.

  7. Coding Education in a Flipped Classroom

    Directory of Open Access Journals (Sweden)

    Vasfi Tugun

    2017-08-01

    Full Text Available The main purpose of this research is to determine the influence of the flipped classroom model on digital game development and on student views of the model. 9th grade students attending Bilişim Teknolojiler II at secondary level participated in the study. The research is experimental, designed according to the pretest-posttest model with experimental and control groups. In the experimental group, the lectures were carried out according to the flipped classroom model, while the control group was taught with traditional methods in the laboratory environment. As a result of the research, the experimental group students, who were educated with the flipped classroom model, achieved greater success in digital game development and expressed more favorable opinions. The results obtained in the last part of the study and suggestions based on them are discussed.

  8. Construction and decoding of a class of algebraic geometry codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Jensen, Helge Elbrønd

    1989-01-01

    A class of codes derived from algebraic plane curves is constructed. The concepts and results from algebraic geometry that were used are explained in detail; no further knowledge of algebraic geometry is needed. Parameters, generator and parity-check matrices are given. The main result is a decoding …

  9. A Unique Perspective on Data Coding and Decoding

    Directory of Open Access Journals (Sweden)

    Wen-Yan Wang

    2010-12-01

    Full Text Available The concept of a loss-less data compression coding method is proposed, and a detailed description of each of its steps follows. Using the Calgary Corpus and Wikipedia data as the experimental samples and comparing with existing algorithms such as PAQ and PPMstr, the new coding method can not only compress the source data, but also further re-compress the data produced by other compression algorithms. The final files are smaller and, compared with the original compression ratio, at least 1% redundancy can be eliminated. The new method is simple and easy to implement. Its theoretical foundation is currently under study. The corresponding Matlab source code is provided in the Appendix.

  10. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    Science.gov (United States)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  11. CALTRANS: A parallel, deterministic, 3D neutronics code

    Energy Technology Data Exchange (ETDEWEB)

    Carson, L.; Ferguson, J.; Rogers, J.

    1994-04-01

    Our efforts to parallelize the deterministic solution of the neutron transport equation have culminated in a new neutronics code, CALTRANS, which has full 3D capability. In this article, we describe the layout and algorithms of CALTRANS and present performance measurements of the code on a variety of platforms. Explicit implementations of the parallel algorithms of CALTRANS, using both the function calls of the Parallel Virtual Machine software package (PVM 3.2) and the Meiko CS-2 tagged message passing library (based on the Intel NX/2 interface), are provided in appendices.

  12. Code Blue Emergencies: A Team Task Analysis and Educational Initiative

    Directory of Open Access Journals (Sweden)

    James W. Price

    2012-04-01

    Full Text Available Introduction: The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR and post-operative recovery unit (PAR at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. Methods: In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH were invited to complete an anonymous, 10 minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operation room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists and their feedback was integrated into the final version of the survey. Results: Both nursing staff (n = 49 and anesthesiologists (n = 19 supported code blue training and believed that team training would improve patient outcome. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Conclusion: Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.

  13. A Practical View on Tunable Sparse Network Coding

    DEFF Research Database (Denmark)

    Sørensen, Chres Wiant; Shahbaz Badr, Arash; Cabrera Guerrero, Juan Alberto

    2015-01-01

    Tunable sparse network coding (TSNC) constitutes a promising concept for trading off computational complexity and delay performance. This paper advocates for the use of judicious feedback as a key not only to make TSNC practical, but also to deliver a highly consistent and controlled delay … can result in a radical improvement of the complexity-delay trade-off.

  14. Evolution of the genetic code: partial optimization of a random code for robustness to translation error in a rugged fitness landscape.

    Science.gov (United States)

    Novozhilov, Artem S; Wolf, Yuri I; Koonin, Eugene V

    2007-10-23

    The standard genetic code table has a distinctly non-random structure, with similar amino acids often encoded by codon series that differ by a single nucleotide substitution, typically, in the third or the first position of the codon. It has been repeatedly argued that this structure of the code results from selective optimization for robustness to translation errors such that translational misreading has the minimal adverse effect. Indeed, it has been shown in several studies that the standard code is more robust than a substantial majority of random codes. However, it remains unclear how much evolution the standard code underwent, what is the level of optimization, and what is the likely starting point. We explored possible evolutionary trajectories of the genetic code within a limited domain of the vast space of possible codes. Only those codes were analyzed for robustness to translation error that possess the same block structure and the same degree of degeneracy as the standard code. This choice of a small part of the vast space of possible codes is based on the notion that the block structure of the standard code is a consequence of the structure of the complex between the cognate tRNA and the codon in mRNA where the third base of the codon plays a minimum role as a specificity determinant. Within this part of the fitness landscape, a simple evolutionary algorithm, with elementary evolutionary steps comprising swaps of four-codon or two-codon series, was employed to investigate the optimization of codes for the maximum attainable robustness. The properties of the standard code were compared to the properties of four sets of codes, namely, purely random codes, random codes that are more robust than the standard code, and two sets of codes that resulted from optimization of the first two sets. The comparison of these sets of codes with the standard code and its locally optimized version showed that, on average, optimization of random codes yielded evolutionary
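
    The evolutionary procedure described above can be pictured with the toy sketch below: a random block code is improved by swapping the amino acid assignments of codon blocks whenever a swap reduces a translation-error cost. The amino-acid property values are random stand-ins (the study uses measures such as the polar requirement), the block structure is simplified to 16 four-codon blocks, and the acceptance rule is plain hill climbing, so this is only a cartoon of the approach, not the authors' algorithm.

      # Toy sketch of swap-based optimization of a block code for robustness to misreading.
      # Amino-acid "property" values and the block structure are simplified stand-ins.
      import itertools, random

      BASES = "UCAG"
      CODONS = ["".join(c) for c in itertools.product(BASES, repeat=3)]

      def neighbors(codon):
          """Codons reachable by a single-nucleotide substitution (a translational misreading)."""
          return [codon[:pos] + b + codon[pos + 1:]
                  for pos in range(3) for b in BASES if b != codon[pos]]

      def cost(code, prop):
          """Mean squared property change over all single-substitution misreadings."""
          diffs = [(prop[code[c]] - prop[code[n]]) ** 2 for c in CODONS for n in neighbors(c)]
          return sum(diffs) / len(diffs)

      random.seed(1)
      # 16 "blocks" of 4 codons sharing the first two positions; assign 16 toy amino acids
      blocks = [[p1 + p2 + p3 for p3 in BASES] for p1 in BASES for p2 in BASES]
      amino_acids = list(range(16))
      prop = {aa: random.random() for aa in amino_acids}        # stand-in property scale
      assignment = amino_acids[:]                               # block index -> amino acid

      def to_code(assign):
          return {c: assign[i] for i, block in enumerate(blocks) for c in block}

      current = cost(to_code(assignment), prop)
      for _ in range(2000):                                     # elementary steps: block swaps
          i, j = random.sample(range(16), 2)
          assignment[i], assignment[j] = assignment[j], assignment[i]
          trial = cost(to_code(assignment), prop)
          if trial <= current:
              current = trial                                   # accept improving (or equal) swaps
          else:
              assignment[i], assignment[j] = assignment[j], assignment[i]   # revert the swap
      print("final cost:", current)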

  15. Dental Faculty Accuracy When Using Diagnostic Codes: A Pilot Study.

    Science.gov (United States)

    Sutton, Jeanne C; Fay, Rose-Marie; Huynh, Carolyn P; Johnson, Cleverick D; Zhu, Liang; Quock, Ryan L

    2017-05-01

    The aim of this study was to examine the accuracy of dental faculty members' utilization of diagnostic codes and resulting treatment planning based on radiographic interproximal tooth radiolucencies. In 2015, 50 full-time and part-time general dentistry faculty members at one U.S. dental school were shown a sequence of 15 bitewing radiographs; one interproximal radiolucency was highlighted on each bitewing. For each radiographic lesion, participants were asked to choose the most appropriate diagnostic code (from a concise list of five codes, corresponding to lesion progression to outer/inner halves of enamel and outer/middle/pulpal thirds of dentin), acute treatment (attempt to arrest/remineralize non-invasively, operative intervention, or no treatment), and level of confidence in choices. Diagnostic and treatment choices of participants were compared to "gold standard" correct responses, as determined by expert radiology and operative faculty members, respectively. The majority of the participants selected the correct diagnostic code for lesions in the outer one-third of dentin (p<0.0001) and the pulpal one-third of dentin (p<0.0001). For lesions in the outer and inner halves of enamel and the middle one-third of dentin, the correct rates were moderate. However, the majority of the participants chose correct treatments on all types of lesions (correct rate 63.6-100%). Faculty members' confidence in their responses was generally high for all lesions, all above 90%. Diagnostic codes were appropriately assigned by participants for the very deepest lesions, but they were not assigned accurately for more incipient lesions (limited to enamel). Paradoxically, treatment choices were generally correct, regardless of diagnostic choices. Further calibration is needed to improve faculty use and teaching of diagnostic codes.

  16. On Predictive Coding for Erasure Channels Using a Kalman Framework

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Murthi, Manohar; Andersen, Søren Vang

    2009-01-01

    We present a new design method for robust low-delay coding of auto-regressive (AR) sources for transmission across erasure channels. The method is based on Linear Predictive Coding (LPC) with Kalman estimation at the decoder. The method designs the encoder and decoder off-line through an iterative algorithm based on minimization of the trace of the decoder state error covariance. The design method applies to stationary AR sources of any order. Simulation results show considerable performance gains, when the transmitted quantized prediction errors are subject to loss, in terms of Signal-to-Noise Ratio...
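
    To make the decoder side concrete, the sketch below runs a scalar Kalman estimator for an AR(1) source whose noisy (quantization-like) observations are subject to random erasures: lost packets simply skip the measurement update. It illustrates "Kalman estimation at the decoder" only; the paper's off-line iterative encoder/decoder co-design is not reproduced, and all model parameters are invented.

      # Decoder-side Kalman estimation for an AR(1) source over an erasure channel (sketch).
      import numpy as np

      rng = np.random.default_rng(0)
      a, q_var, r_var, p_loss, n = 0.95, 1.0, 0.1, 0.2, 500

      # AR(1) source and noisy observations of it (crude stand-in for quantized data)
      x = np.zeros(n)
      for k in range(1, n):
          x[k] = a * x[k - 1] + rng.normal(0.0, np.sqrt(q_var))
      y = x + rng.normal(0.0, np.sqrt(r_var), size=n)
      received = rng.random(n) > p_loss                 # erasure pattern of the channel

      x_hat, P = np.zeros(n), 1.0
      for k in range(1, n):
          # time update (prediction from the AR model)
          x_pred = a * x_hat[k - 1]
          P = a * a * P + q_var
          if received[k]:
              # measurement update only when the packet actually arrived
              K = P / (P + r_var)
              x_hat[k] = x_pred + K * (y[k] - x_pred)
              P = (1.0 - K) * P
          else:
              x_hat[k] = x_pred

      snr_db = 10 * np.log10(np.var(x) / np.var(x - x_hat))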

  17. On Predictive Coding for Erasure Channels Using a Kalman Framework

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Murthi, Manohar; Andersen, Søren Vang

    2009-01-01

    We present a new design method for robust low-delay coding of autoregressive sources for transmission across erasure channels. It is a fundamental rethinking of existing concepts. It considers the encoder a mechanism that produces signal measurements from which the decoder estimates the original ...

  18. Code Generation for a Simple First-Order Prover

    DEFF Research Database (Denmark)

    Villadsen, Jørgen; Schlichtkrull, Anders; Halkjær From, Andreas

    2016-01-01

    We present Standard ML code generation in Isabelle/HOL of a sound and complete prover for first-order logic, taking formalizations by Tom Ridge and others as the starting point. We also define a set of so-called unfolding rules and show how to use these as a simple prover, with the aim of using...

  19. Block truncation coding with color clumps: A novel feature extraction ...

    Indian Academy of Sciences (India)

    Block truncation coding with color clumps:A novel feature extraction technique for content based image classification ... Department of Information Technology, Xavier Institute of Social Service, Ranchi, Jharkhand 834001, India; A.K. Choudhury School of Information Technology, University of Calcutta, Kolkata 700 009, India ...

  20. A skin colour code for the Nigerian (Negroid) population

    African Journals Online (AJOL)


    SUMMARY. Some researchers have codified various people of different racial and pigment backgrounds into skin types. The West African native population generally falls into type VI – least likely to burn. There is a need for a skin colour code in a multiethnic country like Nigeria, especially for the purpose of health matters.

  1. Evaluating QR Code Case Studies Using a Mobile Learning Framework

    Science.gov (United States)

    Rikala, Jenni

    2014-01-01

    The aim of this study was to evaluate the feasibility of Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The feasibility was analyzed through a mobile learning framework, which includes the core characteristics of mobile learning. The study is part of a larger research where the aim is to develop a…

  2. GERMINAL — A computer code for predicting fuel pin behaviour

    Science.gov (United States)

    Melis, J. C.; Roche, L.; Piron, J. P.; Truffert, J.

    1992-06-01

    In the framework of the R&D on FBR fuels, CEA/DEC is developing the computer code GERMINAL to study the fuel pin thermal-mechanical behaviour under steady-state and incidental conditions. The development of GERMINAL is foreseen in two steps: (1) the GERMINAL 1 code, designed as a "workhorse" for immediate applications. Version 1 of GERMINAL 1 is presently delivered fully documented, with a physical qualification guaranteed up to 8 at%. (2) Version 2 of GERMINAL 1 includes, in addition to what is presently treated in GERMINAL 1, the treatment of high burnup effects on fission gas release and on the fuel-clad joint. This version, GERMINAL 1.2, is presently under testing and will be completed by the end of 1991. The GERMINAL 2 code, designed as a reference code for future applications, will cover all the aspects of GERMINAL 1 (including high burnup effects) with a more general mechanical treatment and a completely revised and more advanced software structure.

  3. Comparisons of time explicit hybrid kinetic-fluid code Architect for Plasma Wakefield Acceleration with a full PIC code

    Energy Technology Data Exchange (ETDEWEB)

    Massimo, F., E-mail: francesco.massimo@ensta-paristech.fr [Laboratoire d' Optique Appliquée, ENSTA ParisTech, CNRS, École Polytechnique, Université Paris-Saclay, 828 bd des Maréchaux, 91762 Palaiseau (France); Dipartimento SBAI, Università di Roma “La Sapienza“, Via A. Scarpa 14, 00161 Roma (Italy); Atzeni, S. [Dipartimento SBAI, Università di Roma “La Sapienza“, Via A. Scarpa 14, 00161 Roma (Italy); Marocchino, A. [Dipartimento SBAI, Università di Roma “La Sapienza“, Via A. Scarpa 14, 00161 Roma (Italy); INFN – LNF, via Enrico Fermi 40, 00044 Frascati (Italy)

    2016-12-15

    Architect, a time explicit hybrid code designed to perform quick simulations for electron driven plasma wakefield acceleration, is described. In order to obtain beam quality acceptable for applications, control of the beam-plasma-dynamics is necessary. Particle in Cell (PIC) codes represent the state-of-the-art technique to investigate the underlying physics and possible experimental scenarios; however, PIC codes demand heavy computational resources. The Architect code substantially reduces the need for computational resources by using a hybrid approach: relativistic electron bunches are treated kinetically as in a PIC code and the background plasma as a fluid. Cylindrical symmetry is assumed for the solution of the electromagnetic fields and fluid equations. In this paper both the underlying algorithms and a comparison with a fully three-dimensional particle in cell code are reported. The comparison highlights the good agreement between the two models up to the weakly non-linear regimes. In highly non-linear regimes the two models only disagree in a localized region, where the plasma electrons expelled by the bunch close up at the end of the first plasma oscillation.

  4. Comparisons of time explicit hybrid kinetic-fluid code Architect for Plasma Wakefield Acceleration with a full PIC code

    Science.gov (United States)

    Massimo, F.; Atzeni, S.; Marocchino, A.

    2016-12-01

    Architect, a time explicit hybrid code designed to perform quick simulations for electron driven plasma wakefield acceleration, is described. In order to obtain beam quality acceptable for applications, control of the beam-plasma-dynamics is necessary. Particle in Cell (PIC) codes represent the state-of-the-art technique to investigate the underlying physics and possible experimental scenarios; however, PIC codes demand heavy computational resources. The Architect code substantially reduces the need for computational resources by using a hybrid approach: relativistic electron bunches are treated kinetically as in a PIC code and the background plasma as a fluid. Cylindrical symmetry is assumed for the solution of the electromagnetic fields and fluid equations. In this paper both the underlying algorithms and a comparison with a fully three-dimensional particle in cell code are reported. The comparison highlights the good agreement between the two models up to the weakly non-linear regimes. In highly non-linear regimes the two models only disagree in a localized region, where the plasma electrons expelled by the bunch close up at the end of the first plasma oscillation.

  5. Tagalog-English Code Switching as a Mode of Discourse

    Science.gov (United States)

    Bautista, Maria Lourdes S.

    2004-01-01

    The alternation of Tagalog and English in informal discourse is a feature of the linguistic repertoire of educated, middle- and upper-class Filipinos. This paper describes the linguistic structure and sociolinguistic functions of Tagalog-English code switching (Taglish) as provided by various researchers through the years. It shows that the…

  6. ISODEP, A Fuel Depletion Analysis Code for Predicting Isotopic ...

    African Journals Online (AJOL)

    The code ISODEP is developed to compute the rate of production of nuclides in different homogeneous zones of a research reactor. An exponential method proposed by Hansen is the basis for the numerical solution of non-homogeneous simultaneous equations which describe the rate of production and decay of nuclides.
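
    The kind of system ISODEP solves can be illustrated with a small production/decay chain written in matrix form, dN/dt = A N, whose solution over a time step is N(t) = exp(At) N(0). The sketch below uses SciPy's matrix exponential on a toy three-nuclide chain; it is not Hansen's exponential method nor the ISODEP code, and the decay constants are arbitrary.

      # Generic matrix-exponential solution of a small production/decay chain (illustration only).
      import numpy as np
      from scipy.linalg import expm

      # toy 3-nuclide chain: 1 -> 2 -> 3 with decay constants lam (s^-1); nuclide 3 is stable
      lam = np.array([1e-5, 5e-6, 0.0])
      A = np.array([
          [-lam[0],     0.0,      0.0],
          [ lam[0], -lam[1],      0.0],
          [    0.0,  lam[1],  -lam[2]],
      ])
      N0 = np.array([1.0e20, 0.0, 0.0])      # initial atom densities
      t = 30 * 24 * 3600.0                   # 30 days in seconds
      N_t = expm(A * t) @ N0                 # nuclide inventories after the time step
      print(N_t)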

  7. Organizing conceptual knowledge in humans with a gridlike code

    NARCIS (Netherlands)

    Constantinescu, A.O.; O'Reilly, J.X.; Behrens, T.E.J.

    2016-01-01

    Grid cells are thought to provide the neuronal code that underlies spatial knowledge in the brain. Grid cells have mostly been studied in the context of path integration. However, recent theoretical studies have suggested that they may have a broader role in the organization of general knowledge.

  8. Designing an Effective and Efficient Insolvency Code for a ...

    African Journals Online (AJOL)

    During government ownership, public corporations dominated the economy and some used to be subsidized and kept going even when they were technically bankrupt. Given that situation, the insolvency code was not playing its role and its shortcomings could not be detected. Bankruptcy laws have a major impact on ...

  9. Scaffolded Code-switching: A resource for achieving academic ...

    African Journals Online (AJOL)

    The aim of this paper is to establish whether code-switching is still common practice in rural Limpopo as it was 16 years ago (McCabe, 1996) and if so, to suggest ways to use it as a resource to aid comprehension of English and to explicitly teach cognitive skills and academic literacy. Many rural South African schools have ...

  10. A post-processor for the PEST code

    Energy Technology Data Exchange (ETDEWEB)

    Priesche, S.; Manickam, J.; Johnson, J.L.

    1992-01-01

    A new post-processor has been developed for use with output from the PEST tokamak stability code. It allows us to use quantities calculated by PEST and take better advantage of the physical picture of the plasma instability which they can provide. This will improve comparison with experimentally measured quantities as well as facilitate understanding of theoretical studies.

  11. FLUKA A multi-particle transport code (program version 2005)

    CERN Document Server

    Ferrari, A; Fassò, A; Ranft, Johannes

    2005-01-01

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner’s guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  12. A skin colour code for the Nigerian (Negroid) population | George ...

    African Journals Online (AJOL)

    ... for the Negroid skin in Nigeria including the Nigerian albino. The chart can be laminated using thin transparent plastic film to prevent transmission of infection from skin to skin in different people. A skin colour code can be useful for clinical evaluation of disease conditions like vitiligo as well as for epidemiological studies.

  13. A New Algorithm of Shape Boundaries Based on Chain Coding

    Directory of Open Access Journals (Sweden)

    Zhao Xin

    2017-01-01

    Full Text Available A new method to obtain connected components in binary images is presented. The method uses a deterministic finite automaton (DFA) to obtain the chain code and to label the component boundary. It is theoretically shown that the algorithm improves the image encoding efficiency, bringing the time consumption close to the attainable minimum.
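
    For readers unfamiliar with chain coding, the sketch below computes the classical 8-directional Freeman chain code of an already ordered boundary. The paper's actual contribution, a DFA that extracts and labels the boundary directly from the binary image, is not reproduced here.

      # 8-directional Freeman chain coding of an ordered boundary (illustration only).
      # direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE  (x right, y up)
      STEP_TO_CODE = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                      (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

      def freeman_chain_code(boundary):
          """boundary: ordered list of (x, y) pixels, each 8-connected to its successor."""
          return [STEP_TO_CODE[(x1 - x0, y1 - y0)]
                  for (x0, y0), (x1, y1) in zip(boundary, boundary[1:])]

      # a small closed square traversed counter-clockwise
      square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
      print(freeman_chain_code(square))   # -> [0, 0, 2, 2, 4, 4, 6, 6]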

  14. CERN access card: Introduction of a bar code

    CERN Multimedia

    Relations with the Host States Service

    2004-01-01

    Before the latest version of the implementation measures relating to Operational Circular No. 2 comes into force, we would like to inform you that, in future, CERN access cards may bear a bar code to transcribe the holder's identification number. Relations with the Host States Service http://www.cern.ch/relations/ Tel. 72848

  15. CERN access cards - Introduction of a bar code (Reminder)

    CERN Multimedia

    Relations with the Host States Service

    2004-01-01

    In accordance with the latest revised version of the implementation measures relating to Operational Circular No. 2, CERN access cards may bear a bar code transcribing the holder's identification number (the revised version of this subsidiary document to the aforementioned Circular will be published shortly). Relations with the Host States Service http://www.cern.ch/relations/ relations.secretariat@cern.ch Tel. 72848

  16. Code-Switching in a College Mathematics Classroom

    Science.gov (United States)

    Chitera, Nancy

    2009-01-01

    This paper presents the findings that explored from the discourse practices of the mathematics teacher educators in initial teacher training colleges in Malawi. It examines how mathematics teacher educators construct a multilingual classroom and how they view code-switching. The discussion is based on pre-observation interviews with four…

  17. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel) coding.

  18. A surface definition code for turbine blade surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Yang, S L [Michigan Technological Univ., Houghton, MI (United States); Oryang, D; Ho, M J [Tuskegee Univ., AL (United States)

    1992-05-01

    A numerical interpolation scheme has been developed for generating the three-dimensional geometry of wind turbine blades. The numerical scheme consists of (1) creating the frame of the blade through the input of two or more airfoils at specific spanwise stations and then scaling and twisting them according to the prescribed distributions of chord, thickness, and twist along the span of the blade; (2) transforming the physical coordinates of the blade frame into a computational domain that complies with the interpolation requirements; and finally (3) applying the bi-tension spline interpolation method, in the computational domain, to determine the coordinates of any point on the blade surface. Detailed descriptions of the overall approach to and philosophy of the code development are given along with the operation of the code. To show the usefulness of the bi-tension spline interpolation code developed, two examples are given, namely CARTER and MICON blade surface generation. Numerical results are presented in both graphic and data forms. The solutions obtained in this work show that the computer code developed can be a powerful tool for generating the surface coordinates of any three-dimensional blade.
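
    The steps described above can be mimicked in a few lines: scale and twist a base airfoil according to spanwise chord, thickness, and twist distributions, then interpolate the resulting sections along the span. The sketch below uses a toy elliptical "airfoil" and SciPy cubic splines in place of the bi-tension spline of the actual code; all distributions are invented.

      # Toy blade-surface generation: scale/thicken/twist a base section, then interpolate spanwise.
      import numpy as np
      from scipy.interpolate import CubicSpline

      theta = np.linspace(0.0, 2.0 * np.pi, 101)
      base_x = 0.5 * (1.0 + np.cos(theta))            # toy closed "airfoil": an ellipse, unit chord
      base_y = 0.06 * np.sin(theta)

      span = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # spanwise stations (m)
      chord = np.array([1.2, 1.0, 0.8, 0.6, 0.4])     # chord distribution (m)
      thick = np.array([1.5, 1.2, 1.0, 0.9, 0.8])     # relative thickness scaling
      twist = np.radians([20.0, 12.0, 6.0, 2.0, 0.0]) # twist distribution

      sections = []
      for c, t, tw in zip(chord, thick, twist):
          x, y = c * base_x, c * t * base_y           # scale chord and thickness
          xr = x * np.cos(tw) - y * np.sin(tw)        # apply twist by rotating about the origin
          yr = x * np.sin(tw) + y * np.cos(tw)
          sections.append(np.stack([xr, yr], axis=1))
      sections = np.array(sections)                   # shape: (n_stations, n_points, 2)

      # spanwise interpolation of every surface point; evaluate anywhere along the span
      surface = CubicSpline(span, sections, axis=0)
      print(surface(3.3).shape)                       # coordinates of the section at r = 3.3 m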

  19. A Secure Network Coding Based on Broadcast Encryption in SDN

    Directory of Open Access Journals (Sweden)

    Yue Chen

    2016-01-01

    Full Text Available By allowing intermediate nodes to encode the received packets before sending them out, network coding improves the capacity and robustness of multicast applications. But it is vulnerable to pollution attacks. Some signature schemes were proposed to thwart such attacks, but most of them need to be homomorphic, so that the keys cannot be generated and managed easily. In this paper, we propose a novel fast and secure switch network coding multicast (SSNC) on software defined networks (SDN). In our scheme, the complicated secure multicast management is separated from the fast data transmission based on the SDN. Multiple multicasts are aggregated into one multicast group according to the requirements of the services and the network status. Then, the controller routes the aggregated multicast group with network coding; only trusted switches are allowed to join the network coding by using broadcast encryption. The proposed scheme can use traditional cryptography without homomorphism, which greatly reduces the complexity of the computation and improves the efficiency of transmission.

  20. Convolutional-Code-Specific CRC Code Design

    OpenAIRE

    Lou, Chung-Yu; Daneshrad, Babak; Wesel, Richard D.

    2015-01-01

    Cyclic redundancy check (CRC) codes check if a codeword is correctly received. This paper presents an algorithm to design CRC codes that are optimized for the code-specific error behavior of a specified feedforward convolutional code. The algorithm utilizes two distinct approaches to computing undetected error probability of a CRC code used with a specific convolutional code. The first approach enumerates the error patterns of the convolutional code and tests if each of them is detectable. Th...
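
    The detectability test underlying such a design search can be stated simply: for a linear CRC, an error pattern goes undetected exactly when the generator polynomial divides it, so checking a candidate error pattern amounts to computing its CRC remainder. The sketch below does this with an example CRC-8 generator; the polynomial is illustrative and is not one of the paper's designed codes.

      # Detectability of an error pattern under a CRC: remainder zero <=> undetected.
      def crc_remainder(bits, poly_bits):
          """Remainder of bits(x) * x^deg(g) modulo g(x); both arguments are bit lists, MSB first."""
          reg = list(bits) + [0] * (len(poly_bits) - 1)      # append deg(g) zeros
          for i in range(len(bits)):
              if reg[i]:
                  for j, p in enumerate(poly_bits):
                      reg[i + j] ^= p                        # subtract (XOR) the generator
          return reg[-(len(poly_bits) - 1):]

      def is_detected(error_pattern, poly_bits):
          return any(crc_remainder(error_pattern, poly_bits))

      CRC8 = [1, 0, 0, 0, 0, 0, 1, 1, 1]                     # x^8 + x^2 + x + 1 (example generator)
      single_bit_error = [0] * 20 + [1] + [0] * 20
      print(is_detected(single_bit_error, CRC8))             # True: single-bit errors are always caught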

  1. Code switching: a variation in language use | Ifechelobi ...

    African Journals Online (AJOL)

    In such cases, speakers are bound to code mix or code switch in their language use. Traditionally, practices of code switching and code-mixing are viewed negatively. Some see them as “evidences of internal mental confusion'' and some as manifestations of language competence deficiencies. This paper therefore sets out ...

  2. List decoding of a class of affine variety codes

    CERN Document Server

    Geil, Olav

    2011-01-01

    Consider a polynomial $F$ in $m$ variables and a finite point ensemble $S = S_1 \times \dots \times S_m$. When given the leading monomial of $F$ with respect to a lexicographic ordering, we derive improved information on the possible number of zeros of $F$ of multiplicity at least $r$ from $S$. We then use this information to design a list decoding algorithm for a large class of affine variety codes.

  3. A predictive transport modeling code for ICRF-heated tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, C.K.; Hwang, D.Q. [Princeton Univ., NJ (United States). Plasma Physics Lab.; Houlberg, W.; Attenberger, S.; Tolliver, J.; Hively, L. [Oak Ridge National Lab., TN (United States)

    1992-02-01

    In this report, a detailed description of the physics included in the WHIST/RAZE package as well as a few illustrative examples of the capabilities of the package will be presented. An in-depth analysis of ICRF heating experiments using WHIST/RAZE will be discussed in a forthcoming report. A general overview of the philosophy behind the structure of the WHIST/RAZE package, a summary of the features of the WHIST code, and a description of the interface to the RAZE subroutines are presented in section 2 of this report. Details of the physics contained in the RAZE code are examined in section 3. Sample results from the package follow in section 4, with concluding remarks and a discussion of possible improvements to the package discussed in section 5.

  4. A Plastic Temporal Brain Code for Conscious State Generation

    Directory of Open Access Journals (Sweden)

    Birgitta Dresp-Langley

    2009-01-01

    Full Text Available Consciousness is known to be limited in processing capacity and often described in terms of a unique processing stream across a single dimension: time. In this paper, we discuss a purely temporal pattern code, functionally decoupled from spatial signals, for conscious state generation in the brain. Arguments in favour of such a code include Dehaene et al.'s long-distance reverberation postulate, Ramachandran's remapping hypothesis, evidence for a temporal coherence index and coincidence detectors, and Grossberg's Adaptive Resonance Theory. A time-bin resonance model is developed, where temporal signatures of conscious states are generated on the basis of signal reverberation across large distances in highly plastic neural circuits. The temporal signatures are delivered by neural activity patterns which, beyond a certain statistical threshold, activate, maintain, and terminate a conscious brain state like a bar code would activate, maintain, or inactivate the electronic locks of a safe. Such temporal resonance would reflect a higher level of neural processing, independent from sensorial or perceptual brain mechanisms.

  5. CHOLLA: A New Massively Parallel Hydrodynamics Code for Astrophysical Simulation

    Science.gov (United States)

    Schneider, Evan E.; Robertson, Brant E.

    2015-04-01

    We present Computational Hydrodynamics On ParaLLel Architectures (Cholla), a new three-dimensional hydrodynamics code that harnesses the power of graphics processing units (GPUs) to accelerate astrophysical simulations. Cholla models the Euler equations on a static mesh using state-of-the-art techniques, including the unsplit Corner Transport Upwind algorithm, a variety of exact and approximate Riemann solvers, and multiple spatial reconstruction techniques including the piecewise parabolic method (PPM). Using GPUs, Cholla evolves the fluid properties of thousands of cells simultaneously and can update over 10 million cells per GPU-second while using an exact Riemann solver and PPM reconstruction. Owing to the massively parallel architecture of GPUs and the design of the Cholla code, astrophysical simulations with physically interesting grid resolutions (≳256³) can easily be computed on a single device. We use the Message Passing Interface library to extend calculations onto multiple devices and demonstrate nearly ideal scaling beyond 64 GPUs. A suite of test problems highlights the physical accuracy of our modeling and provides a useful comparison to other codes. We then use Cholla to simulate the interaction of a shock wave with a gas cloud in the interstellar medium, showing that the evolution of the cloud is highly dependent on its density structure. We reconcile the computed mixing time of a turbulent cloud with a realistic density distribution destroyed by a strong shock with the existing analytic theory for spherical cloud destruction by describing the system in terms of its median gas density.

  6. Sharing code.

    Science.gov (United States)

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  7. Petascale electronic structure code with a new parallel eigensolver

    Science.gov (United States)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Li, Yan; Kelley, Ct; Bernholc, Jerzy

    2015-03-01

    We describe recent developments within the Real Space Multigrid (RMG) electronic structure code. RMG uses real-space grids, a multigrid pre-conditioner, and subspace diagonalization to solve the Kohn-Sham equations. It is designed for use on massively parallel computers and has shown excellent scalability and performance, reaching 6.5 PFLOPS on 18k Cray compute nodes with 288k CPU cores and 18k GPUs. For large problems, the diagonalization becomes computationally dominant and a novel, highly parallel eigensolver was developed that makes efficient use of a large number of nodes. Test results for a range of problem sizes are presented, which execute up to 3.5 times faster than standard eigensolvers such as ScaLAPACK. RMG is now an open source code, running on Linux, Windows and Macintosh systems. It may be downloaded at .

  8. ALOHA Random Access that Operates as a Rateless Code

    DEFF Research Database (Denmark)

    Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    Various applications of wireless Machine-to-Machine (M2M) communications have rekindled the research interest in random access protocols, suitable to support a large number of connected devices. Slotted ALOHA and its derivatives represent a simple solution for distributed random access in wireless...... networks. Recently, a framed version of slotted ALOHA gained renewed interest due to the incorporation of successive interference cancellation (SIC) in the scheme, which resulted in substantially higher throughputs. Based on similar principles and inspired by the rateless coding paradigm, a frameless...... approach for distributed random access in the slotted ALOHA framework is described in this paper. The proposed approach shares an operational analogy with rateless coding, expressed both through the user access strategy and the adaptive length of the contention period, with the objective to end...

  9. Improving a Power Line Communications Standard with LDPC Codes

    Directory of Open Access Journals (Sweden)

    Praveen Jain

    2007-01-01

    Full Text Available We investigate a power line communications (PLC scheme that could be used to enhance the HomePlug 1.0 standard, specifically its ROBO mode which provides modest throughput for the worst case PLC channel. The scheme is based on using a low-density parity-check (LDPC code, in lieu of the concatenated Reed-Solomon and convolutional codes in ROBO mode. The PLC channel is modeled with multipath fading and Middleton's class A noise. Clipping is introduced to mitigate the effect of impulsive noise. A simple and effective method is devised to estimate the variance of the clipped noise for LDPC decoding. Simulation results show that the proposed scheme outperforms the HomePlug 1.0 ROBO mode and has lower computational complexity. The proposed scheme also dispenses with the repetition of information bits in ROBO mode to gain time diversity, resulting in 4-fold increase in physical layer throughput.

  10. The ICPC coding system in pharmacy : developing a subset, ICPC-Ph

    NARCIS (Netherlands)

    van Mil, JWF; Brenninkmeijer, R; Tromp, TFJ

    The ICPC system is a coding system developed for general medical practice, to be able to code the GP-patient encounters and other actions. Some of the codes can be easily used by community pharmacists to code complaints and diseases in pharmaceutical care practice. We developed a subset of the ICPC

  11. The 2010 fib Model Code for Structural Concrete: A new approach to structural engineering

    NARCIS (Netherlands)

    Walraven, J.C.; Bigaj-Van Vliet, A.

    2011-01-01

    The fib Model Code is a recommendation for the design of reinforced and prestressed concrete which is intended to be a guiding document for future codes. Model Codes have been published before, in 1978 and 1990. The draft for fib Model Code 2010 was published in May 2010. The most important new

  12. Magnus: A New Resistive MHD Code with Heat Flow Terms

    Science.gov (United States)

    Navarro, Anamaría; Lora-Clavijo, F. D.; González, Guillermo A.

    2017-07-01

    We present a new magnetohydrodynamic (MHD) code for the simulation of wave propagation in the solar atmosphere under the effects of a non-dominant electrical resistivity and of heat transfer, on a uniform 3D grid. The code is based on the finite-volume method combined with the HLLE and HLLC approximate Riemann solvers, which use different slope limiters such as MINMOD, MC, and WENO5. In order to control the growth of the divergence of the magnetic field due to numerical errors, we apply the Flux Constrained Transport method, which is described in detail to show how the resistive terms are included in the algorithm. In our results, it is verified that this method preserves the divergence of the magnetic field within the machine round-off error (~1×10⁻¹²). For the validation of the accuracy and efficiency of the schemes implemented in the code, we present some numerical tests in 1D and 2D for ideal MHD. Later, we show one test for the resistivity in a magnetic reconnection process and one for the thermal conduction, where the temperature is advected by the magnetic field lines. Moreover, we display two numerical problems associated with MHD wave propagation. The first one corresponds to a 3D evolution of a vertical velocity pulse at the photosphere-transition-corona region, while the second one consists of a 2D simulation of a transverse velocity pulse in a coronal loop.
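
    As a small illustration of the reconstruction machinery mentioned above, the sketch below writes the MINMOD and MC (monotonized central) slope limiters as plain NumPy functions acting on left and right one-sided slopes. This is generic finite-volume material, not code from Magnus, and the WENO5 reconstruction is omitted.

      # MINMOD and MC slope limiters applied to one-sided slopes of cell averages (sketch).
      import numpy as np

      def minmod(a, b):
          """Argument of smallest magnitude when a and b share a sign, otherwise zero."""
          return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

      def mc_limiter(a, b):
          """Monotonized central limiter: min(2|a|, 2|b|, |a+b|/2) with the common sign, else zero."""
          return np.where(a * b > 0.0,
                          np.sign(a) * np.minimum(np.minimum(2 * np.abs(a), 2 * np.abs(b)),
                                                  0.5 * np.abs(a + b)),
                          0.0)

      u = np.array([0.0, 0.1, 0.5, 1.0, 1.0, 1.0])       # cell averages
      dl = u[1:-1] - u[:-2]                              # left one-sided slopes
      dr = u[2:] - u[1:-1]                               # right one-sided slopes
      slopes_minmod = minmod(dl, dr)
      slopes_mc = mc_limiter(dl, dr)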

  13. Prodeto, a computer code for probabilistic fatigue design

    Energy Technology Data Exchange (ETDEWEB)

    Braam, H. [ECN-Solar and Wind Energy, Petten (Netherlands); Christensen, C.J.; Thoegersen, M.L. [Risoe National Lab., Roskilde (Denmark); Ronold, K.O. [Det Norske Veritas, Hoevik (Norway)

    1999-03-01

    A computer code for structural reliability analyses of wind turbine rotor blades subjected to fatigue loading is presented. With pre-processors that can transform measured and theoretically predicted load series into load-range distributions by rain-flow counting, and with a family of generic distribution models for parametric representation of these distributions, this computer program can carry through probabilistic fatigue analyses of rotor blades. (au)
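
    A hedged sketch of the kind of downstream calculation such a tool supports: accumulating fatigue damage from a counted load-range histogram with an assumed S-N curve and the Palmgren-Miner rule. The histogram, S-N constants and units are illustrative assumptions, not Prodeto's generic distribution models.

    ```python
    import numpy as np

    # Illustrative load-range histogram (e.g., as produced by rain-flow counting).
    ranges = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # stress ranges [MPa], assumed
    cycles = np.array([1e6, 2e5, 5e4, 1e4, 1e3])          # counted cycles per bin, assumed

    # Assumed S-N curve: N(S) = K * S**(-m) cycles to failure at stress range S.
    K, m = 1e12, 3.0
    n_to_failure = K * ranges ** (-m)

    # Palmgren-Miner linear damage accumulation: failure expected when D reaches ~1.
    damage = np.sum(cycles / n_to_failure)
    print(f"accumulated damage D = {damage:.3f}")
    ```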

  14. On the Feasibility of a Network Coded Mobile Storage Cloud

    DEFF Research Database (Denmark)

    Sipos, Marton; H. P. Fitzek, Frank; Lucani Rötter, Daniel Enrique

    2015-01-01

    Conventional cloud storage services offer relatively good reliability and performance in a cost-effective manner. However, they are typically structured in a centralized and highly controlled fashion. In more dynamic storage scenarios, these centralized approaches are unfeasible and developing...... decentralized storage approaches becomes critical. The novelty of this paper is the introduction of the highly dynamic distributed mobile cloud, which uses free resources on user devices to move storage to the edges of the network. At the core of our approach, lies the use of random linear network coding...... to provide an effective and flexible erasure correcting code. This paper identifies and answers key questions regarding the feasibility of such a system. We show that the mobile cloud has sufficient network resources to adapt to changes in node numbers and also study the redundancy level needed to maintain...
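
    A minimal sketch of the random linear network coding idea mentioned above, here over GF(2): coded packets are random XOR combinations of the source packets, and the data are recoverable whenever the collected coefficient vectors reach full rank. The field choice, packet sizes and counts are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def rlnc_encode(packets, n_coded):
        """XOR random subsets of the source packets (RLNC over GF(2))."""
        k = len(packets)
        coeffs = rng.integers(0, 2, size=(n_coded, k), dtype=np.uint8)
        coded = (coeffs.astype(int) @ packets.astype(int)) % 2   # GF(2) combinations
        return coeffs, coded.astype(np.uint8)

    def gf2_rank(a):
        """Rank over GF(2) by Gaussian elimination; full rank k means decodable."""
        a = a.copy() % 2
        rank, rows, cols = 0, a.shape[0], a.shape[1]
        for col in range(cols):
            pivot = next((r for r in range(rank, rows) if a[r, col]), None)
            if pivot is None:
                continue
            a[[rank, pivot]] = a[[pivot, rank]]
            for r in range(rows):
                if r != rank and a[r, col]:
                    a[r] ^= a[rank]
            rank += 1
        return rank

    k, packet_bits = 8, 64
    packets = rng.integers(0, 2, size=(k, packet_bits), dtype=np.uint8)
    coeffs, coded = rlnc_encode(packets, n_coded=12)             # some redundancy
    print("decodable:", gf2_rank(coeffs) == k)
    ```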

  15. A database of linear codes over F_13 with minimum distance bounds and new quasi-twisted codes from a heuristic search algorithm

    Directory of Open Access Journals (Sweden)

    Eric Z. Chen

    2015-01-01

    Full Text Available Error control codes have been widely used in data communications and storage systems. One central problem in coding theory is to optimize the parameters of a linear code and construct codes with the best possible parameters. There are tables of best-known linear codes over finite fields of sizes up to 9. Recently, there has been a growing interest in codes over $\mathbb{F}_{13}$ and other fields of size greater than 9. The main purpose of this work is to present a database of best-known linear codes over the field $\mathbb{F}_{13}$ together with upper bounds on the minimum distances. To find good linear codes to establish lower bounds on minimum distances, an iterative heuristic computer search algorithm is employed to construct quasi-twisted (QT) codes over the field $\mathbb{F}_{13}$ with high minimum distances. A large number of new linear codes have been found, improving previously best-known results. Tables of $[pm, m]$ QT codes over $\mathbb{F}_{13}$ with best-known minimum distances, as well as a table of lower and upper bounds on the minimum distances for linear codes of length up to 150 and dimension up to 6, are presented.

  16. ANNA: A Convolutional Neural Network Code for Spectroscopic Analysis

    Science.gov (United States)

    Lee-Brown, Donald; Anthony-Twarog, Barbara J.; Twarog, Bruce A.

    2018-01-01

    We present ANNA, a Python-based convolutional neural network code for the automated analysis of stellar spectra. ANNA provides a flexible framework that allows atmospheric parameters such as temperature and metallicity to be determined with accuracies comparable to those of established but less efficient techniques. ANNA performs its parameterization extremely quickly; typically several thousand spectra can be analyzed in less than a second. Additionally, the code incorporates features which greatly speed up the training process necessary for the neural network to measure spectra accurately, resulting in a tool that can easily be run on a single desktop or laptop computer. Thus, ANNA is useful in an era when spectrographs increasingly have the capability to collect dozens to hundreds of spectra each night. This talk will cover the basic features included in ANNA and demonstrate its performance in two use cases: an open cluster abundance analysis involving several hundred spectra, and a metal-rich field star study. Applicability of the code to large survey datasets will also be discussed.
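
    ANNA's actual architecture is not described in this record; the sketch below is a generic, minimal 1D convolutional network in PyTorch that maps a spectrum to two atmospheric parameters, with the layer sizes and the (Teff, [Fe/H]) output chosen purely for illustration.

    ```python
    import torch
    import torch.nn as nn

    class SpectrumCNN(nn.Module):
        """Toy 1D CNN regressor: spectrum of n_pixels fluxes -> (Teff, [Fe/H])."""
        def __init__(self, n_pixels=4000):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 8, kernel_size=11, padding=5), nn.ReLU(), nn.MaxPool1d(4),
                nn.Conv1d(8, 16, kernel_size=11, padding=5), nn.ReLU(), nn.MaxPool1d(4),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(16 * (n_pixels // 16), 64), nn.ReLU(),
                nn.Linear(64, 2),                    # (Teff, [Fe/H]) in scaled units
            )

        def forward(self, x):                        # x: (batch, n_pixels)
            return self.head(self.features(x.unsqueeze(1)))

    model = SpectrumCNN()
    batch = torch.randn(32, 4000)                    # 32 synthetic spectra
    print(model(batch).shape)                        # torch.Size([32, 2])
    ```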

  17. The genetic code as a periodic table: algebraic aspects.

    Science.gov (United States)

    Bashford, J D; Jarvis, P D

    2000-01-01

    The systematics of indices of physico-chemical properties of codons and amino acids across the genetic code are examined. Using a simple numerical labelling scheme for nucleic acid bases, A=(-1,0), C=(0,-1), G=(0,1), U=(1,0), data can be fitted as low order polynomials of the six coordinates in the 64-dimensional codon weight space. The work confirms and extends the recent studies by Siemion et al. (1995. BioSystems 36, 231-238) of the conformational parameters. Fundamental patterns in the data such as codon periodicities, and related harmonics and reflection symmetries, are here associated with the structure of the set of basis monomials chosen for fitting. Results are plotted using the Siemion one-step mutation ring scheme, and variants thereof. The connections between the present work, and recent studies of the genetic code structure using dynamical symmetry algebras, are pointed out.
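
    The numerical labelling scheme quoted above can be made concrete in a few lines of Python: each codon is mapped to the six coordinates obtained by concatenating the (x, y) labels of its three bases.

    ```python
    # Base labels from the abstract: A=(-1,0), C=(0,-1), G=(0,1), U=(1,0).
    BASE_COORDS = {"A": (-1, 0), "C": (0, -1), "G": (0, 1), "U": (1, 0)}

    def codon_coords(codon):
        """Six-coordinate vector of a codon in the 64-dimensional codon weight space."""
        return tuple(c for base in codon for c in BASE_COORDS[base])

    print(codon_coords("AUG"))   # (-1, 0, 1, 0, 0, 1)
    print(codon_coords("GGC"))   # (0, 1, 0, 1, 0, -1)
    ```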

  18. A CFD code comparison of wind turbine wakes

    DEFF Research Database (Denmark)

    Laan, van der, Paul Maarten; Storey, R. C.; Sørensen, Niels N.

    2014-01-01

    A comparison is made between the EllipSys3D and SnS CFD codes. Both codes are used to perform Large-Eddy Simulations (LES) of single wind turbine wakes, using the actuator disk method. The comparison shows that both LES models predict similar velocity deficits and stream-wise Reynolds...... simulations using EllipSys3D for a test case that is based on field measurements. In these simulations, two eddy viscosity turbulence models are employed: the k- (ε) model and the k- (ε)-fp model. Where the k- (ε) model fails to predict the velocity deficit, the results of the k- (ε)-fP model show good...

  19. DNA as a Binary Code: How the Physical Structure of Nucleotide Bases Carries Information

    Science.gov (United States)

    McCallister, Gary

    2005-01-01

    The DNA triplet code also functions as a binary code. Because double-ring compounds cannot bind to double-ring compounds in the DNA code, the sequence of bases classified simply as purines or pyrimidines can encode for smaller groups of possible amino acids. This is an intuitive approach to teaching the DNA code. (Contains 6 figures.)
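
    The purine/pyrimidine binary reading can be illustrated as follows: classifying each base only as R (purine) or Y (pyrimidine) collapses the 64 codons into 8 binary patterns, each compatible with a reduced set of amino acids. The standard codon table built below is included only for the illustration.

    ```python
    from itertools import product
    from collections import defaultdict

    PURINES = {"A", "G"}           # the double-ring bases
    BASES = "UCAG"

    # Standard genetic code; codon order: each of the three positions runs over U, C, A, G.
    AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODON_TABLE = {a + b + c: AA[i] for i, (a, b, c) in enumerate(product(BASES, repeat=3))}

    def ry_pattern(codon):
        """Binary purine/pyrimidine (R/Y) pattern of a codon, e.g. AUG -> 'RYR'."""
        return "".join("R" if b in PURINES else "Y" for b in codon)

    groups = defaultdict(set)
    for codon, aa in CODON_TABLE.items():
        groups[ry_pattern(codon)].add(aa)

    for pattern, aas in sorted(groups.items()):
        print(pattern, sorted(aas))   # each of the 8 patterns maps to a reduced amino acid set
    ```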

  20. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    Directory of Open Access Journals (Sweden)

    Tracy Waize

    2007-09-01

    Conclusions: Current systems for clinical coding promote diversity rather than consistency. As the UK moves towards an integrated health IT system, consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  1. 76 FR 39039 - Establishment of a New Drug Code for Marihuana Extract

    Science.gov (United States)

    2011-07-05

    ... Enforcement Administration 21 CFR Part 1308 RIN 1117-AB33 Establishment of a New Drug Code for Marihuana... Controlled Substances Code Number (``Code Number'' or ``drug code'') under 21 CFR 1308.11 for ``Marihuana... material separately from quantities of marihuana. This in turn will aid in complying with relevant treaty...

  2. Machine Code Verification of a Tiny ARM Hypervisor

    OpenAIRE

    Dam, Mads; Guanciale, Roberto; Nemati, Hamed

    2013-01-01

    Hypervisors are low-level execution platforms that provide isolated partitions on shared resources, allowing secure systems to be designed without using dedicated hardware devices. A key requirement of this kind of solution is the formal verification of the software trusted computing base, preferably at the binary level. We accomplish a detailed verification of an ARMv7 tiny hypervisor, proving its correctness at the machine code level. We present our verification strategy, which mixes the usage of ...

  3. A Radiation Solver for the National Combustion Code

    Science.gov (United States)

    Sockol, Peter M.

    2015-01-01

    A methodology is given that converts an existing finite-volume radiative transfer method that requires input of local absorption coefficients into one that can treat a mixture of combustion gases and compute the coefficients on the fly from the local mixture properties. The full-spectrum k-distribution method is used to transform the radiative transfer equation (RTE) to an alternate wave number variable, g. The coefficients in the transformed equation are calculated at discrete temperatures and participating-species mole fractions that span the values of the problem for each value of g. These results are stored in a table, and interpolation is used to find the coefficients at every cell in the field. Finally, the transformed RTE is solved for each g, and Gaussian quadrature is used to find the radiant heat flux throughout the field. The present implementation is in an existing Cartesian/cylindrical-grid radiative transfer code, and the local mixture properties are given by a solution of the National Combustion Code (NCC) on the same grid. Based on this work, the intention is to apply this method to an existing unstructured-grid radiation code, which can then be coupled directly to NCC.

  4. DMC (Distinct Motion Code): A rigid body motion code for determining the interaction of multiple spherical particles

    Science.gov (United States)

    Taylor, L. M.; Preece, D. S.

    1989-07-01

    The computer program Distinct Motion Code (DMC) determines the two-dimensional planar rigid body motion of an arbitrary number of spherical shaped particles. The code uses an explicit central difference time integration algorithm to calculate the motion of the particles. Contact constraints between the particles are enforced using the penalty method. Coulomb friction and viscous damping are included in the collisions. The explicit time integration is conditionally stable with a time increment size which is dependent on the mass of the smallest particle in the mesh and the penalty stiffness used for the contact forces. The code chooses the spring stiffness based on the Young's modulus and Poisson's ratio of the material. The ability to tie spheres in pairs with a constraint condition is included in the code. The code has been written in an extremely efficient manner with particular emphasis placed on vector processing. While this does not impose any restrictions on non-vector processing computers, it does provide extremely fast results on vector processing computers. A bucket sorting or boxing algorithm is used to reduce the number of comparisons which must be made between spheres to determine the contact pairs. The sorting algorithm is completely algebraic and contains no logical branching.
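
    A minimal one-dimensional sketch of the ingredients described above: explicit (leapfrog-style) central-difference time integration, a penalty-spring contact force with viscous damping, and a time step scaled by the smallest particle mass and the penalty stiffness. The masses, stiffness and damping values are illustrative, not DMC's internal choices.

    ```python
    import numpy as np

    # Two spheres approaching each other along a line; contact handled by a penalty spring.
    radius = np.array([0.5, 0.5])
    mass = np.array([1.0, 1.0])
    x = np.array([0.0, 2.0])                 # centre positions
    v = np.array([1.0, -1.0])                # initial velocities (head-on)
    k_pen, c_damp = 1.0e4, 5.0               # penalty stiffness and viscous damping (assumed)

    # Stable explicit step scales like sqrt(m_min / k) (cf. the stability remark above).
    dt = 0.1 * np.sqrt(mass.min() / k_pen)

    for step in range(4000):
        gap = (x[1] - x[0]) - (radius[0] + radius[1])
        if gap < 0.0:                         # overlap -> repulsive penalty force plus damping
            f = -k_pen * gap - c_damp * (v[1] - v[0])   # force on sphere 1 along +x
            forces = np.array([-f, +f])
        else:
            forces = np.zeros(2)
        v += dt * forces / mass               # velocity update
        x += dt * v                           # position update with the new velocity (leapfrog flavour)
    print("final velocities:", v)             # approximately reversed, slightly reduced by damping
    ```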

  5. Evaluation of coded aperture radiation detectors using a Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Kyle, E-mail: mille856@andrew.cmu.edu [Auton Lab, The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Huggins, Peter [Auton Lab, The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Labov, Simon; Nelson, Karl [Lawrence Livermore National Laboratory, Livermore, CA (United States); Dubrawski, Artur [Auton Lab, The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States)

    2016-12-11

    We investigate tradeoffs arising from the use of coded aperture gamma-ray spectrometry to detect and localize sources of harmful radiation in the presence of noisy background. Using an example application scenario of area monitoring and search, we empirically evaluate weakly supervised spectral, spatial, and hybrid spatio-spectral algorithms for scoring individual observations, and two alternative methods of fusing evidence obtained from multiple observations. Results of our experiments confirm the intuition that directional information provided by spectrometers masked with a coded aperture enables gains in source localization accuracy, but at the expense of a reduced probability of detection. Losses in detection performance can, however, be reclaimed to a substantial extent by using our new spatial and spatio-spectral scoring methods, which rely on realistic assumptions regarding masking and its impact on measured photon distributions.

  6. Tartarus: A relativistic Green's function quantum average atom code

    Science.gov (United States)

    Gill, N. M.; Starrett, C. E.

    2017-09-01

    A relativistic Green's function quantum average atom model is implemented in the Tartarus code for the calculation of equation-of-state data in dense plasmas. We first present the relativistic extension of the quantum Green's function average atom model described by Starrett [1]. The Green's function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance-tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparison to PIMC/DFT-MD simulations of the principal shock Hugoniot of silicon. We also present the calculation of the Hugoniot for silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.

  7. DANTSYS: A diffusion accelerated neutral particle transport code system

    Energy Technology Data Exchange (ETDEWEB)

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O`Dell, R.D.; Walters, W.F.

    1995-06-01

    The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZΘ symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond differencing method with set-to-zero fixup, with changes to accommodate the generalized spatial meshing.

  8. 2D Implosion Simulations with a Kinetic Particle Code

    CERN Document Server

    Sagert, Irina; Strother, Terrance T

    2016-01-01

    We perform two-dimensional (2D) implosion simulations using a Monte Carlo kinetic particle code. The paper is motivated by the importance of non-equilibrium effects in inertial confinement fusion (ICF) capsule implosions. These cannot be fully captured by hydrodynamic simulations, while kinetic methods, such as the one presented in this study, are able to describe continuum and rarefied regimes within one approach. In the past, our code has been verified via traditional shock wave and fluid instability simulations. In the present work, we focus on setups that are closer to applications in ICF. We perform simple 2D disk implosion simulations using one particle species. The obtained results are compared to simulations using the hydrodynamics code RAGE. In a first study, the implosions are powered by energy deposition in the outer layers of the disk. We test the impact of the particle mean free path and find that while the width of the implosion shock broadens, its location as a function of time remains very similar. ...

  9. Concatenated codes with convolutional inner codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Zyablov, Viktor

    1988-01-01

    The minimum distance of concatenated codes with Reed-Solomon outer codes and convolutional inner codes is studied. For suitable combinations of parameters the minimum distance can be lower-bounded by the product of the minimum distances of the inner and outer codes. For a randomized ensemble...... of concatenated codes a lower bound of the Gilbert-Varshamov type is proved...
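
    As a hedged illustration of the product bound stated above (a standard result, written here for an outer Reed-Solomon code of length N and dimension K combined with an inner convolutional code of free distance d_f):

    ```latex
    \[
      d_{\min} \;\ge\; d_O \, d_f \;=\; (N - K + 1)\, d_f .
    \]
    ```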

  10. Development of a code clone search tool for open source repositories

    OpenAIRE

    Xia, Pei; Yoshida, Norihiro; Manabe, Yuki; Inoue, Katsuro

    2011-01-01

    Finding code clones in open source systems is one of the important and demanding features for efficient and safe reuse of existing open source software. In this paper, we propose a novel search model, open code clone search, to explore code clones in open source repositories on the Internet. Based on this search model, we have designed and implemented a prototype system named Open CCFinder. This system takes a query code fragment as its input, and returns the code fragments containing the cod...

  11. A New Method Of Gene Coding For A Genetic Algorithm Designed For Parametric Optimization

    Directory of Open Access Journals (Sweden)

    Radu BELEA

    2003-12-01

    Full Text Available In a parametric optimization problem the genes code the real parameters of the fitness function. There are two coding techniques, known as binary-coded genes and real-coded genes. The comparison between the two has been a controversial subject since the first papers on parametric optimization appeared. An objective analysis of the advantages and disadvantages of the two coding techniques is difficult to carry out when information in different formats is being compared. The present paper suggests a gene coding technique that uses the same format for both binary-coded and real-coded genes. After unifying the representation of the real parameters, the following criterion is applied: the differences between the two techniques are statistically measured by the effect of the genetic operators on a set of randomly generated individuals.
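
    The two coding techniques under comparison can be sketched as follows: a binary-coded gene quantizes a real parameter into an N-bit string on which bit-level operators act, while a real-coded gene stores the float directly and uses real-valued operators. The bit width, bounds and operator choices are illustrative assumptions.

    ```python
    import random

    N_BITS = 16

    def encode_binary(value, lo, hi, n_bits=N_BITS):
        """Quantize a real parameter in [lo, hi] to an n-bit gene (list of 0/1, MSB first)."""
        scaled = round((value - lo) / (hi - lo) * (2**n_bits - 1))
        return [(scaled >> i) & 1 for i in reversed(range(n_bits))]

    def decode_binary(bits, lo, hi):
        """Recover the (quantized) real parameter from a binary gene."""
        scaled = int("".join(map(str, bits)), 2)
        return lo + scaled / (2**len(bits) - 1) * (hi - lo)

    # Binary-coded gene: genetic operators act on bits (e.g., bit-flip mutation).
    gene = encode_binary(3.14159, lo=-5.0, hi=5.0)
    i = random.randrange(len(gene))
    gene[i] ^= 1                                   # bit-flip mutation
    print(decode_binary(gene, -5.0, 5.0))

    # Real-coded gene: operators act on the float itself (e.g., Gaussian mutation).
    x = 3.14159
    x += random.gauss(0.0, 0.1)
    print(x)
    ```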

  12. A Cooperative Downloading Method for VANET Using Distributed Fountain Code.

    Science.gov (United States)

    Liu, Jianhang; Zhang, Wenbin; Wang, Qi; Li, Shibao; Chen, Haihua; Cui, Xuerong; Sun, Yi

    2016-10-12

    Cooperative downloading is one of the effective methods to improve the amount of downloaded data in vehicular ad hoc networking (VANET). However, poor channel quality and short encounter times bring about a high packet loss rate, which decreases transmission efficiency and fails to satisfy the high quality of service (QoS) requirements of some applications. Digital fountain codes (DFC) can be utilized in the field of wireless communication to increase transmission efficiency. For cooperative forwarding, however, the processing delay from frequent coding and decoding, as well as the single feedback mechanism of DFC, cannot adapt to the VANET environment. In this paper, a cooperative downloading method for VANET using concatenated DFC is proposed to solve the problems above. The source vehicle and cooperative vehicles encode the raw data using a hierarchical fountain code before sending it to the client directly or indirectly. Although some packets may be lost, the client can recover the raw data as long as it receives enough encoded packets. The method avoids data retransmission due to packet loss. Furthermore, the concatenated feedback mechanism in the method reduces the transmission delay effectively. Simulation results indicate the benefits of the proposed scheme in terms of an increased amount of downloaded data and a higher data receiving rate.
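
    A minimal sketch of generic digital-fountain (LT-style) encoding, not the paper's hierarchical/concatenated scheme: each coded packet is the XOR of a random subset of source blocks whose size is drawn from a degree distribution (the ideal soliton distribution is assumed here for simplicity).

    ```python
    import random

    def ideal_soliton(k):
        """Ideal soliton degree distribution over degrees 1..k (illustrative choice)."""
        return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

    def lt_encode(blocks, n_packets, seed=0):
        """Generate LT-coded packets as (chosen indices, XOR of the chosen blocks)."""
        k = len(blocks)
        rng = random.Random(seed)
        degrees = list(range(1, k + 1))
        weights = ideal_soliton(k)
        packets = []
        for _ in range(n_packets):
            d = rng.choices(degrees, weights=weights)[0]
            idx = rng.sample(range(k), d)
            payload = 0
            for i in idx:
                payload ^= blocks[i]              # XOR of the selected source blocks
            packets.append((idx, payload))
        return packets

    # Toy source data: k blocks, each an integer standing in for a data chunk.
    blocks = [0x1F, 0x2A, 0x33, 0x47, 0x58, 0x6C, 0x71, 0x85]
    for idx, payload in lt_encode(blocks, n_packets=5):
        print(sorted(idx), hex(payload))
    ```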

  13. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  14. The "periodic table" of the genetic code: A new way to look at the code and the decoding process.

    Science.gov (United States)

    Komar, Anton A

    2016-01-01

    Henri Grosjean and Eric Westhof recently presented an information-rich, alternative view of the genetic code, which takes into account current knowledge of the decoding process, including the complex nature of interactions between mRNA, tRNA and rRNA that take place during protein synthesis on the ribosome, and it also better reflects the evolution of the code. The new asymmetrical circular genetic code has a number of advantages over the traditional codon table and the previous circular diagrams (with a symmetrical/clockwise arrangement of the U, C, A, G bases). Most importantly, all sequence co-variances can be visualized and explained based on the internal logic of the thermodynamics of codon-anticodon interactions.

  15. A new neutron energy spectrum unfolding code using a two steps genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Shahabinejad, H., E-mail: shahabinejad1367@yahoo.com; Hosseini, S.A.; Sohrabpour, M.

    2016-03-01

    A new neutron spectrum unfolding code, TGASU (Two-steps Genetic Algorithm Spectrum Unfolding), has been developed to unfold the neutron spectrum from a pulse height distribution which was calculated using the MCNPX-ESUT computational Monte Carlo code. To perform the unfolding process, the response matrices were generated using the MCNPX-ESUT computational code. Both a one-step (common GA) and a two-step GA have been implemented to unfold the neutron spectra. According to the obtained results, the new two-step GA code has shown a closer match in all energy regions and particularly in the high energy regions. The results of the TGASU code have been compared with those of the standard spectra, the LSQR method and the GAMCD code, and have been demonstrated to be more accurate than those of the existing computational codes for both under-determined and over-determined problems.

  16. Porting of a serial molecular dynamics code on MIMD platforms

    Energy Technology Data Exchange (ETDEWEB)

    Celino, M. [ENEA Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). HPCN Project

    1999-07-01

    A molecular dynamics (MD) code, utilized for the study of atomistic models of metallic systems, has been parallelized for MIMD (multiple instruction, multiple data) parallel platforms by means of the parallel virtual machine (PVM) message passing library. Since the parallelization implies modifications of the sequential algorithms, these are described from the point of view of statistical mechanical theory. Furthermore, the techniques and parallelization strategies utilized and the parallel MD code are described in detail. Benchmarks on several MIMD platforms (IBM SP1, SP2, Cray T3D, cluster of workstations) allow evaluation of the performance of the code with respect to the different characteristics of the parallel platforms.

  17. Numerical simulations of hydrodynamic instabilities: perturbation codes Pansy, Perle, and 2D code Chic applied to a realistic LIL target

    Energy Technology Data Exchange (ETDEWEB)

    Hallo, L.; Olazabal-Loume, M.; Maire, P.H.; Breil, J.; Schurtz, G. [CELIA, 33 - Talence (France); Morse, R.L. [Arizona Univ., Dept. of Nuclear Engineering, Tucson (United States)

    2006-06-15

    This paper deals with simulations of ablation front instabilities in the context of direct-drive inertial confinement fusion. A simplified deuterium-tritium target, representative of a realistic target on the LIL (laser integration line at the Megajoule laser facility), is considered. We describe here two numerical approaches: the linear perturbation method using the perturbation codes Perle (planar) and Pansy (spherical), and the direct simulation method using our two-dimensional hydrodynamic code Chic. Our work shows a good behaviour of all methods, even for large wavenumbers, during the acceleration phase of the ablation front. We also point out a good agreement between model and numerical predictions at the ablation front during the shock wave transit.

  18. Divergence coding for convolutional codes

    Directory of Open Access Journals (Sweden)

    Valery Zolotarev

    2017-01-01

    Full Text Available In this paper we propose a new coding/decoding scheme based on the divergence principle. A new divergent multithreshold decoder (MTD) for convolutional self-orthogonal codes contains two threshold elements. The second threshold element decodes the code with a code distance one greater than that of the first threshold element. The error-correcting capability of the new MTD modification is higher than that of the traditional MTD. Simulation results show that the performance of the divergent schemes brings their region of effective operation approximately 0.5 dB closer to channel capacity. Note that if a sufficiently effective Viterbi decoder is included instead of the first threshold element, the divergence principle can achieve even more. Index Terms — error-correcting coding, convolutional code, decoder, multithreshold decoder, Viterbi algorithm.

  19. A novel method of generating and remembering international morse codes

    Digital Repository Service at National Institute of Oceanography (India)

    Charyulu, R.J.K.

    Untethered communications have advanced considerably; nevertheless, the International Morse Code (e.g., the S.O.S. signal) remains a rescue tool of last resort when all other modes fail. The details of the method and the actual codes have been enumerated....

  20. 28 CFR Appendix A to Part 812 - Qualifying District of Columbia Code Offenses

    Science.gov (United States)

    2010-07-01

    ... the District of Columbia (kidnapping); (16) Section 798 of An Act To establish a code of law for the...-2012—sexual performances using minors; (15) D.C. Code section 22-2101—kidnapping; (16) D.C. Code...—kidnapping; (14) D.C. Code section 22-2101—murder in the first degree; (15) D.C. Code section 22-2102—murder...

  1. A primer on physical-layer network coding

    CERN Document Server

    Liew, Soung Chang; Zhang, Shengli

    2015-01-01

    The concept of physical-layer network coding (PNC) was proposed in 2006 for application in wireless networks. Since then it has developed into a subfield of communications and networking with a wide following. This book is a primer on PNC. It is the outcome of a set of lecture notes for a course for beginning graduate students at The Chinese University of Hong Kong. The target audience is expected to have some prior background knowledge in communication theory and wireless communications, but not working knowledge at the research level. Indeed, a goal of this book/course is to allow the reader

  2. Surface code error correction on a defective lattice

    Science.gov (United States)

    Nagayama, Shota; Fowler, Austin G.; Horsman, Dominic; Devitt, Simon J.; Van Meter, Rodney

    2017-02-01

    The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults must negatively affect the computation, we can deal with them by adapting error-correction schemes. In this paper we have simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al.'s superplaquettes solution against dynamic losses for the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a less negative effect than a static loss at the center. The randomly faulty analysis shows that 95% yield is good enough to build a large-scale quantum computer. The local gate error rate threshold is ~0.3%, and a code distance of seven suppresses the residual error rate below the original error rate at p = 0.1%. A 90% yield is also good enough when we discard badly fabricated quantum computation chips, while an 80% yield does not show enough error suppression even when discarding 90% of the chips. We evaluated several metrics for predicting chip performance, and found that the average of the product of the number of data qubits and the cycle time of a stabilizer measurement gave the strongest correlation with logical error rates. Our analysis will help with selecting usable quantum computation chips from among the pool of all fabricated chips.

  3. A computer code for analysis of severe accidents in LWRs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The ICARE2 computer code, developed and validated since 1988 at IPSN (nuclear safety and protection institute), calculates in a mechanistic way the physical and chemical phenomena involved in the core degradation process during possible severe accidents in LWRs. The coupling between ICARE2 and the best-estimate thermal-hydraulics code CATHARE2 was completed at IPSN and led to the release of a first ICARE/CATHARE V1 version in 1999, followed by 2 successive revisions in 2000 and 2001. This document gathers all the contributions presented at the first international ICARE/CATHARE users' club seminar that took place in November 2001. This seminar was characterized by a high quality and variety of the presentations, showing an increase of reactor applications and user needs in this area (2D/3D aspects, reflooding, corium slumping into the lower head,...). Two sessions were organized. The first one was dedicated to the applications of ICARE2 V3mod1 against small-scale experiments such as PHEBUS FPT2 and FPT3 tests, PHEBUS AIC, QUENCH experiments, NRU-FLHT-5 test, ACRR-MP1 and DC1 experiments, CORA-PWR tests, and PBF-SFD1.4 test. The second session involved ICARE/CATHARE V1mod1 reactor applications and users' guidelines. Among reactor applications we found: code applicability to high burn-up fuel rods, simulation of the TMI-2 transient, simulation of a PWR-900 high pressure severe accident sequence, and the simulation of a VVER-1000 large break LOCA scenario. (A.C.)

  4. A simple histone code opens many paths to epigenetics.

    Directory of Open Access Journals (Sweden)

    Kim Sneppen

    Full Text Available Nucleosomes can be covalently modified by addition of various chemical groups on several of their exposed histone amino acids. These modifications are added and removed by enzymes (writers) and can be recognized by nucleosome-binding proteins (readers). Linking a reader domain and a writer domain that recognize and create the same modification state should allow nucleosomes in a particular modification state to recruit enzymes that create that modification state on nearby nucleosomes. This positive feedback has the potential to provide the alternative stable and heritable states required for epigenetic memory. However, analysis of simple histone codes involving interconversions between only two or three types of modified nucleosomes has revealed only a few circuit designs that allow heritable bistability. Here we show by computer simulations that a histone code involving alternative modifications at two histone positions, producing four modification states, combined with reader-writer proteins able to distinguish these states, allows for hundreds of different circuits capable of heritable bistability. These expanded possibilities result from multiple ways of generating two-step cooperativity in the positive feedback--through alternative pathways and an additional, novel cooperativity motif. Our analysis reveals other properties of such epigenetic circuits. They are most robust when the dominant nucleosome types are different at both modification positions and are not the type inserted after DNA replication. The dominant nucleosome types often recruit enzymes that create their own type or destroy the opposing type, but never catalyze their own destruction. The circuits appear to be evolutionarily accessible; most circuits can be changed stepwise into almost any other circuit without losing heritable bistability. Thus, our analysis indicates that systems that utilize an expanded histone code have huge potential for generating stable and heritable

  5. A simple histone code opens many paths to epigenetics.

    Science.gov (United States)

    Sneppen, Kim; Dodd, Ian B

    2012-01-01

    Nucleosomes can be covalently modified by addition of various chemical groups on several of their exposed histone amino acids. These modifications are added and removed by enzymes (writers) and can be recognized by nucleosome-binding proteins (readers). Linking a reader domain and a writer domain that recognize and create the same modification state should allow nucleosomes in a particular modification state to recruit enzymes that create that modification state on nearby nucleosomes. This positive feedback has the potential to provide the alternative stable and heritable states required for epigenetic memory. However, analysis of simple histone codes involving interconversions between only two or three types of modified nucleosomes has revealed only a few circuit designs that allow heritable bistability. Here we show by computer simulations that a histone code involving alternative modifications at two histone positions, producing four modification states, combined with reader-writer proteins able to distinguish these states, allows for hundreds of different circuits capable of heritable bistability. These expanded possibilities result from multiple ways of generating two-step cooperativity in the positive feedback--through alternative pathways and an additional, novel cooperativity motif. Our analysis reveals other properties of such epigenetic circuits. They are most robust when the dominant nucleosome types are different at both modification positions and are not the type inserted after DNA replication. The dominant nucleosome types often recruit enzymes that create their own type or destroy the opposing type, but never catalyze their own destruction. The circuits appear to be evolutionarily accessible; most circuits can be changed stepwise into almost any other circuit without losing heritable bistability. Thus, our analysis indicates that systems that utilize an expanded histone code have huge potential for generating stable and heritable nucleosome
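
    A toy stochastic simulation in the spirit of the reader-writer positive feedback described above, using the simpler three-state (M/U/A) caricature rather than the four-state, two-position code analyzed in the paper; the region size, feedback strength and update rules are illustrative assumptions.

    ```python
    import random

    random.seed(3)

    N = 60                       # nucleosomes in the region
    F = 4.0                      # recruited-to-noisy conversion ratio (feedback strength)
    p_recruit = F / (F + 1.0)

    # States: -1 = "M" (one modification), 0 = unmodified, +1 = "A" (the opposing one).
    state = [-1] * N             # start fully in the M state

    def step(state):
        i = random.randrange(N)
        if random.random() < p_recruit:
            j = random.randrange(N)              # recruiting nucleosome
            if state[j] != 0 and i != j and state[i] != state[j]:
                # Reader-writer feedback: move nucleosome i one step toward state[j].
                state[i] += 1 if state[j] > state[i] else -1
        else:
            # Noisy conversion: one random step up or down, clipped to [-1, 1].
            state[i] = max(-1, min(1, state[i] + random.choice((-1, 1))))

    for sweep in range(20000):
        step(state)
        if sweep % 5000 == 0:
            m = sum(s == -1 for s in state)
            a = sum(s == +1 for s in state)
            print(f"sweep {sweep:6d}:  M={m:2d}  U={N - m - a:2d}  A={a:2d}")
    ```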

  6. Using a rapidly identifiable access code system in the OR.

    Science.gov (United States)

    Kastner, D G; Weingarten, L

    1987-01-01

    To ensure that this system works, each staff member makes an entry into the computer or on the master chart when an item is taken from the supply area. He or she is also expected to check for outdated instrumentation and proper placement on the shelf. The coding system has increased the staff's organization and productivity. It has been successful because it uses a numerical system instead of a memory-based system, and because all instrumentation is categorized and stored according to specialty. The simplicity of the system, which allows for quicker access to instrumentation, also makes it inexpensive to implement.

  7. Performance of sparse graph codes on a four-dimensional CDMA System in AWGN and multipath fading

    CSIR Research Space (South Africa)

    Vlok, JD

    2007-09-01

    Full Text Available ) communication platform. The channel codes include a 3D block-turbo-code (BTC) with extended Reed-Muller (RM) constituent codes, low-density parity-check (LDPC) codes and repeat-accumulate (RA) codes. It is shown that the three channel codes have comparable error...

  8. RAM: a Relativistic Adaptive Mesh Refinement Hydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wei-Qun; /KIPAC, Menlo Park; MacFadyen, Andrew I.; /Princeton, Inst. Advanced Study

    2005-06-06

    The authors have developed a new computer code, RAM, to solve the conservative equations of special relativistic hydrodynamics (SRHD) using adaptive mesh refinement (AMR) on parallel computers. They have implemented a characteristic-wise, finite difference, weighted essentially non-oscillatory (WENO) scheme using the full characteristic decomposition of the SRHD equations to achieve fifth-order accuracy in space. For time integration they use the method of lines with a third-order total variation diminishing (TVD) Runge-Kutta scheme. They have also implemented fourth and fifth order Runge-Kutta time integration schemes for comparison. The implementation of AMR and parallelization is based on the FLASH code. RAM is modular and includes the capability to easily swap hydrodynamics solvers, reconstruction methods and physics modules. In addition to WENO they have implemented a finite volume module with the piecewise parabolic method (PPM) for reconstruction and the modified Marquina approximate Riemann solver to work with TVD Runge-Kutta time integration. They examine the difficulty of accurately simulating shear flows in numerical relativistic hydrodynamics codes. They show that under-resolved simulations of simple test problems with transverse velocity components produce incorrect results and demonstrate the ability of RAM to correctly solve these problems. RAM has been tested in one, two and three dimensions and in Cartesian, cylindrical and spherical coordinates. They have demonstrated fifth-order accuracy for WENO in one and two dimensions and performed detailed comparison with other schemes for which they show significantly lower convergence rates. Extensive testing is presented demonstrating the ability of RAM to address challenging open questions in relativistic astrophysics.
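
    The third-order TVD (Shu-Osher) Runge-Kutta update mentioned above is short enough to spell out; the first-order upwind advection operator below is only a placeholder for the code's WENO/PPM spatial reconstruction.

    ```python
    import numpy as np

    def rhs(u, dx, a=1.0):
        """Placeholder spatial operator: first-order upwind advection, du/dt = -a du/dx."""
        return -a * (u - np.roll(u, 1)) / dx

    def tvd_rk3_step(u, dt, dx):
        """Third-order TVD (Shu-Osher) Runge-Kutta step."""
        u1 = u + dt * rhs(u, dx)
        u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1, dx))
        return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2, dx))

    # Advect a smooth bump once around a periodic domain.
    nx = 200
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    dx = x[1] - x[0]
    u = np.exp(-200.0 * (x - 0.5) ** 2)
    dt = 0.4 * dx                               # CFL-limited step for a = 1
    for _ in range(int(1.0 / dt)):
        u = tvd_rk3_step(u, dt, dx)
    print("max after one pass:", u.max())
    ```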

  9. A novel neutron energy spectrum unfolding code using particle swarm optimization

    Science.gov (United States)

    Shahabinejad, H.; Sohrabpour, M.

    2017-07-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and of the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code had previously been compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO code has been shown to be nearly two times faster than the TGASU code.
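
    A minimal global-best PSO sketch of the unfolding idea: particles search for a non-negative spectrum that minimizes the least-squares mismatch between the response matrix applied to the trial spectrum and the measured pulse-height distribution. The toy response matrix, the bound handling and the PSO constants are illustrative, not the SDPSO settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy response matrix (channels x energy bins) and a synthetic measurement.
    n_ch, n_e = 12, 6
    R = rng.random((n_ch, n_e))
    phi_true = rng.random(n_e)
    m = R @ phi_true

    def cost(phi):
        return np.sum((R @ phi - m) ** 2)

    # Standard global-best PSO with reflection at the non-negativity bound.
    n_particles, n_iter = 40, 300
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration constants
    pos = rng.random((n_particles, n_e))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.abs(pos + vel)                      # keep fluxes non-negative
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()

    print("residual:", cost(gbest), " max abs error:", np.max(np.abs(gbest - phi_true)))
    ```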

  10. BOA, Beam Optics Analyzer A Particle-In-Cell Code

    Energy Technology Data Exchange (ETDEWEB)

    Thuc Bui

    2007-12-06

    The program was tasked with implementing time-dependent analysis of charged particles in an existing finite element code with adaptive meshing, called Beam Optics Analyzer (BOA). BOA was initially funded by a DOE Phase II program to use the finite element method with adaptive meshing to track particles in unstructured meshes. It uses modern programming techniques and state-of-the-art data structures, so that new methods, features and capabilities are easily added and maintained. This Phase II program was funded to implement plasma simulations in BOA and extend its capabilities to model thermal electrons, secondary emissions and the self magnetic field, and to implement a more comprehensive post-processing capability and feature-rich GUI. The program was successful in implementing thermal electrons, secondary emissions, and self magnetic field calculations. The BOA GUI was also upgraded significantly, and CCR is receiving interest from the microwave tube and semiconductor equipment industry for the code. Implementation of PIC analysis was partially successful. Computational resource requirements for modeling more than 2000 particles begin to exceed the capability of most readily available computers. Modern plasma analysis typically requires modeling of approximately 2 million particles or more. The problem is that tracking many particles in an unstructured mesh that is adapting becomes inefficient. In particular, memory requirements become excessive. This probably makes particle tracking in unstructured meshes currently unfeasible with commonly available computer resources. Consequently, Calabazas Creek Research, Inc. is exploring hybrid codes where the electromagnetic fields are solved on the unstructured, adaptive mesh while particles are tracked on a fixed mesh. Efficient interpolation routines should be able to transfer information between nodes of the two meshes. If successfully developed, this could provide high accuracy and reasonable computational efficiency.

  11. A Mutation Model from First Principles of the Genetic Code.

    Science.gov (United States)

    Thorvaldsen, Steinar

    2016-01-01

    The paper presents a neutral Codons Probability Mutations (CPM) model of molecular evolution and genetic decay of an organism. The CPM model uses a Markov process with a 20-dimensional state space of probability distributions over amino acids. The transition matrix of the Markov process includes the mutation rate and those single point mutations compatible with the genetic code. This is an alternative to the standard Point Accepted Mutation (PAM) and BLOcks of amino acid SUbstitution Matrix (BLOSUM). Genetic decay is quantified as a similarity between the amino acid distribution of proteins from a (group of) species on one hand, and the equilibrium distribution of the Markov chain on the other. Amino acid data for the eukaryote, bacterium, and archaea families are used to illustrate how both the CPM and PAM models predict their genetic decay towards the equilibrium value of 1. A family of bacteria is studied in more detail. It is found that warm environment organisms on average have a higher degree of genetic decay compared to those species that live in cold environments. The paper addresses a new codon-based approach to quantify genetic decay due to single point mutations compatible with the genetic code. The present work may be seen as a first approach to use codon-based Markov models to study how genetic entropy increases with time in an effectively neutral biological regime. Various extensions of the model are also discussed.
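
    The core bookkeeping behind "single point mutations compatible with the genetic code" can be sketched by enumerating all single-nucleotide changes of every codon in the standard table and counting the resulting amino-acid substitutions; the uniform mutation rates and simple row normalization below are illustrative simplifications, not the CPM model's parameterization.

    ```python
    from itertools import product
    import numpy as np

    BASES = "UCAG"
    # Standard genetic code; codon order: each of the three positions runs over U, C, A, G.
    AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    CODON_TABLE = {"".join(c): AA[i] for i, c in enumerate(product(BASES, repeat=3))}

    AMINO_ACIDS = sorted(set(AA) - {"*"})           # the 20 amino acids, stops excluded
    index = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

    # Count amino-acid substitutions reachable by a single point mutation of a codon.
    counts = np.zeros((20, 20))
    for codon, aa in CODON_TABLE.items():
        if aa == "*":
            continue
        for pos in range(3):
            for b in BASES:
                if b == codon[pos]:
                    continue
                mutant_aa = CODON_TABLE[codon[:pos] + b + codon[pos + 1:]]
                if mutant_aa != "*":
                    counts[index[aa], index[mutant_aa]] += 1

    # Row-normalize to get a crude single-step substitution matrix (uniform mutation rates).
    P = counts / counts.sum(axis=1, keepdims=True)
    print(AMINO_ACIDS)
    print(np.round(P[index["M"]], 2))               # substitutions reachable from Met
    ```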

  12. Temporal perceptual coding using a visual acuity model

    Science.gov (United States)

    Adzic, Velibor; Cohen, Robert A.; Vetro, Anthony

    2014-02-01

    This paper describes research and results in which a visual acuity (VA) model of the human visual system (HVS) is used to reduce the bitrate of coded video sequences, by eliminating the need to signal transform coefficients when their corresponding frequencies will not be detected by the HVS. The VA model is integrated into the state-of-the-art HEVC HM codec. Compared to the unmodified codec, up to 45% bitrate savings are achieved while maintaining the same subjective quality of the video sequences. Encoding times are reduced as well.

  13. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...... development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech......; alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market’s emptying out of possibilities for free...

  14. FARGO3D: A NEW GPU-ORIENTED MHD CODE

    Energy Technology Data Exchange (ETDEWEB)

    Benitez-Llambay, Pablo [Instituto de Astronomía Teórica y Experimental, Observatorio Astronónomico, Universidad Nacional de Córdoba. Laprida 854, X5000BGR, Córdoba (Argentina); Masset, Frédéric S., E-mail: pbllambay@oac.unc.edu.ar, E-mail: masset@icf.unam.mx [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México (UNAM), Apdo. Postal 48-3,62251-Cuernavaca, Morelos (Mexico)

    2016-03-15

    We present the FARGO3D code, recently publicly released. It is a magnetohydrodynamics code developed with special emphasis on the physics of protoplanetary disks and planet–disk interactions, and parallelized with MPI. The hydrodynamics algorithms are based on finite-difference upwind, dimensionally split methods. The magnetohydrodynamics algorithms consist of the constrained transport method to preserve the divergence-free property of the magnetic field to machine accuracy, coupled to a method of characteristics for the evaluation of electromotive forces and Lorentz forces. Orbital advection is implemented, and an N-body solver is included to simulate planets or stars interacting with the gas. We present our implementation in detail and present a number of widely known tests for comparison purposes. One strength of FARGO3D is that it can run on either graphical processing units (GPUs) or central processing units (CPUs), achieving large speed-up with respect to CPU cores. We describe our implementation choices, which allow a user with no prior knowledge of GPU programming to develop new routines for CPUs, and have them translated automatically for GPUs.

  15. A Network Coding Based Routing Protocol for Underwater Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xin Guan

    2012-04-01

    Full Text Available Due to the particularities of the underwater environment, some negative factors will seriously interfere with data transmission rates, reliability of data communication, communication range, and the network throughput and energy consumption of underwater sensor networks (UWSNs). Thus, full consideration of node energy savings, while maintaining quick, correct and effective data transmission and extending the network life cycle, is essential when routing protocols for underwater sensor networks are studied. In this paper, we have proposed a novel routing algorithm for UWSNs. To increase energy consumption efficiency and extend network lifetime, we propose a time-slot based routing algorithm (TSR). We designed a probability balanced mechanism and applied it to TSR. The theory of network coding is introduced to TSBR to meet the requirement of further reducing node energy consumption and extending network lifetime. Hence, time-slot based balanced network coding (TSBNC) comes into being. We evaluated the proposed time-slot based balancing routing algorithm and compared it with other classical underwater routing protocols. The simulation results show that the proposed protocol can reduce the probability of node conflicts, shorten the process of routing construction, balance the energy consumption of each node and effectively prolong the network lifetime.

  16. A network coding based routing protocol for underwater sensor networks.

    Science.gov (United States)

    Wu, Huayang; Chen, Min; Guan, Xin

    2012-01-01

    Due to the particularities of the underwater environment, some negative factors will seriously interfere with data transmission rates, reliability of data communication, communication range, and the network throughput and energy consumption of underwater sensor networks (UWSNs). Thus, full consideration of node energy savings, while maintaining quick, correct and effective data transmission and extending the network life cycle, is essential when routing protocols for underwater sensor networks are studied. In this paper, we have proposed a novel routing algorithm for UWSNs. To increase energy consumption efficiency and extend network lifetime, we propose a time-slot based routing algorithm (TSR). We designed a probability balanced mechanism and applied it to TSR. The theory of network coding is introduced to TSBR to meet the requirement of further reducing node energy consumption and extending network lifetime. Hence, time-slot based balanced network coding (TSBNC) comes into being. We evaluated the proposed time-slot based balancing routing algorithm and compared it with other classical underwater routing protocols. The simulation results show that the proposed protocol can reduce the probability of node conflicts, shorten the process of routing construction, balance the energy consumption of each node and effectively prolong the network lifetime.

  17. Analyzing a School Dress Code in a Junior High School: A Set of Exercises.

    Science.gov (United States)

    East, Maurice A.; And Others

    Five exercises based on a sample school dress code were designed from a political science perspective to help students develop skills in analyzing issues. The exercises are intended to be used in five or more class periods. In the first exercise, students read a sample dress code and name groups of people who might have opinions about it. In…

  18. Development of a safety analysis code for molten salt reactors

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Dalin [State Key Laboratory of Multiphase Flow in Power Engineering, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China); School of Nuclear Science and Technology, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China); Qiu Suizheng, E-mail: szqiu@mail.xjtu.edu.c [State Key Laboratory of Multiphase Flow in Power Engineering, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China); School of Nuclear Science and Technology, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China); Su Guanghui [State Key Laboratory of Multiphase Flow in Power Engineering, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China); School of Nuclear Science and Technology, Xi' an Jiaotong University, 28 West Road Xian Ning Street, Xi' an 710049 (China)

    2009-12-15

    The molten salt reactor (MSR), well suited to fulfill the criteria defined by the Generation IV International Forum (GIF), is presently being revisited around the world because of several attractive features of renewed relevance. MSRs are characterized by the use of fluid fuel, so that their technologies are fundamentally different from those used in conventional solid-fuel reactors. In this work the attention is focused on the safety characteristics of MSRs: a point kinetics model accounting for the flow effects of the fuel salt is established and solved with a newly developed computer code coupled to a simplified model of heat transfer in the core. The established models and developed code are applied to analyze the safety characteristics of the molten salt actinide recycler and transmuter system (MOSART) by simulating three types of basic transient conditions: unprotected loss of flow, unprotected overcooling, and unprotected transient overpower. The results obtained for MOSART show that the MOSART conceptual design is an inherently stable reactor design. The present study provides valuable information for the research and design of new-generation MSRs.
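
    A hedged sketch of the simplest version of such a model: standard six-group point kinetics with an extra -C/tau_c loss term on each delayed-neutron precursor group to mimic precursors being swept out of the core by the flowing fuel salt (re-entry from the external loop is neglected). All kinetic data, the transit time and the reactivity step are generic illustrative values, not MOSART parameters.

    ```python
    import numpy as np

    # Generic six-group delayed-neutron data (illustrative, not MOSART values).
    beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
    lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])    # decay constants [1/s]
    beta = beta_i.sum()
    Lam = 1.0e-4                    # prompt neutron generation time [s] (assumed)
    tau_c = 4.0                     # core transit time of the fuel salt [s] (assumed)
    rho = 0.005                     # step reactivity insertion (assumed, below prompt critical)

    def derivs(y):
        n, C = y[0], y[1:]
        dn = (rho - beta) / Lam * n + np.sum(lam_i * C)
        # Precursor balance with an extra -C/tau_c loss from fuel flow (re-entry neglected).
        dC = beta_i / Lam * n - lam_i * C - C / tau_c
        return np.concatenate(([dn], dC))

    # Steady precursor concentrations for n = 1 in the flowing fuel (the compensating
    # reactivity needed for steady operation is ignored in this toy), then explicit RK4.
    n0 = 1.0
    C0 = beta_i / Lam * n0 / (lam_i + 1.0 / tau_c)
    y = np.concatenate(([n0], C0))
    dt, t_end = 1.0e-4, 2.0
    for _ in range(int(t_end / dt)):
        k1 = derivs(y); k2 = derivs(y + 0.5 * dt * k1)
        k3 = derivs(y + 0.5 * dt * k2); k4 = derivs(y + dt * k3)
        y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    print("relative power after 2 s:", y[0])
    ```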

  19. Easy as Pi: A Network Coding Raspberry Pi Testbed

    Directory of Open Access Journals (Sweden)

    Chres W. Sørensen

    2016-10-01

    Full Text Available In the near future, upcoming communications and storage networks are expected to tolerate major difficulties produced by the huge amounts of data being generated by the Internet of Things (IoT). For these types of networks, strategies and mechanisms based on network coding have appeared as an alternative to overcome these difficulties in a holistic manner, e.g., without sacrificing the benefit of a given network metric when improving another. There have been recurrent issues with (i) making large-scale deployments akin to the Internet of Things, (ii) assessing and (iii) replicating the results obtained in preliminary studies. Therefore, testbeds that can deal with large-scale deployments and do not lose historic data are greatly needed and desirable for evaluating these mechanisms from a research perspective. However, such testbeds can be hard to manage, not only due to the inherent costs of the hardware, but also due to maintenance challenges. In this paper, we present the required key steps to design, set up and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network coding capabilities. This testbed can be utilized for any application requiring replicability of results.

  20. Does the health of individuals have a mathematical code?

    Directory of Open Access Journals (Sweden)

    Ali Mehrabi Tavana

    2013-01-01

    Full Text Available The definition of the health of individuals is well described by the World Health Organization (WHO) and other international health organizations. Many studies have also been carried out to survey health conditions in different countries based on this definition; the health condition of every country is therefore analyzed by the WHO. In this hypothesis, I would like to ask whether the health of individuals has a mathematical code or not. If so, a discovery is within reach that would allow each individual, and likewise every nation in the world, to be examined on the basis of a health profile, in order to find out what must be done at the individual, national, and international level to increase the health rank. The aim of this hypothesis is to bring to the attention of the WHO directors and specialists the question of whether the health of individuals has a mathematical code or not. If so, a new view must be considered with regard to the health of the world population, which is discussed in this hypothesis.

  1. A symbiotic liaison between the genetic and epigenetic code

    Directory of Open Access Journals (Sweden)

    Holger eHeyn

    2014-05-01

    Full Text Available With rapid advances in sequencing technologies, we are undergoing a paradigm shift from hypothesis-driven to data-driven research. Genome-wide profiling efforts have given informative insights into biological processes; however, considering the wealth of variation, the major challenge remains its meaningful interpretation. In particular, sequence variation in non-coding contexts is often challenging to interpret. Here, data integration approaches for the identification of functional genetic variability represent a likely solution. For example, functional linkage analysis integrating genotype and expression data has determined regulatory quantitative trait loci (QTLs) and proposed causal relationships. In addition to gene expression, epigenetic regulation, and specifically DNA methylation, has been established as a highly valuable surrogate marker for functional variance of the genetic code. Epigenetic modification has served as a powerful mediator trait to elucidate mechanisms forming phenotypes in health and disease. In particular, integrative studies of genetic and DNA methylation data have not only guided interpretation strategies for risk genotypes, but also proved their value for physiological traits, such as natural human variation and aging. This Perspective seeks to illustrate the power of data integration in the genomic era, exemplified by DNA methylation quantitative trait loci (meQTLs). However, the model is further extendable to virtually all traceable molecular traits.

  2. 17 CFR 275.204A-1 - Investment adviser codes of ethics.

    Science.gov (United States)

    2010-04-01

    ... ethics. 275.204A-1 Section 275.204A-1 Commodity and Securities Exchanges SECURITIES AND EXCHANGE... codes of ethics. (a) Adoption of code of ethics. If you are an investment adviser registered or required... enforce a written code of ethics that, at a minimum, includes: (1) A standard (or standards) of business...

  3. Is a genome a codeword of an error-correcting code?

    Directory of Open Access Journals (Sweden)

    Luzinete C B Faria

    Full Text Available Since a genome is a discrete sequence, the elements of which belong to a set of four letters, the question as to whether or not there is an error-correcting code underlying DNA sequences is unavoidable. The most common approach to answering this question is to propose a methodology to verify the existence of such a code. However, none of the methodologies proposed so far, although quite clever, has achieved that goal. In a recent work, we showed that DNA sequences can be identified as codewords in a class of cyclic error-correcting codes known as Hamming codes. In this paper, we show that a complete intron-exon gene, and even a plasmid genome, can be identified as a Hamming code codeword as well. Although this does not constitute a definitive proof that there is an error-correcting code underlying DNA sequences, it is the first evidence in this direction.
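
    The sketch below illustrates the flavour of such a test: a DNA string is mapped to bits and each block is checked for a zero syndrome against a binary Hamming (7,4) parity-check matrix. The base-to-bit labeling and the code parameters are assumptions for demonstration only; the paper uses its own mapping and code lengths.

      # Illustrative sketch of the kind of test the paper describes: map a DNA string to bits
      # and check whether blocks are codewords of a binary Hamming code (zero syndrome).
      # The base-to-bit labeling and the (7,4) code are assumed here for demonstration.
      import numpy as np

      BASE_BITS = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}  # assumed labeling

      # Parity-check matrix of the binary Hamming (7,4) code: columns are 1..7 in binary.
      H = np.array([[int(b) for b in format(col, "03b")] for col in range(1, 8)]).T

      def dna_to_bits(seq):
          return [bit for base in seq for bit in BASE_BITS[base]]

      def is_hamming_codeword(bits7):
          syndrome = H.dot(np.array(bits7)) % 2
          return not syndrome.any()

      seq = "ACGTACGTACGTAC"          # toy sequence: 14 bases -> 28 bits -> four 7-bit blocks
      bits = dna_to_bits(seq)
      blocks = [bits[i:i + 7] for i in range(0, len(bits) - 6, 7)]
      print([is_hamming_codeword(b) for b in blocks])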

  4. Is a genome a codeword of an error-correcting code?

    Science.gov (United States)

    Faria, Luzinete C B; Rocha, Andréa S L; Kleinschmidt, João H; Silva-Filho, Márcio C; Bim, Edson; Herai, Roberto H; Yamagishi, Michel E B; Palazzo, Reginaldo

    2012-01-01

    Since a genome is a discrete sequence, the elements of which belong to a set of four letters, the question as to whether or not there is an error-correcting code underlying DNA sequences is unavoidable. The most common approach to answering this question is to propose a methodology to verify the existence of such a code. However, none of the methodologies proposed so far, although quite clever, has achieved that goal. In a recent work, we showed that DNA sequences can be identified as codewords in a class of cyclic error-correcting codes known as Hamming codes. In this paper, we show that a complete intron-exon gene, and even a plasmid genome, can be identified as a Hamming code codeword as well. Although this does not constitute a definitive proof that there is an error-correcting code underlying DNA sequences, it is the first evidence in this direction.

  5. A unified form of exact-MSR codes via product-matrix frameworks

    KAUST Repository

    Lin, Sian Jheng

    2015-02-01

    Regenerating codes are a class of block codes applicable to distributed storage systems. An [n, k, d] regenerating code can recover the stored data from any k out of n code fragments and can regenerate a code fragment from any other d fragments, for k ≤ d ≤ n - 1. Minimum storage regenerating (MSR) codes are the subset of regenerating codes with the minimal size of each code fragment. The first explicit construction of MSR codes that can perform exact regeneration (named exact-MSR codes) for d ≥ 2k - 2 was presented via a product-matrix framework. This paper addresses some of the practical issues in the construction of exact-MSR codes. The major contributions of this paper are as follows. A new product-matrix framework is proposed to directly include all feasible exact-MSR codes for d ≥ 2k - 2. A mechanism for a systematic version of exact-MSR codes is proposed to minimize the computational complexity of the message-symbol remapping process. Two practical forms of encoding matrices are presented to reduce the size of the finite field.

  6. Reasoning with Computer Code: a new Mathematical Logic

    Science.gov (United States)

    Pissanetzky, Sergio

    2013-01-01

    A logic is a mathematical model of knowledge used to study how we reason, how we describe the world, and how we infer the conclusions that determine our behavior. The logic presented here is natural. It has been experimentally observed, not designed. It represents knowledge as a causal set, includes a new type of inference based on the minimization of an action functional, and generates its own semantics, making it unnecessary to prescribe one. This logic is suitable for high-level reasoning with computer code, including tasks such as self-programming, object-oriented analysis, refactoring, systems integration, code reuse, and automated programming from sensor-acquired data. A strong theoretical foundation exists for the new logic. The inference derives laws of conservation from the permutation symmetry of the causal set, and calculates the corresponding conserved quantities. The association between symmetries and conservation laws is a fundamental and well-known law of nature and a general principle in modern theoretical Physics. The conserved quantities take the form of a nested hierarchy of invariant partitions of the given set. The logic associates elements of the set and binds them together to form the levels of the hierarchy. It is conjectured that the hierarchy corresponds to the invariant representations that the brain is known to generate. The hierarchies also represent fully object-oriented, self-generated code, that can be directly compiled and executed (when a compiler becomes available), or translated to a suitable programming language. The approach is constructivist because all entities are constructed bottom-up, with the fundamental principles of nature being at the bottom, and their existence is proved by construction. The new logic is mathematically introduced and later discussed in the context of transformations of algorithms and computer programs. We discuss what a full self-programming capability would really mean. We argue that self

  7. New upper bounds on the rate of a code via the Delsarte-MacWilliams inequalities

    Science.gov (United States)

    Mceliece, R. J.; Rodemich, E. R.; Rumsey, H., Jr.; Welch, L. R.

    1977-01-01

    An upper bound on the rate of a binary code as a function of minimum code distance (using a Hamming code metric) is arrived at from Delsarte-MacWilliams inequalities. The upper bound so found is asymptotically less than Levenshtein's bound, and a fortiori less than Elias' bound. Appendices review properties of Krawtchouk polynomials and Q-polynomials utilized in the rigorous proofs.
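
    The Krawtchouk polynomials reviewed in the appendices can be computed either from their explicit sum or from the standard three-term recurrence; a minimal sketch for the binary case (q = 2) follows, with the parameter values chosen purely for illustration.

      # Krawtchouk polynomials K_k(x; n) appear in the Delsarte linear program for binary codes.
      # A minimal sketch of their computation via the explicit sum and the three-term recurrence.
      from math import comb

      def krawtchouk_sum(k, x, n):
          """K_k(x; n) = sum_j (-1)^j C(x, j) C(n - x, k - j)."""
          return sum((-1) ** j * comb(x, j) * comb(n - x, k - j) for j in range(k + 1))

      def krawtchouk_table(n):
          """All K_k(x; n) for 0 <= k, x <= n via (k+1)K_{k+1} = (n-2x)K_k - (n-k+1)K_{k-1}."""
          K = [[1] * (n + 1)]                            # K_0(x) = 1
          K.append([n - 2 * x for x in range(n + 1)])    # K_1(x) = n - 2x
          for k in range(1, n):
              K.append([((n - 2 * x) * K[k][x] - (n - k + 1) * K[k - 1][x]) // (k + 1)
                        for x in range(n + 1)])
          return K

      n = 7
      table = krawtchouk_table(n)
      assert all(table[k][x] == krawtchouk_sum(k, x, n) for k in range(n + 1) for x in range(n + 1))
      print(table[3])   # K_3(x; 7) for x = 0..7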

  8. Dependability Aspects Regarding the Cache Level of a Memory Hierarchy using Hamming Codes

    Science.gov (United States)

    Novac, O.; Vari-Kakas, St.; Novac, Mihaela; Vladu, Ecaterina; Indrie, Liliana

    In this paper we apply a SEC-DED code to the cache level of a memory hierarchy. From the category of SEC-DED (single error correction, double error detection) codes we select the Hamming code. For the correction of single-bit errors we use a syndrome decoder, a syndrome generator and a check-bit generator circuit.
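
    A toy version of the SEC-DED arrangement described above is sketched below: an extended Hamming (8,4) code with a syndrome decoder and an overall parity bit. Real cache protection uses wider codes such as (72,64); the matrices and the injected fault here are illustrative assumptions.

      # Toy sketch of the SEC-DED principle on a cached word: extended Hamming (8,4) with a
      # syndrome decoder. The generator/check matrices below are illustrative assumptions.
      import numpy as np

      G = np.array([[1, 0, 0, 0, 0, 1, 1],     # generator of Hamming (7,4), systematic form
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 1, 1, 0],
                    [0, 0, 0, 1, 1, 1, 1]])
      H = np.array([[0, 1, 1, 1, 1, 0, 0],     # corresponding parity-check matrix
                    [1, 0, 1, 1, 0, 1, 0],
                    [1, 1, 0, 1, 0, 0, 1]])

      def encode(data4):
          cw = np.array(data4).dot(G) % 2
          return np.append(cw, cw.sum() % 2)            # overall parity bit -> SEC-DED

      def decode(word8):
          cw, overall = word8[:7], word8[7]
          syndrome = H.dot(cw) % 2
          parity_ok = (cw.sum() + overall) % 2 == 0
          if not syndrome.any():
              return ("ok" if parity_ok else "parity-bit error"), cw[:4]
          if parity_ok:
              return "double error detected (uncorrectable)", None
          # single error: the syndrome matches one column of H, so flip that bit
          pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
          cw = cw.copy()
          cw[pos] ^= 1
          return "single error corrected", cw[:4]

      word = encode([1, 0, 1, 1])
      word[2] ^= 1                                      # inject a single-bit fault
      print(decode(word))                               # -> ('single error corrected', array([1, 0, 1, 1]))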

  9. Chirality in a quaternionic representation of the genetic code.

    Science.gov (United States)

    Manuel Carlevaro, C; Irastorza, Ramiro M; Vericat, Fernando

    2016-12-01

    A quaternionic representation of the genetic code, previously reported by the authors (BioSystems 141 (10-19), 2016), is updated in order to incorporate chirality of nucleotide bases and amino acids. The original representation associates with each nucleotide base a prime integer quaternion of norm 7 and involves a function that assigns to each codon, represented by three of these quaternions, another integer quaternion (amino acid type quaternion). The assignation is such that the essentials of the standard genetic code (particularly its degeneration) are preserved. To show the advantages of such a quaternionic representation we have designed an algorithm to go from the primary to the tertiary structure of the protein. The algorithm uses, besides of the type quaternions, a second kind of quaternions with real components that we additionally associate with the amino acids according to their order along the proteins (order quaternions). In this context, we incorporate chirality in our representation by observing that the set of eight integer quaternions of norm 7 can be partitioned into a pair of subsets of cardinality four each with their elements mutually conjugate and by putting them into correspondence one to one with the two sets of enantiomers (D and L) of the four nucleotide bases adenine, cytosine, guanine and uracil, respectively. We then propose two diagrams in order to describe the hypothetical evolution of the genetic codes corresponding to both of the chiral systems of affinities: D-nucleotide bases/L-amino acids and L-nucleotide bases/D-amino acids at reading frames 5'→3' and 3'→5', respectively. Guided by these diagrams we define functions that in each case assign to the triplets of D- (L-) bases a L- (D-) amino acid type integer quaternion. Specifically, the integer quaternion associated with a given D-amino acid is the conjugate of that one corresponding to the enantiomer L. The chiral type quaternions obtained for the amino acids are used

  10. A novel construction method of QC-LDPC codes based on CRT for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4851, 4546) code is respectively 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32640, 30592) code in ITU-T G.975.1, the QC-LDPC(3664, 3436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3843, 3603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4851, 4546) code constructed by the proposed method has excellent error-correction performance and is well suited to optical transmission systems.
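
    The sketch below shows only the generic lifting step shared by QC-LDPC constructions, in which a base matrix of shift exponents is expanded into circulant permutation blocks. The exponents and block size are arbitrary placeholders; the paper's contribution lies in choosing them via the CRT so that the girth is preserved while the code length and rate grow.

      # Lifting step common to QC-LDPC constructions: a small base matrix of shift exponents
      # is expanded into z x z circulant permutation submatrices. Values are illustrative only.
      import numpy as np

      def circulant(shift, z):
          """z x z identity matrix cyclically shifted by 'shift' columns."""
          return np.roll(np.eye(z, dtype=int), shift, axis=1)

      def expand(base, z):
          """Replace each exponent e >= 0 by a shifted identity and each -1 by the zero block."""
          blocks = [[circulant(e, z) if e >= 0 else np.zeros((z, z), dtype=int) for e in row]
                    for row in base]
          return np.block(blocks)

      base = [[0, 1, 3, -1],      # assumed exponent (base) matrix; -1 marks an all-zero block
              [2, -1, 0, 5],
              [-1, 4, 1, 0]]
      H = expand(base, z=7)
      print(H.shape)              # (21, 28): a toy rate ~1/4 parity-check matrix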

  11. A code for transcription initiation in mammalian genomes

    DEFF Research Database (Denmark)

    Frith, Martin C.; Valen, Eivind Dale; Krogh, Anders

    2007-01-01

    Genome-wide detection of transcription start sites (TSSs) has revealed that RNA Polymerase II transcription initiates at millions of positions in mammalian genomes. Most core promoters do not have a single TSS, but an array of closely located TSSs with different rates of initiation. As a rule...... that initiation events are clustered on the chromosomes at multiple scales - clusters within clusters - indicating multiple regulatory processes. Within the smallest of such clusters, which can be interpreted as core promoters, the local DNA sequence predicts the relative transcription start usage of each...... nucleotide with a remarkable 91% accuracy, implying the existence of a DNA code that determines TSS selection. Conversely, the total expression strength of such clusters is only partially determined by the local DNA sequence. Thus, the overall control of transcription can be understood as a combination...

  12. Towards provably correct code generation for a hard real-time programming language

    DEFF Research Database (Denmark)

    Fränzle, Martin; Müller-Olm, Markus

    1994-01-01

    This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation correctness used as an implicit specification of the code generator pays attention to timeliness requirements. Hence, formal verification of the code generator design is a guarantee of meeting all deadlines when executing generated code.

  13. A comprehensive study of sparse codes on abnormality detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2017-01-01

    out from various angles to better understand the applicability of sparse codes, including computation time, reconstruction error, sparsity, detection accuracy, and their performance combining various detection methods. Experiments show that combining OMP codes with maximum coordinate detection could achieve state-of-the-art performance on the UCSD dataset.

  14. Validation of Monte Carlo Geant4 code for a

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2017-01-01

    Full Text Available This study aims at validating the Monte Carlo Geant4.9.4 code for a 6 MV Varian linac configured with a 10 × 10 cm2 radiation field. For this purpose a user-friendly Geant4 code called G4Linac has been developed from scratch, allowing an accurate modeling of a 6 MV Varian linac head and performing dose calculation in a homogeneous water phantom. Discarding the other accelerator parts where electrons are created, accelerated and deviated, a virtual source of 6 MeV electrons was considered. The parameters associated with this virtual source are often unknown. These parameters, namely the mean energy, the sigma and the full width at half maximum, have been adjusted by following our own methodology, developed so that the optimization phase is fast and efficient; in practice, a small number of Monte Carlo simulations were run simultaneously on a cluster of computers thanks to the Rocks cluster software. The dosimetric functions calculated in a 40 × 40 × 40 cm3 water phantom were compared to the measured ones using the gamma index method, with the gamma criterion fixed at 2%/1 mm. After optimization, the proper mean energy, sigma and full width at half maximum were found to be 5.6 MeV, 0.42 MeV and 1.177 mm, respectively. Furthermore, we made some changes to an existing bremsstrahlung splitting technique, which reduced the CPU time spent on the treatment head simulation by about a factor of five.
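
    A minimal one-dimensional version of the gamma-index comparison (2%/1 mm) used in the study is sketched below on synthetic dose curves; a real evaluation would use the measured and Geant4-calculated depth-dose and profile data.

      # 1D sketch of the gamma-index test (2%/1 mm) comparing calculated and measured dose
      # curves. The dose values below are synthetic placeholders.
      import numpy as np

      def gamma_index(x_ref, d_ref, x_eval, d_eval, dose_crit=0.02, dist_crit=1.0):
          """Per-point gamma of a reference curve against an evaluated curve.

          dose_crit is a fraction of the reference maximum (global normalization, assumed);
          dist_crit is the distance-to-agreement in the same units as x (mm here)."""
          dd_norm = dose_crit * d_ref.max()
          gammas = []
          for xr, dr in zip(x_ref, d_ref):
              candidates = np.sqrt(((x_eval - xr) / dist_crit) ** 2 + ((d_eval - dr) / dd_norm) ** 2)
              gammas.append(candidates.min())
          return np.array(gammas)

      depth = np.linspace(0.0, 100.0, 201)                        # mm
      measured = np.exp(-depth / 120.0) * (1 - np.exp(-depth / 8.0))
      calculated = measured * (1 + 0.01 * np.sin(depth / 10.0))   # synthetic 1% ripple

      g = gamma_index(depth, measured, depth, calculated)
      print("gamma pass rate: %.1f%%" % (100.0 * np.mean(g <= 1.0)))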

  15. A portable virtual machine target for proof-carrying code

    DEFF Research Database (Denmark)

    Franz, Michael; Chandra, Deepak; Gal, Andreas

    2005-01-01

    Virtual Machines (VMs) and Proof-Carrying Code (PCC) are two techniques that have been used independently to provide safety for (mobile) code. Existing virtual machines, such as the Java VM, have several drawbacks: First, the effort required for safety verification is considerable. Second and more...... subtly, the need to provide such verification by the code consumer inhibits the amount of optimization that can be performed by the code producer. This in turn makes just-in-time compilation surprisingly expensive. Proof-Carrying Code, on the other hand, has its own set of limitations, among which...... simultaneously providing efficient just-in-time compilation and target-machine independence. In particular, our approach reduces the complexity of the required proofs, resulting in fewer proof obligations that need to be discharged at the target machine....

  16. Development of Teaching Materials for a Physical Chemistry Experiment Using the QR Code

    OpenAIRE

    吉村, 忠与志

    2008-01-01

    The development of teaching materials with the QR code was attempted in an educational environment using a mobile telephone. The QR code is not sufficiently utilized in education, and the current study is one of the first in the field. The QR code is encrypted. However, the QR code can be deciphered by mobile telephones, thus enabling the expression of text in a small space. Contents of "Physical Chemistry Experiment" which are available on the Internet are briefly summarized and simplified. T...

  17. FASOR - A second generation shell of revolution code

    Science.gov (United States)

    Cohen, G. A.

    1978-01-01

    An integrated computer program entitled Field Analysis of Shells of Revolution (FASOR) currently under development for NASA is described. When completed, this code will treat prebuckling, buckling, initial postbuckling and vibrations under axisymmetric static loads as well as linear response and bifurcation under asymmetric static loads. Although these modes of response are treated by existing programs, FASOR extends the class of problems treated to include general anisotropy and transverse shear deformations of stiffened laminated shells. At the same time, a primary goal is to develop a program which is free of the usual problems of modeling, numerical convergence and ill-conditioning, laborious problem setup, limitations on problem size and interpretation of output. The field method is briefly described, the shell differential equations are cast in a suitable form for solution by this method and essential aspects of the input format are presented. Numerical results are given for both unstiffened and stiffened anisotropic cylindrical shells and compared with previously published analytical solutions.

  18. Visualization of elastic wavefields computed with a finite difference code

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S. [Lawrence Livermore National Lab., CA (United States); Harris, D.

    1994-11-15

    The authors have developed a finite difference elastic propagation model to simulate seismic wave propagation through geophysically complex regions. To facilitate debugging and to assist seismologists in interpreting the seismograms generated by the code, they have developed an X Windows interface that permits viewing of successive temporal snapshots of the (2D) wavefield as they are calculated. The authors present a brief video displaying the generation of seismic waves by an explosive source on a continent, which propagate to the edge of the continent then convert to two types of acoustic waves. This sample calculation was part of an effort to study the potential of offshore hydroacoustic systems to monitor seismic events occurring onshore.

  19. Scalable Quantum Circuit and Control for a Superconducting Surface Code

    Science.gov (United States)

    Versluis, R.; Poletto, S.; Khammassi, N.; Tarasinski, B.; Haider, N.; Michalak, D. J.; Bruno, A.; Bertels, K.; DiCarlo, L.

    2017-09-01

    We present a scalable scheme for executing the error-correction cycle of a monolithic surface-code fabric composed of fast-flux-tunable transmon qubits with nearest-neighbor coupling. An eight-qubit unit cell forms the basis for repeating both the quantum hardware and coherent control, enabling spatial multiplexing. This control uses three fixed frequencies for all single-qubit gates and a unique frequency-detuning pattern for each qubit in the cell. By pipelining the interaction and readout steps of ancilla-based X- and Z-type stabilizer measurements, we can engineer detuning patterns that avoid all second-order transmon-transmon interactions except those exploited in controlled-phase gates, regardless of fabric size. Our scheme is applicable to defect-based and planar logical qubits, including lattice surgery.

  20. DAGON: a 3D Maxwell-Bloch code

    Science.gov (United States)

    Oliva, Eduardo; Cotelo, Manuel; Escudero, Juan Carlos; González-Fernández, Agustín.; Sanchís, Alberto; Vera, Javier; Vicéns, Sergio; Velarde, Pedro

    2017-05-01

    The amplification of UV radiation and high-order harmonics (HOH) in plasmas is a subject of rising interest due to its potential applications in several fields, such as environment and security (detection at a distance), biology, materials science and industry (3D imaging), and atomic and plasma physics (pump-probe experiments). In order to develop these sources, it is necessary to properly understand the amplification process. Since the plasma is an inhomogeneous medium that changes with time, it is desirable to have a fully time-dependent 3D description of the interaction of UV and XUV radiation with plasmas. For these reasons, at the Instituto de Fusión Nuclear we have developed DAGON, a 3D Maxwell-Bloch code capable of studying the full spatiotemporal structure of the amplification process mentioned above.

  1. The Tubulin Code: A Navigation System for Chromosomes during Mitosis.

    Science.gov (United States)

    Barisic, Marin; Maiato, Helder

    2016-10-01

    Before chromosomes segregate during mitosis in metazoans, they align at the cell equator by a process known as chromosome congression. This is in part mediated by the coordinated activities of kinetochore motors with opposite directional preferences that transport peripheral chromosomes along distinct spindle microtubule populations. Because spindle microtubules are all made from the same α/β-tubulin heterodimers, a critical longstanding question has been how chromosomes are guided to specific locations during mitosis. This implies the existence of spatial cues/signals on specific spindle microtubules that are read by kinetochore motors on chromosomes and ultimately indicate the way towards the equator. Here, we discuss the emerging concept that tubulin post-translational modifications (PTMs), as part of the so-called tubulin code, work as a navigation system for kinetochore-based chromosome motility during early mitosis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A MATLAB based 3D modeling and inversion code for MT data

    Science.gov (United States)

    Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.

    2017-07-01

    The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.

  3. 25 CFR 12.51 - Must Indian country law enforcement officers follow a code of conduct?

    Science.gov (United States)

    2010-04-01

    ... code of conduct? 12.51 Section 12.51 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW... follow a code of conduct? All law enforcement programs receiving Bureau of Indian Affairs funding or commissioning must establish a law enforcement code of conduct which establishes specific guidelines for conduct...

  4. Inclusive innovation; a research project to assess the implementation of codes of conduct

    NARCIS (Netherlands)

    Nijhof, A.H.J.; Fisscher, O.A.M.; Laan, Albertus

    2002-01-01

    More and more organizations formulate a code of conduct to stimulate responsible action of people within the organization. Usually much time and energy is spent fixing the content of the code. Then there is the challenge of implementing and maintaining the code. This is a tricky process in which too

  5. An object-oriented scripting interface to a legacy electronic structure code

    DEFF Research Database (Denmark)

    Bahn, Sune Rastad; Jacobsen, Karsten Wedel

    2002-01-01

    The authors have created an object-oriented scripting interface to a mature density functional theory code. The interface gives users a high-level, flexible handle on the code without rewriting the underlying number-crunching code. The authors also discuss design issues and the advantages of homo...

  6. Upper bounds on the number of errors corrected by a convolutional code

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2004-01-01

    We derive upper bounds on the weights of error patterns that can be corrected by a convolutional code with given parameters, or equivalently we give bounds on the code rate for a given set of error patterns. The bounds parallel the Hamming bound for block codes by relating the number of error patterns to the number of distinct syndromes.
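
    The counting argument that parallels the Hamming bound can be illustrated as follows: over a window of N code bits of a rate k/n code, the number of correctable error patterns cannot exceed the number of distinct syndromes. The window lengths and rate in the sketch are assumptions for illustration; the paper derives sharper statements for specific code parameters.

      # Hamming-type counting sketch: over N code bits of a rate k/n code there are 2^(N - K)
      # syndromes, K = N*k/n, so the correctable weight t must satisfy sum_{i<=t} C(N,i) <= 2^(N-K).
      from math import comb

      def max_correctable_weight(N, k, n):
          """Largest t with sum_{i<=t} C(N, i) <= 2^(N - N*k/n); a necessary condition only."""
          K = N * k // n
          syndromes = 2 ** (N - K)
          total, t = 1, 0                    # the weight-0 pattern
          while t + 1 <= N and total + comb(N, t + 1) <= syndromes:
              t += 1
              total += comb(N, t)
          return t

      # A rate-1/2 code observed over growing windows: the correctable weight grows much more
      # slowly than the window, mirroring the bound on the fraction of correctable errors.
      for N in (8, 16, 32, 64, 128):
          print(N, max_correctable_weight(N, 1, 2))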

  7. LSENS, a general chemical kinetics and sensitivity analysis code for homogeneous gas-phase reactions. 2: Code description and usage

    Science.gov (United States)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 2 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 2 describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part 1 (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part 3 (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
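
    The toy problem below gives the flavour of what such a solver computes for a static system: integrate a homogeneous kinetics mechanism and obtain sensitivities of the species concentrations to a rate coefficient. The mechanism, rate constants and finite-difference sensitivities are illustrative assumptions; LSENS itself solves the sensitivity equations directly alongside the kinetics.

      # Toy static, isothermal kinetics chain A -> B -> C with a finite-difference sensitivity
      # to the rate coefficient k1. All values are assumed for illustration.
      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y, k1, k2):
          a, b, c = y
          return [-k1 * a, k1 * a - k2 * b, k2 * b]

      def concentrations(k1, k2, t_end=10.0):
          sol = solve_ivp(rhs, (0.0, t_end), [1.0, 0.0, 0.0], args=(k1, k2), rtol=1e-8)
          return sol.y[:, -1]

      k1, k2 = 1.0, 0.5
      base = concentrations(k1, k2)

      # Sensitivity of the final concentrations to k1 by central finite differences.
      eps = 1e-4
      sens_k1 = (concentrations(k1 + eps, k2) - concentrations(k1 - eps, k2)) / (2 * eps)
      print("final [A, B, C]:", base)
      print("d[A,B,C]/dk1  :", sens_k1)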

  8. Parallelization of a numerical simulation code for isotropic turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Shigeru; Yokokawa, Mitsuo; Watanabe, Tadashi; Kaburaki, Hideo

    1996-03-01

    A parallel pseudospectral code which solves the three-dimensional Navier-Stokes equation by direct numerical simulation is developed, and its execution time, parallelization efficiency, load balance and scalability are evaluated. A vector parallel supercomputer, the Fujitsu VPP500 with up to 16 processors, is used for this calculation for Fourier modes up to 256x256x256 using 16 processors. Good scalability with the number of processors is achieved when the number of Fourier modes is fixed. For small numbers of Fourier modes, the calculation time of the program is proportional to NlogN, which is the ideal computational complexity of a 3D FFT on vector parallel processors. It is found that the calculation performance decreases as the number of Fourier modes increases. (author).

  9. A Secure RFID Authentication Protocol Adopting Error Correction Code

    Directory of Open Access Journals (Sweden)

    Chien-Ming Chen

    2014-01-01

    Full Text Available RFID technology has become popular in many applications; however, most of the RFID products lack security related functionality due to the hardware limitation of the low-cost RFID tags. In this paper, we propose a lightweight mutual authentication protocol adopting error correction code for RFID. Besides, we also propose an advanced version of our protocol to provide key updating. Based on the secrecy of shared keys, the reader and the tag can establish a mutual authenticity relationship. Further analysis of the protocol showed that it also satisfies integrity, forward secrecy, anonymity, and untraceability. Compared with other lightweight protocols, the proposed protocol provides stronger resistance to tracing attacks, compromising attacks and replay attacks. We also compare our protocol with previous works in terms of performance.

  10. A secure RFID authentication protocol adopting error correction code.

    Science.gov (United States)

    Chen, Chien-Ming; Chen, Shuai-Min; Zheng, Xinying; Chen, Pei-Yu; Sun, Hung-Min

    2014-01-01

    RFID technology has become popular in many applications; however, most of the RFID products lack security related functionality due to the hardware limitation of the low-cost RFID tags. In this paper, we propose a lightweight mutual authentication protocol adopting error correction code for RFID. Besides, we also propose an advanced version of our protocol to provide key updating. Based on the secrecy of shared keys, the reader and the tag can establish a mutual authenticity relationship. Further analysis of the protocol showed that it also satisfies integrity, forward secrecy, anonymity, and untraceability. Compared with other lightweight protocols, the proposed protocol provides stronger resistance to tracing attacks, compromising attacks and replay attacks. We also compare our protocol with previous works in terms of performance.

  11. A Comparison of Source Code Plagiarism Detection Engines

    Science.gov (United States)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
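
    A stripped-down version of the paired structural metric described above is sketched below: both submissions are normalized into token streams and the longest common run of tokens is measured, so that renaming identifiers does not hide copying. The tiny lexer and keyword list are assumptions; engines such as JPlag and MOSS use language-aware tokenizers and greedy string tiling or fingerprinting instead.

      # Tokenize two source files, then measure their longest common substring of tokens.
      # The crude lexer below is an assumption for illustration, not a real engine's tokenizer.
      import re

      def tokenize(source):
          """Crude lexer: identifiers collapse to NAME, numbers to NUM, so renaming does not help."""
          tokens = []
          for tok in re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", source):
              if re.match(r"\d", tok):
                  tokens.append("NUM")
              elif re.match(r"[A-Za-z_]", tok):
                  tokens.append(tok if tok in {"for", "if", "return", "while"} else "NAME")
              else:
                  tokens.append(tok)
          return tokens

      def longest_common_substring(a, b):
          """Length of the longest run of identical consecutive tokens shared by a and b."""
          best, prev = 0, [0] * (len(b) + 1)
          for i in range(1, len(a) + 1):
              cur = [0] * (len(b) + 1)
              for j in range(1, len(b) + 1):
                  if a[i - 1] == b[j - 1]:
                      cur[j] = prev[j - 1] + 1
                      best = max(best, cur[j])
              prev = cur
          return best

      s1 = "for i in range(10): total = total + i"
      s2 = "for k in range(10): acc = acc + k"        # same structure, renamed identifiers
      t1, t2 = tokenize(s1), tokenize(s2)
      print(longest_common_substring(t1, t2), "of", max(len(t1), len(t2)), "tokens shared")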

  12. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes.We determine the length, dimension, and the minimum distance of any affine Grassmann code. Moreover, we show...

  13. VISWAM. A computer code package for thermal reactor physics computations

    Energy Technology Data Exchange (ETDEWEB)

    Jagannathan, V.; Thiyagarajan, T.K.; Ganesan, S.; Jain, R.P.; Pal, U. [Bhabha Atomic Research Centre, Mumbai (India); Karthikeyan, R. [Ecole Polytechnique de Montreal, Montreal, Quebec (Canada)

    2004-07-01

    The nuclear cross section data and reactor physics design methods developed over the past three decades have attained a high degree of reliability for thermal power reactor design and analysis. This is borne out from the analysis of physics commissioning experiments and several reactor-years of operational experience of two types of Indian thermal power reactors, viz. BWRs and PHWRs. Our computational tools were also developed and tested against a large number of IAEA CRP benchmarks on in-core fuel management code package validation for the modern BWR, PWR, VVER and PHWR. Though the computational algorithms are well tested, their mode of use has remained rather obsolete since the codes were developed when the modern high-speed large memory computers were not available. The use of Fortran language limits their potential use for varied applications. We are developing specific Visual Interface Software as the Work Aid support for effective Man-Machine interface (VISWAM). The VISWAM package when fully developed and tested will enable handling the input description of complex fuel assembly and the reactor core geometry with immaculate ease. Selective display of the three dimensional distribution of multi-group fluxes, power distribution and hot spots will provide a good insight into the analysis and also enable inter comparison of different nuclear datasets and methods. Since the new package will be user-friendly, training of requisite human resource for the expanding Indian nuclear power programme will be rendered easier and the gap between an expert and a novice will be greatly reduced. (author)

  14. A novel reporting approach to coronary angiography: "segmental coding system".

    Science.gov (United States)

    Konuralp, Cüneyt; Idiz, Mustafa; Ateş, Mehmet

    2005-01-01

    A new systematic reporting system for coronary angiography has been developed, which is capable of describing any visible intraluminal or extraluminal condition with its exact coordinates. In this method, called the "segmental coding system" (SCS), the part of the artery that is located between its two subsequent branches is considered to be an "angiographic segment". Conditions are localized according to their relationship with these angiographic segments and the anatomic borders of the segments (coronary ostiums, primary, secondary and tertiary branches, grafts and proximal and distal anastomosis sites). They are also described by using a special coding system that consists of letters, numbers and signs. SCS can supply the name (stenosis, occlusion, contour deformity, aneurysm, rupture, anatomical variation, existence of stent, etc.) and the exact localization (coordinates) of the condition, together with its properties, the filling direction, and the collateral system that fills the vessel. We applied SCS to more than 500 cineangiographies. According to our experience, SCS provides more objective, detailed, and even more correct information than the current narrative reporting system. SCS also offers many extra advantages. (a) It can describe all imaginable types of lesion combinations. (b) All of the existing conditions can be listed without omission. (c) The definitions are very precise and clear. They can easily be understood by everyone in the same way. (d) It is more advantageous for archiving, searching the database, and comparing subsequent reports for the same patient. (e) In the future, by using specially tailored software, personal and detailed angiographic images will be reproduced from the SCS data. We believe that, once introduced into clinical practice, SCS will prove a very useful tool for both surgeons and cardiologists.

  15. Ideas for Advancing Code Sharing (A Different Kind of Hack Day)

    OpenAIRE

    Teuben, Peter; Allen, Alice; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica; Nemiroff, Robert; Shamir, Lior; Shortridge, Keith; Taylor, Mark; Wallin, John

    2013-01-01

    How do we as a community encourage the reuse of software for telescope operations, data processing, and calibration? How can we support making codes used in research available for others to examine? Continuing the discussion from last year's Bring out your codes! BoF session, participants separated into groups to brainstorm ideas to mitigate factors which inhibit code sharing and nurture those which encourage code sharing. The BoF concluded with the sharing of ideas that arose from the brainsto...

  16. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high-level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted. However, new mesoscale data need to be obtained in order to complete this validation.

  17. 24 CFR 200.926a - Residential building code comparison items.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Residential building code... § 200.926a Residential building code comparison items. HUD will review each local and State code... doors and windows; (5) Unit smoke detectors; (6) Flame spread. (b) Light and ventilation. (1) Habitable...

  18. Code-Switching in English as a Foreign Language Classroom: Teachers' Attitudes

    Science.gov (United States)

    Ibrahim, Engku Haliza Engku; Shah, Mohamed Ismail Ahamad; Armia, Najwa Tgk.

    2013-01-01

    Code-switching has always been an intriguing phenomenon to sociolinguists. While the general attitude to it seems negative, people seem to code-switch quite frequently. Teachers of English as a foreign language too frequently claim that they do not like to code-switch in the language classroom for various reasons--many are of the opinion that only…

  19. Concealed holographic coding for security applications by using a moire technique

    DEFF Research Database (Denmark)

    Zhang, Xiangsu; Dalsgaard, Erik

    1997-01-01

    We present an optical coding technique that enhances the anticounterfeiting power of security holograms. The principle of the technique is based on the moire phenomenon. The code in the hologram has a phase pattern that is invisible and cannot be detected by optical equipment, so that imitation...... is extremely difficult. Holographic, photographic and embossing techniques are used in fabricating coded holograms and decoders....

  20. A critical analysis of the use of code-switching in Nhlapho's novel ...

    African Journals Online (AJOL)

    Code-switching has become a common social phenomenon governed by social conversational needs. Central to the use of code-switching is the way in which social norms, which are also called rights and obligations, are attributed to speakers and listeners of certain social categories. Studies on code-switching reveal that ...

  1. A MacWilliams Identity for Convolutional Codes : The General Case

    NARCIS (Netherlands)

    Gluesing-Luerssen, Heide; Schneider, Gert

    A MacWilliams Identity for convolutional codes will be established. It makes use of the weight adjacency matrices of the code and its dual, based on state space realizations (the controller canonical form) of the codes in question. The MacWilliams Identity applies to various notions of duality

  2. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  3. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In computational PDE practice, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. Therefore, it is well known that code verification is a state-of-the-art activity in which innovative methods and case-based tricks are very common. This study presents the full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, so that tests start simple and advance to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability of handling stiff problems, nonnegative concentration, shape preservation, and
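
    One of the routine checks mentioned above, the observed order of convergence from grid refinement, is sketched below. The stand-in solver is a first-order upwind advection step so that the example runs on its own; in an actual verification suite the errors would come from the code under test measured against an exact or manufactured solution.

      # Estimate the observed order of convergence from errors on two successively refined grids.
      # The "solver" here is a placeholder first-order upwind advection scheme, assumed for the demo.
      import numpy as np

      def advect_upwind(u0, velocity, dx, dt, steps):
          u = u0.copy()
          for _ in range(steps):
              u = u - velocity * dt / dx * (u - np.roll(u, 1))   # periodic, first-order upwind
          return u

      def error_on_grid(n_cells, t_end=0.5, velocity=1.0):
          x = np.linspace(0.0, 1.0, n_cells, endpoint=False)
          dx = 1.0 / n_cells
          dt = 0.4 * dx / velocity                               # fixed CFL number
          steps = int(round(t_end / dt))
          u_num = advect_upwind(np.sin(2 * np.pi * x), velocity, dx, dt, steps)
          u_exact = np.sin(2 * np.pi * (x - velocity * steps * dt))
          return np.sqrt(np.mean((u_num - u_exact) ** 2))

      e_coarse, e_fine = error_on_grid(100), error_on_grid(200)
      observed_order = np.log(e_coarse / e_fine) / np.log(2.0)
      print("observed order of accuracy: %.2f (expected ~1 for first-order upwind)" % observed_order)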

  4. A Large Scale Code Resolution Service Network in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Xiangzhan Yu

    2012-11-01

    Full Text Available In the Internet of Things, a code resolution service provides a discovery mechanism for a requester to immediately obtain the information resources associated with a particular product code. In large-scale application scenarios a code resolution service faces serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.

  5. Arbitrariness is not enough: towards a functional approach to the genetic code.

    Science.gov (United States)

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently it is not completely correct to talk about a "code" in this case. Yet we suppose that a code does exist in the process of protein synthesis, but on a higher level than the chains of nucleic bases. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only one possible and we propose a more appropriate model from a semiotic point of view.

  6. Student Dress Codes in Public Schools: A Selective Annotated Bibliography

    National Research Council Canada - National Science Library

    Joan Pedzich

    2002-01-01

    In an attempt to curb the rising presence of gangs in public schools and to reduce disciplinary conflicts, officials in school districts across the United States are implementing dress codes or introducing uniforms...

  7. On a class of repeated-root monomial-like abelian codes

    Directory of Open Access Journals (Sweden)

    Edgar Martinez-Moro

    2015-05-01

    Full Text Available In this paper we study polycyclic codes of length $p^{s_1} \times \cdots \times p^{s_n}$ over $\mathbb{F}_{p^a}$ generated by a single monomial. These codes form a special class of abelian codes. We show that these codes arise from the product of certain single-variable codes and we determine their minimum Hamming distance. Finally we extend the results of Massey et al. on the weight retaining property of monomials in one variable to the weight retaining property of monomials in several variables.

  8. Folklore in bureaucracy code: Running a music event

    Directory of Open Access Journals (Sweden)

    Krstanović-Lukić Miroslava

    2004-01-01

    Full Text Available A music folk-created piece of work is a construction expressed as a paradigm, part of a set within the bureaucratic system and the public arena. Such a work is a mechanical concept, which defines inheritance as a construction of authenticity saturated with elements of folk, national culture. It is also the subject of certain conventions in the system of regulations; namely, it is part of the administrative code. The use of the folk-created work as a paradigm and in legislation is realized through an organizational apparatus; that is, it becomes entertainment, a spectacle. This paper analyzes the functioning of the organizational machinery of a folk spectacle, starting with the government authorities, local self-management and the spectacle's administrative committees. To illustrate this phenomenon, the paper presents the development of a trumpet-playing festival in Dragačevo. This particular festival establishes a cultural, economic and political order with a clear and defined division of power. The analysis shows that the folk event in question, through its programs and activities, represents a scene and arena of individual and group interests. Organizational interactions are recognized in binary oppositions: sovereignty/dependency, official/unofficial, dominance/subordination, innovative/inherited, common/different, needed/useful, original/copy, one's own/belonging to someone else.

  9. Development of a transient three-dimensional neutron transport code with feedback

    Energy Technology Data Exchange (ETDEWEB)

    Waddell, M.W. Jr.

    1994-07-19

    A new code is being developed at the Y-12 Plant for solving the time-dependent, three-dimensional Boltzmann transport model with feedback. The new code, PADK, uses the quasi-static method in its adiabatic form and is to be utilized to analyze hypothetical criticality accidents. A description of the code along with preliminary results without feedback are presented in this paper. The code is applied to 2 standard benchmark problems and the results are compared to another method. Also, the code is used to model the GODIVA reactor. Further work needed to be completed is described.

  10. Development of a transient three-dimensional neutron transport code with feedback

    Energy Technology Data Exchange (ETDEWEB)

    Waddell, M.W. Jr.

    1994-12-31

    A new code is being developed at the Y-12 plant for solving the time-dependent, three-dimensional Boltzmann transport model with feedback. The new code, PADK, uses the quasi-static method in its adiabatic form and is to be utilized to analyze hypothetical criticality accidents. A description of the code along with preliminary results without feedback are presented in this paper. The code is applied to two standard benchmark problems, and the results are compared to another method. Also, the code is used to model the GODIVA reactor. Further work needed to be completed is described.

  11. Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide

    Science.gov (United States)

    McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger

    2015-01-01

    Objectives To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837

  12. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    Science.gov (United States)

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Assessement of Codes and Standards Applicable to a Hydrogen Production Plant Coupled to a Nuclear Reactor

    Energy Technology Data Exchange (ETDEWEB)

    M. J. Russell

    2006-06-01

    This is an assessment of codes and standards applicable to a hydrogen production plant to be coupled to a nuclear reactor. The result of the assessment is a list of codes and standards that are expected to be applicable to the plant during its design and construction.

  14. JSPAM: A restricted three-body code for simulating interacting galaxies

    Science.gov (United States)

    Wallin, J. F.; Holincheck, A. J.; Harvey, A.

    2016-07-01

    Restricted three-body codes have a proven ability to recreate much of the disturbed morphology of actual interacting galaxies. As more sophisticated n-body models were developed and computer speed increased, restricted three-body codes fell out of favor. However, their supporting role for performing wide searches of parameter space when fitting orbits to real systems demonstrates a continuing need for their use. Here we present the model and algorithm used in the JSPAM code. A precursor of this code was originally described in 1990, and was called SPAM. We have recently updated the software with an alternate potential and a treatment of dynamical friction to more closely mimic the results from n-body tree codes. The code is released publicly for use under the terms of the Academic Free License ("AFL") v. 3.0 and has been added to the Astrophysics Source Code Library.
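
    The sketch below strips the restricted three-body idea to its core: massless disc test particles respond to two softened point-mass potentials whose centres follow their own mutual orbit. The units, softening, flyby orbit and disc setup are illustrative assumptions; JSPAM's actual potentials, halo model and dynamical friction treatment are richer.

      # Restricted three-body toy: test particles in the combined field of two moving point
      # masses (the interacting galaxies). G = 1 units; all initial conditions are assumed.
      import numpy as np

      G, SOFT, DT, STEPS = 1.0, 0.1, 0.01, 2000

      def accel(pos, sources, masses):
          """Softened point-mass accelerations from the galaxy centres on each position."""
          a = np.zeros_like(pos)
          for c, m in zip(sources, masses):
              d = c - pos
              r2 = np.sum(d * d, axis=1, keepdims=True) + SOFT ** 2
              a += G * m * d / r2 ** 1.5
          return a

      # Primary at rest at the origin; perturber on an assumed prograde flyby.
      m1, m2 = 1.0, 0.3
      n_part = 500
      disc_r = np.random.uniform(0.3, 1.0, n_part)
      disc_phi = np.random.uniform(0, 2 * np.pi, n_part)
      pos = np.column_stack([disc_r * np.cos(disc_phi), disc_r * np.sin(disc_phi)])
      v_circ = np.sqrt(G * m1 / disc_r)
      vel = np.column_stack([-v_circ * np.sin(disc_phi), v_circ * np.cos(disc_phi)])

      centers = np.array([[0.0, 0.0], [4.0, -4.0]])
      cvel = np.array([[0.0, 0.0], [-0.4, 0.4]])

      for _ in range(STEPS):                      # leapfrog (kick-drift-kick) for particles and centres
          cvel += 0.5 * DT * accel(centers, centers[::-1], [m2, m1])
          vel += 0.5 * DT * accel(pos, centers, [m1, m2])
          pos += DT * vel
          centers += DT * cvel
          cvel += 0.5 * DT * accel(centers, centers[::-1], [m2, m1])
          vel += 0.5 * DT * accel(pos, centers, [m1, m2])

      print("particles flung beyond r = 2:", int(np.sum(np.linalg.norm(pos, axis=1) > 2.0)))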

  15. Optimal Index Codes for a Class of Multicast Networks with Receiver Side Information

    CERN Document Server

    Ong, Lawrence

    2012-01-01

    This paper studies a special class of multicast index coding problems where a sender transmits messages to multiple receivers, each with some side information. Here, each receiver knows a unique message a priori, and there is no restriction on how many messages each receiver requests from the sender. For this class of multicast index coding problems, we obtain the optimal index code, that is, the code of shortest length that the sender needs to transmit in order for all receivers to obtain their respective requested messages. This is the first class of index coding problems for which the optimal index codes have been found. In addition, linear index codes are shown to be optimal for this class of index coding problems.
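    As a toy illustration of the index coding idea (not the optimal construction derived in the paper), consider receivers arranged in a ring where receiver i knows message m[i] and wants m[(i+1) mod n]: broadcasting the n-1 XOR combinations of consecutive messages lets every receiver decode, saving one transmission compared with sending all n messages uncoded. The sketch below is purely illustrative; the message values and variable names are made up.

    ```python
    # Toy index coding example: receiver i knows m[i] and requests m[(i+1) % n].
    # The sender broadcasts m[0]^m[1], m[1]^m[2], ..., m[n-2]^m[n-1] (n-1 symbols).

    def encode(messages):
        """Sender side: XOR of consecutive messages (n-1 coded symbols)."""
        return [messages[i] ^ messages[i + 1] for i in range(len(messages) - 1)]

    def decode(i, known, coded, n):
        """Receiver i knows m[i] and wants m[(i+1) % n]."""
        if i < n - 1:                    # one XOR with the known message suffices
            return known ^ coded[i]
        m = known                        # receiver n-1 wants m[0]: chain backwards
        for j in range(n - 2, -1, -1):
            m = m ^ coded[j]             # recovers m[j]
        return m                         # finally m[0]

    messages = [0b1010, 0b0111, 0b1100, 0b0001]   # example 4-bit messages
    coded = encode(messages)
    n = len(messages)
    for i in range(n):
        assert decode(i, messages[i], coded, n) == messages[(i + 1) % n]
    print("all receivers decoded their request with", len(coded), "broadcasts")
    ```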

  16. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback into LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
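    For readers unfamiliar with LT-style rateless codes, the encoder side is easy to sketch: sample a degree from a degree distribution, then XOR that many randomly chosen source symbols to form one coded symbol. The sketch below uses a simple placeholder distribution, not the feedback-optimised distributions proposed in the paper; symbol values and names are illustrative.

    ```python
    import random

    def lt_encode_symbol(source, degree_dist, rng=random):
        """Produce one LT-coded symbol: pick a degree d, XOR d random source symbols.
        `degree_dist` is a list of (degree, probability) pairs summing to 1."""
        r, acc, degree = rng.random(), 0.0, 1
        for d, p in degree_dist:
            acc += p
            if r <= acc:
                degree = d
                break
        neighbours = rng.sample(range(len(source)), degree)
        value = 0
        for idx in neighbours:
            value ^= source[idx]
        return neighbours, value   # the receiver needs the neighbour set (or its random seed)

    # Placeholder degree distribution; a practical LT code would use a robust soliton distribution.
    degree_dist = [(1, 0.1), (2, 0.5), (3, 0.2), (4, 0.2)]
    source = [17, 42, 7, 99, 3, 250, 128, 64]   # example source symbols (e.g. packet bytes)
    coded = [lt_encode_symbol(source, degree_dist) for _ in range(12)]
    ```

    Decoding then proceeds by "peeling": any coded symbol of degree one reveals a source symbol, which is XORed out of the remaining coded symbols, and the process repeats until all source symbols are recovered or no degree-one symbol remains.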

  17. New versions of VENTURE/PC, a multigroup, multidimensional diffusion-depletion code system

    Energy Technology Data Exchange (ETDEWEB)

    Shapiro, A.; Huria, H.C. (Univ. of Cincinnati, OH (United States))

    1990-01-01

    VENTURE/PC is a microcomputer version of the BOLD VENTURE code system developed over a period of years at Oak Ridge National Laboratory (ORNL). It is a very complete and flexible multigroup, multidimensional, diffusion-depletion code system, which was developed for the Idaho National Engineering Laboratory personal computer (PC) based reactor physics and radiation shielding analysis package. The major characteristics of the code system were reported previously. Since that time, new versions of the code system have been developed. These new versions were designed to speed the convergence process, simplify the input stream, and extend the code to the state-of-the-art 32-bit microcomputers. The 16-bit version of the code is distributed by the Radiation Shielding Information Center (RSIC) at ORNL. The code has received widespread usage.

  18. Improving residents' code status discussion skills: a randomized trial.

    Science.gov (United States)

    Szmuilowicz, Eytan; Neely, Kathy J; Sharma, Rashmi K; Cohen, Elaine R; McGaghie, William C; Wayne, Diane B

    2012-07-01

    Inpatient Code Status Discussions (CSDs) are commonly facilitated by resident physicians, despite inadequate training. We studied the efficacy of a CSD communication skills training intervention for internal medicine residents. This was a prospective, randomized controlled trial of a multimodality communication skills educational intervention for postgraduate year (PGY) 1 residents. Intervention group residents completed a 2-hour teaching session with deliberate practice of communication skills, online modules, self-reflection, and a booster training session in addition to assigned clinical rotations. Control group residents completed clinical rotations alone. CSD skills of residents in both groups were assessed 2 months after the intervention using an 18-item behavioral checklist during a standardized patient encounter. Average scores for intervention and control group residents were calculated and between-group differences on the CSD skills assessment were evaluated using two-tailed independent sample t tests. Intervention group residents displayed higher overall scores on the simulated CSD (75.1% versus 53.2%) than control group residents. The intervention group also displayed a greater number of key CSD communication behaviors and facilitated significantly longer conversations. The training, evaluation, and feedback sessions were rated highly. A focused, multimodality curriculum can improve resident performance of simulated CSDs. Skill improvement lasted for at least 2 months after the intervention. Further studies are needed to assess skill retention and to set minimum performance standards.

  19. Modeling Vortex Generators in a Navier-Stokes Code

    Science.gov (United States)

    Dudek, Julianne C.

    2011-01-01

    A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
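    As a rough illustration of the idea of a vane-lift body force (not the actual Wind-US source-term formulation, whose details are not given in the abstract), the force magnitude can be estimated from thin-airfoil lift on the vane planform and then spread over the user-selected grid cells as a momentum source. All function names, the lift-curve slope and the weighting scheme below are illustrative assumptions.

    ```python
    import numpy as np

    def vane_lift_force(rho, u_local, planform_area, incidence_rad, lift_slope=2.0 * np.pi):
        """Rough lift-force magnitude for a vane-type vortex generator:
        thin-airfoil theory gives CL ~ 2*pi*alpha for small incidence angles."""
        q = 0.5 * rho * np.dot(u_local, u_local)   # local dynamic pressure
        cl = lift_slope * incidence_rad            # small-angle lift coefficient
        return q * planform_area * cl              # force magnitude; direction chosen separately

    def distribute_source(force_magnitude, lift_direction, cell_volumes):
        """Spread the vane force over the selected cells as a momentum source per unit
        volume, weighting each cell by its share of the selected volume."""
        n = np.asarray(lift_direction, dtype=float)
        n /= np.linalg.norm(n)
        weights = cell_volumes / cell_volumes.sum()
        per_volume = force_magnitude * weights / cell_volumes   # source strength in each cell
        return per_volume[:, None] * n                          # (N_cells, 3) source vectors
    ```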

  20. HELIOS: A new open-source radiative transfer code

    Science.gov (United States)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & Corot-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, ArXiv, 1503.03806. [2] Heng, Lyons & Tsai, ArXiv, 1506.05501; Heng & Lyons, ArXiv, 1507.01944. [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H. [4] exoclime.net

  1. CodeRAnts: A recommendation method based on collaborative searching and ant colonies, applied to reusing of open source code

    Directory of Open Access Journals (Sweden)

    Isaac Caicedo-Castro

    2014-01-01

    Full Text Available This paper presents CodeRAnts, a new recommendation method based on a collaborative searching technique and inspired by the ant colony metaphor. The method aims to fill a gap in the current state of the art in recommender systems for software reuse, where prior work presents two problems: first, recommender systems based on these works cannot learn from the collaboration of programmers; second, assessments of these systems report low precision and recall, and in some systems these metrics have not been evaluated at all. The work presented in this paper contributes a recommendation method that addresses both problems.

  2. Teaching, Morality, and Responsibility: A Structuralist Analysis of a Teachers' Code of Conduct

    Science.gov (United States)

    Shortt, Damien; Hallett, Fiona; Spendlove, David; Hardy, Graham; Barton, Amanda

    2012-01-01

    In this paper we conduct a Structuralist analysis of the General Teaching Council for England's "Code of Conduct and Practice for Registered Teachers" in order to reveal how teachers are required to fulfil an apparently impossible social role. The GTCE's "Code," we argue, may be seen as an attempt by a government agency to…

  3. TACI: a code for interactive analysis of neutron data produced by a tissue equivalent proportional counter

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, F.M.

    1984-06-01

    The TEPC analysis code (TACI) is a computer program designed to analyze pulse height data generated by a tissue equivalent proportional counter (TEPC). It is written in HP BASIC and is for use on an HP-87XM personal computer. The theory of TEPC analysis upon which this code is based is summarized.

  4. Code-Switching as a Verbal Strategy Among Chinese in a Campus Setting in Taiwan.

    Science.gov (United States)

    Chen, Su-Chiao

    1996-01-01

    Explores verbal strategies involving code-switching (English terms used in Chinese-based interactions) in the speech community of a Taiwanese teacher's college. Code-switching is described in terms of the fulfillment of language functions and is shown to express a linguistic style concerned with communicative appropriateness and social identity.…

  5. Rewriting the epigenetic code for tumor resensitization: a review.

    Science.gov (United States)

    Oronsky, Bryan; Oronsky, Neil; Scicinski, Jan; Fanger, Gary; Lybeck, Michelle; Reid, Tony

    2014-10-01

    In cancer chemotherapy, one axiom, which has practically solidified into dogma, is that acquired resistance to antitumor agents or regimens, nearly inevitable in all patients with metastatic disease, remains unalterable and irreversible, rendering therapeutic rechallenge futile. However, the introduction of epigenetic therapies, including histone deacetylase inhibitors (HDACis) and DNA methyltransferase inhibitors (DNMTIs), provides oncologists, like computer programmers, with new techniques to "overwrite" the modifiable software pattern of gene expression in tumors and challenge the "one and done" treatment prescription. Taking the epigenetic code-as-software analogy a step further, if chemoresistance is the product of multiple nongenetic alterations, which develop and accumulate over time in response to treatment, then the possibility to hack or tweak the operating system and fall back on a "system restore" or "undo" feature, like the arrow icon in the Windows XP toolbar, reconfiguring the tumor to its baseline nonresistant state, holds tremendous promise for turning advanced, metastatic cancer from a fatal disease into a chronic, livable condition. This review aims 1) to explore the potential mechanisms by which a group of small molecule agents including HDACis (entinostat and vorinostat), DNMTIs (decitabine and 5-azacytidine), and redox modulators (RRx-001) may reprogram the tumor microenvironment from a refractory to a nonrefractory state, 2) highlight some recent findings, and 3) discuss whether the current "once burned forever spurned" paradigm in the treatment of metastatic disease should be revised to promote active resensitization attempts with formerly failed chemotherapies.

  6. Rewriting the Epigenetic Code for Tumor Resensitization: A Review

    Directory of Open Access Journals (Sweden)

    Bryan Oronsky

    2014-10-01

    Full Text Available In cancer chemotherapy, one axiom, which has practically solidified into dogma, is that acquired resistance to antitumor agents or regimens, nearly inevitable in all patients with metastatic disease, remains unalterable and irreversible, rendering therapeutic rechallenge futile. However, the introduction of epigenetic therapies, including histone deacetylase inhibitors (HDACis) and DNA methyltransferase inhibitors (DNMTIs), provides oncologists, like computer programmers, with new techniques to “overwrite” the modifiable software pattern of gene expression in tumors and challenge the “one and done” treatment prescription. Taking the epigenetic code-as-software analogy a step further, if chemoresistance is the product of multiple nongenetic alterations, which develop and accumulate over time in response to treatment, then the possibility to hack or tweak the operating system and fall back on a “system restore” or “undo” feature, like the arrow icon in the Windows XP toolbar, reconfiguring the tumor to its baseline nonresistant state, holds tremendous promise for turning advanced, metastatic cancer from a fatal disease into a chronic, livable condition. This review aims (1) to explore the potential mechanisms by which a group of small molecule agents including HDACis (entinostat and vorinostat), DNMTIs (decitabine and 5-azacytidine), and redox modulators (RRx-001) may reprogram the tumor microenvironment from a refractory to a nonrefractory state, (2) highlight some recent findings, and (3) discuss whether the current “once burned forever spurned” paradigm in the treatment of metastatic disease should be revised to promote active resensitization attempts with formerly failed chemotherapies.

  7. GOVERNANCE CODES: FACTS OR FICTIONS? A STUDY OF GOVERNANCE CODES IN COLOMBIA

    Directory of Open Access Journals (Sweden)

    JULIÁN BENAVIDES FRANCO

    2010-01-01

    to self-regulate, reducing their private benefits and/or the expropriation of non-controlling parties through the introduction of the code, is in fact an effective measure that financial markets support, increasing the supply of funds to the firms.

  8. Plato: A localised orbital based density functional theory code

    Science.gov (United States)

    Kenny, S. D.; Horsfield, A. P.

    2009-12-01

    The Plato package allows both orthogonal and non-orthogonal tight-binding as well as density functional theory (DFT) calculations to be performed within a single framework. The package also provides extensive tools for analysing the results of simulations as well as a number of tools for creating input files. The code is based upon the ideas first discussed in Sankey and Niklewski (1989) [1] with extensions to allow high-quality DFT calculations to be performed. DFT calculations can utilise either the local density approximation or the generalised gradient approximation. Basis sets from minimal basis through to ones containing multiple radial functions per angular momenta and polarisation functions can be used. Illustrations of how the package has been employed are given along with instructions for its utilisation.
    Program summary:
    Program title: Plato
    Catalogue identifier: AEFC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 219 974
    No. of bytes in distributed program, including test data, etc.: 1 821 493
    Distribution format: tar.gz
    Programming language: C/MPI and PERL
    Computer: Apple Macintosh, PC, Unix machines
    Operating system: Unix, Linux and Mac OS X
    Has the code been vectorised or parallelised?: Yes, up to 256 processors tested
    RAM: Up to 2 Gbytes per processor
    Classification: 7.3
    External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
    Nature of problem: Density functional theory study of electronic structure and total energies of molecules, crystals and surfaces.
    Solution method: Localised orbital based density functional theory.
    Restrictions: Tight-binding and density functional theory only, no exact exchange.
    Unusual features: Both atom centred and uniform meshes available.

  9. Relative efficiency calculation of a HPGe detector using MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Medeiros, Marcos P.C.; Rebello, Wilson F., E-mail: eng.cavaliere@ime.eb.br, E-mail: rebello@ime.eb.br [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Secao de Engenharia Nuclear; Lopes, Jose M.; Silva, Ademir X., E-mail: marqueslopez@yahoo.com.br, E-mail: ademir@nuclear.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2015-07-01

    High-purity germanium detectors (HPGe) are mandatory tools for spectrometry because of their excellent energy resolution. The efficiency of such detectors, quoted in the list of specifications by the manufacturer, frequently refers to the relative full-energy peak efficiency, related to the absolute full-energy peak efficiency of a 7.6 cm x 7.6 cm (diameter x height) NaI(Tl) crystal, based on the 1.33 MeV peak of a ⁶⁰Co source positioned 25 cm from the detector. In this study, we used the MCNPX code to simulate an HPGe detector (Canberra GC3020), from the Real-Time Neutrongraphy Laboratory of UFRJ, to survey the spectrum of a ⁶⁰Co source located 25 cm from the detector in order to calculate and confirm the efficiency declared by the manufacturer. Agreement between experimental and simulated data was achieved. The model under development will be used for calculation and comparison purposes with the detector calibration curve from the software Genie2000™, also serving as a reference for future studies. (author)
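    For reference, the conventional relative-efficiency calculation behind such a comparison is straightforward: the absolute full-energy peak efficiency at 1.33 MeV (net peak counts divided by the number of 1.33 MeV photons emitted during the measurement) is divided by 1.2 x 10^-3, the standard absolute efficiency of a 7.6 cm x 7.6 cm NaI(Tl) crystal at 25 cm. The sketch below uses made-up numbers purely to illustrate the arithmetic.

    ```python
    # Illustrative relative-efficiency calculation for an HPGe detector (made-up numbers).
    peak_counts   = 1.3e4      # net counts in the 1332.5 keV full-energy peak
    live_time_s   = 3600.0     # measurement live time
    activity_bq   = 1.0e4      # 60Co source activity
    emission_prob = 0.9998     # 1332.5 keV gammas emitted per decay of 60Co

    emitted = activity_bq * emission_prob * live_time_s
    abs_efficiency = peak_counts / emitted            # absolute full-energy peak efficiency
    rel_efficiency = abs_efficiency / 1.2e-3          # relative to 3"x3" NaI(Tl) at 25 cm
    print(f"absolute = {abs_efficiency:.2e}, relative = {rel_efficiency:.1%}")
    ```

    With these placeholder numbers the result comes out near 30%, which is the order of magnitude expected for a detector of this class.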

  10. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code; this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  11. Financial and clinical governance implications of clinical coding accuracy in neurosurgery: a multidisciplinary audit.

    Science.gov (United States)

    Haliasos, N; Rezajooi, K; O'neill, K S; Van Dellen, J; Hudovsky, Anita; Nouraei, Sar

    2010-04-01

    Clinical coding is the translation of documented clinical activities during an admission to a codified language. Healthcare Resource Groupings (HRGs) are derived from coding data and are used to calculate payment to hospitals in England, Wales and Scotland and to conduct national audit and benchmarking exercises. Coding is an error-prone process and an understanding of its accuracy within neurosurgery is critical for financial, organizational and clinical governance purposes. We undertook a multidisciplinary audit of neurosurgical clinical coding accuracy. Neurosurgeons trained in coding assessed the accuracy of 386 patient episodes. Where clinicians felt a coding error was present, the case was discussed with an experienced clinical coder. Concordance between the initial coder-only clinical coding and the final clinician-coder multidisciplinary coding was assessed. At least one coding error occurred in 71/386 patients (18.4%). There were 36 diagnosis and 93 procedure errors and in 40 cases, the initial HRG changed (10.4%). Financially, this translated to £111 revenue-loss per patient episode and projected to £171,452 of annual loss to the department. 85% of all coding errors were due to accumulation of coding changes that occurred only once in the whole data set. Neurosurgical clinical coding is error-prone. This is financially disadvantageous and with the coding data being the source of comparisons within and between departments, coding inaccuracies paint a distorted picture of departmental activity and subspecialism in audit and benchmarking. Clinical engagement improves accuracy and is encouraged within a clinical governance framework.

  12. Saphyr: a code system from reactor design to reference calculations

    Energy Technology Data Exchange (ETDEWEB)

    Akherraz, B.; Baudron, A.M.; Buiron, L.; Coste-Delclaux, M.; Fedon-Magnaud, C.; Lautard, J.J.; Moreau, F.; Nicolas, A.; Sanchez, R.; Zmijarevic, I. [CEA Saclay, Direction de l' Energie Nucleaire, Departement de Modelisation des Systemes et Structures, Service d' Etudes des Reacteurs et de Modelisation Avancee (DENDMSS/SERMA), 91 - Gif sur Yvette (France); Bergeron, A.; Caruge, D.; Fillion, P.; Gallo, D.; Royer, E. [CEA Saclay, Direction de l' Energie Nucleaire, Departement de Modelisation des Systemes et Structures, Service Fluides numeriques, Modelisations et Etudes (DEN/DMSS/SFNME), 91 - Gif sur Yvette (France); Loubiere, S. [CEA Saclay, Direction de l' Energie Nucleaire, Direction de la Simulation et des Outils Experimentaux, 91- Gif sur Yvette (France)

    2003-07-01

    In this paper we briefly present the package SAPHYR (a French acronym for Advanced System for Reactor Physics), which is devoted to reactor calculations, safety analysis and design. This package is composed of three main codes: APOLLO2 for lattice calculations, CRONOS2 for whole core neutronic calculations and FLICA4 for thermohydraulics. Thanks to a continuous development effort, the SAPHYR system is an outstanding tool covering a large domain of applications, from sophisticated 'research and development' studies that need state-of-the-art methodology to routine industrial calculations for reactor and criticality analysis. SAPHYR is powerful enough to carry out calculations for all types of reactors and is invaluable for understanding complex phenomena. SAPHYR components are in use in various nuclear companies such as 'Électricité de France', Framatome-ANP, Cogema, SGN, Transnucleaire and Technicatome. Waiting for the next generation tools (DESCARTES for neutronics and NEPTUNE for thermohydraulics) to be available for such a variety of use, with a better level of flexibility and at least equivalent validation and qualification level, the improvement of SAPHYR is going on, to acquire new functions constantly required by users and to improve current performance levels.

  13. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    Science.gov (United States)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool simulating fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  14. Fast-coding robust motion estimation model in a GPU

    Science.gov (United States)

    García, Carlos; Botella, Guillermo; de Sande, Francisco; Prieto-Matias, Manuel

    2015-02-01

    Nowadays vision systems are used for countless purposes. Moreover, motion estimation is a discipline that allows relevant information to be extracted, such as pattern segmentation, 3D structure or object tracking. However, the real-time requirements of most applications have limited its consolidation, since high performance systems are needed to meet response times. With the emergence of highly parallel devices known as accelerators, this gap has narrowed. Two extreme endpoints in the spectrum of the most common accelerators are the Field Programmable Gate Array (FPGA) and the Graphics Processing Unit (GPU), which usually offer higher performance rates than general purpose processors. Moreover, the use of GPUs as accelerators involves the efficient exploitation of any parallelism in the target application, a task that is not easy because performance is affected by many aspects that programmers must overcome. In this paper, we evaluate the OpenACC standard, a directive-based programming model that eases porting code to a GPU, in the context of a motion estimation application. The results confirm that this programming paradigm is suitable for image processing applications, achieving very satisfactory acceleration in convolution-based problems such as the well-known Lucas & Kanade method.
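    Since the benchmark is the convolution-based Lucas & Kanade method, a compact CPU-only NumPy sketch of the basic Lucas-Kanade least-squares step is shown below (the OpenACC/GPU port evaluated in the paper is not reproduced here; window size and the toy frames are illustrative assumptions).

    ```python
    import numpy as np

    def lucas_kanade_flow(frame0, frame1, x, y, win=7):
        """Estimate the optical-flow vector (u, v) in a (win x win) window centred at (x, y)
        by solving the least-squares system A [u v]^T = b built from image gradients."""
        f0, f1 = frame0.astype(float), frame1.astype(float)
        Ix = np.gradient(f0, axis=1)            # spatial gradients (central differences)
        Iy = np.gradient(f0, axis=0)
        It = f1 - f0                            # temporal difference
        h = win // 2
        sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
        ix, iy, it = Ix[sl].ravel(), Iy[sl].ravel(), It[sl].ravel()
        A = np.array([[np.dot(ix, ix), np.dot(ix, iy)],
                      [np.dot(ix, iy), np.dot(iy, iy)]])
        b = -np.array([np.dot(ix, it), np.dot(iy, it)])
        return np.linalg.solve(A, b)            # requires a textured (well-conditioned) window

    # Toy example: a bright square shifted by one pixel; prints a flow estimate close to (1, 0).
    frame0 = np.zeros((32, 32)); frame0[10:20, 10:20] = 1.0
    frame1 = np.roll(frame0, shift=1, axis=1)
    print(lucas_kanade_flow(frame0, frame1, x=10, y=10))
    ```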

  15. Phylogenetic comparison of protein-coding versus ribosomal RNA-coding sequence data: a case study of the Lecanoromycetes (Ascomycota).

    Science.gov (United States)

    Hofstetter, Valérie; Miadlikowska, Jolanta; Kauff, Frank; Lutzoni, François

    2007-07-01

    The resolving power and statistical support provided by two protein-coding (RPB1 and RPB2) and three ribosomal RNA-coding (nucSSU, nucLSU, and mitSSU) genes individually and in various combinations were investigated based on maximum likelihood bootstrap analyses on lichen-forming fungi from the class Lecanoromycetes (Ascomycota). Our results indicate that the optimal loci (single and combined) to use for molecular systematics of lichen-forming Ascomycota are protein-coding genes (RPB1 and RPB2). RPB1 and RPB2 genes individually were phylogenetically more efficient than all two- and three-locus combinations of ribosomal loci. The 3rd codon position of each of these two loci provided the most characters in support of phylogenetic relationships within the Lecanoromycetes. Of the three ribosomal loci we used in this study, mitSSU contributed the most to phylogenetic analyses when combined with RPB1 and RPB2. Except for the mitSSU, ribosomal genes were the most difficult to recover because they often contain many introns, resulting in PCR bias toward numerous and intronless co-extracted contaminant fungi (mainly Dothideomycetes, Chaetothyriomycetes, and Sordariomycetes in the Ascomycota, and members of the Basidiomycota), which inhabit lichen thalli. Maximum likelihood analysis on the combined five-locus data set for 82 members of the Lecanoromycetes provided a well resolved and well supported tree compared to existing phylogenies. We confirmed the monophyly of three recognized subclasses in the Lecanoromycetes, the Acarosporomycetidae, Ostropomycetidae, and Lecanoromycetideae; the latter delimited as monophyletic for the first time, with the exclusion of the family Umbilicariaceae and Hypocenomyce scalaris. The genus Candelariella (formerly in the Candelariaceae, currently a member of the Lecanoraceae) represents the first evolutionary split within the Lecanoromycetes, before the divergence of the Acarosporomycetidae. This study provides a foundation necessary to guide

  16. A Simple Scheme for Belief Propagation Decoding of BCH and RS Codes in Multimedia Transmissions

    Directory of Open Access Journals (Sweden)

    Marco Baldi

    2008-01-01

    Full Text Available Classic linear block codes, like Bose-Chaudhuri-Hocquenghem (BCH) and Reed-Solomon (RS) codes, are widely used in multimedia transmissions, but their soft-decision decoding still represents an open issue. Among the several approaches proposed for this purpose, an important role is played by the iterative belief propagation principle, whose application to low-density parity-check (LDPC) codes makes it possible to approach the channel capacity. In this paper, we develop a new technique for decoding classic binary and nonbinary codes through the belief propagation algorithm. We focus on RS codes included in the recent CDMA2000 standard, and compare the proposed technique with the adaptive belief propagation approach, which is able to ensure very good performance but with higher complexity. Moreover, we consider the case of long BCH codes included in the DVB-S2 standard, for which we show that the usage of “pure” LDPC codes would provide better performance.

  17. Development of Code-Switching: A Case Study on a Turkish/English/Arabic Multilingual Child

    Science.gov (United States)

    Tunaz, Mehmet

    2016-01-01

    The purpose of this research was to investigate the early code switching patterns of a simultaneous multilingual subject (Aris) in accordance with Muysken's (2000) code switching typology: insertion and alternation. Firstly, the records of naturalistic spontaneous conversations were obtained from the parents via e-mail, phone calls and…

  18. On the performance of a 2D unstructured computational rheology code on a GPU

    NARCIS (Netherlands)

    Pereira, S.P.; Vuik, K.; Pinho, F.T.; Nobrega, J.M.

    2013-01-01

    The present work explores the massively parallel capabilities of the most advanced architecture of graphics processing units (GPUs) code named “Fermi”, on a two-dimensional unstructured cell-centred finite volume code. We use the SIMPLE algorithm to solve the continuity and momentum equations that

  19. Seismicity and Design Codes in Chile: Characteristic Features and a Comparison with Some of the Provisions of the Romanian Seismic Design Code

    OpenAIRE

    Ene, Diana; Craifaleanu, Iolanda-Gabriela

    2010-01-01

    A brief history and the characteristics of the seismic region and events in Chile reveal interesting insights for understanding the present-day Chilean seismic design code. The paper points out some of the most important prescriptions in the Chilean code that may have contributed to the relatively low number of casualties in the seismic event of February 27th, 2010. By comparing the Chilean code to the Romanian one, the goal is to underline the differences and the similarities regarding both the ...

  20. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
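    As a concrete example of a very simple waveform-coding step, the sketch below applies mu-law companding, the non-uniform quantisation idea behind classic 8-bit telephone speech coding. It illustrates the principle only and does not reproduce the exact segmented bit format used in standards such as G.711; the toy sine "speech" signal and sampling choices are assumptions.

    ```python
    import numpy as np

    MU = 255.0  # companding parameter used in North American / Japanese telephony

    def mu_law_encode(x):
        """Compress a signal in [-1, 1] with the mu-law characteristic, then quantise to 8 bits."""
        compressed = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
        return np.round((compressed + 1.0) / 2.0 * 255.0).astype(np.uint8)

    def mu_law_decode(codes):
        """Invert the 8-bit mu-law code words back to a signal in [-1, 1]."""
        y = codes.astype(float) / 255.0 * 2.0 - 1.0
        return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

    t = np.linspace(0.0, 0.02, 160)                 # 20 ms at 8 kHz
    signal = 0.5 * np.sin(2 * np.pi * 440 * t)      # toy "speech" sample
    decoded = mu_law_decode(mu_law_encode(signal))
    print("max reconstruction error:", np.max(np.abs(decoded - signal)))
    ```

    The logarithmic compression allocates finer quantisation steps to the small amplitudes that dominate speech, which is why 8 bits per sample suffice for toll-quality telephony.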

  1. Generalized concatenated quantum codes

    Science.gov (United States)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng, Bei

    2009-05-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way of constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  2. The chromodomain-containing histone acetyltransferase TIP60 acts as a code reader, recognizing the epigenetic codes for initiating transcription.

    Science.gov (United States)

    Kim, Chul-Hong; Kim, Jung-Woong; Jang, Sang-Min; An, Joo-Hee; Seo, Sang-Beom; Choi, Kyung-Hee

    2015-01-01

    TIP60 can act as a transcriptional activator or a repressor depending on the cellular context. However, little is known about the role of the chromodomain in the functional regulation of TIP60. In this study, we found that TIP60 interacted with H3K4me3 in response to TNF-α signaling. TIP60 bound to H3K4me3 at the promoters of the NF-κB target genes IL6 and IL8. Unlike the wild-type protein, a TIP60 chromodomain mutant did not localize to chromatin regions. Because TIP60 binds to histones with specific modifications and transcriptional regulators, we used a histone peptide assay to identify histone codes recognized by TIP60. TIP60 preferentially interacted with methylated or acetylated histone H3 and H4 peptides. Phosphorylation near a lysine residue significantly reduced the affinity of TIP60 for the modified histone peptides. Our findings suggest that TIP60 acts as a functional link between the histone code and transcriptional regulators.

  3. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  4. Rotated Walsh-Hadamard Spreading with Robust Channel Estimation for a Coded MC-CDMA System

    Directory of Open Access Journals (Sweden)

    Raulefs Ronald

    2004-01-01

    Full Text Available We investigate rotated Walsh-Hadamard spreading matrices for a broadband MC-CDMA system with robust channel estimation in the synchronous downlink. The similarities between rotated spreading and signal space diversity are outlined. In a multiuser MC-CDMA system, possible performance improvements are based on the chosen detector, the channel code, and its Hamming distance. By applying rotated spreading in comparison to a standard Walsh-Hadamard spreading code, a higher throughput can be achieved. As combining the channel code and the spreading code forms a concatenated code, the overall minimum Hamming distance of the concatenated code increases. This asymptotically results in an improvement of the bit error rate for high signal-to-noise ratio. Higher convolutional channel code rates are mostly generated by puncturing good low-rate channel codes. The overall Hamming distance decreases significantly for the punctured channel codes. Higher channel code rates are favorable for MC-CDMA, as MC-CDMA utilizes diversity more efficiently compared to pure OFDMA. The application of rotated spreading in an MC-CDMA system allows exploiting diversity even further. We demonstrate that the rotated spreading gain is still present for a robust pilot-aided channel estimator. In a well-designed system, rotated spreading extends the performance by using a maximum likelihood detector with robust channel estimation at the receiver by about 1 dB.

  5. Connect and immerse: a poetry of codes and signals

    Directory of Open Access Journals (Sweden)

    Jesper Olsson

    2012-06-01

    Full Text Available This article investigates how codes and signals were employed in avant-garde poetry and art in the 1960s, and how such attempts were performed in the wake of cybernetics and (partly) through the use of new media technologies, such as the tape recorder and the computer. This poetry—as exemplified here by works by Åke Hodell, Peter Weibel, and Henri Chopin—not only employed new materials, media, and methods for the production of poems; it also transformed the interface of literature and the act of reading through immersion in sound, through the activation of different cognitive modes, and through an intersensorial address. On the one hand, this literary and artistic output can be seen as a response to the increasing intermediation (in Katherine Hayles's sense) in culture and society during the last century. On the other hand, we might, as contemporary readers, return to these poetic works in order to use them as media archaeological tools that might shed light on the aesthetic transformations taking place within new media today.

  6. Applying a rateless code in content delivery networks

    Science.gov (United States)

    Suherman; Zarlis, Muhammad; Parulian Sitorus, Sahat; Al-Akaidi, Marwan

    2017-09-01

    Content delivery network (CDN) allows internet providers to locate their services and to map their coverage onto networks without necessarily owning them. CDN is part of the current internet infrastructure, supporting multi-server applications, especially social media. Various works have been proposed to improve CDN performance. Since accesses on social media servers tend to be short but frequent, adding redundancy to the transmitted packets, so that lost packets do not degrade the information integrity, may improve service performance. This paper examines the implementation of a rateless code in the CDN infrastructure. The NS-2 evaluations show that the rateless code is able to reduce packet loss by up to 50%.

  7. A New Class of TAST Codes With A Simplified Tree Structure

    CERN Document Server

    Damen, Mohamed Oussama; Badr, Ahmed A

    2010-01-01

    We consider in this paper the design of full diversity and high rate space-time codes with moderate decoding complexity for arbitrary number of transmit and receive antennas and arbitrary input alphabets. We focus our attention to codes from the threaded algebraic space-time (TAST) framework since the latter includes most known full diversity space-time codes. We propose a new construction of the component single-input single-output (SISO) encoders such that the equivalent code matrix has an upper triangular form. We accomplish this task by designing each SISO encoder to create an ISI-channel in each thread. This, in turn, greatly simplifies the QR-decomposition of the composite channel and code matrix, which is essential for optimal or near-optimal tree search algorithms, such as the sequential decoder.

  8. Feasibility Study of Core Design with a Monte Carlo Code for APR1400 Initial core

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jinsun; Chang, Do Ik; Seong, Kibong [KEPCO NF, Daejeon (Korea, Republic of)

    2014-10-15

    The Monte Carlo calculation becomes more popular and useful nowadays due to the rapid progress in computing power and parallel calculation techniques. There have been many recent attempts to analyze a commercial core with a Monte Carlo transport code using this enhanced computing capability. In this paper, a Monte Carlo calculation of the APR1400 initial core has been performed and the results are compared with those of a conventional deterministic code to assess the feasibility of core design with a Monte Carlo code. SERPENT, a 3D continuous-energy Monte Carlo reactor physics burnup calculation code, is used for this purpose, and the KARMA-ASTRA code system is used as the deterministic code for comparison. A preliminary investigation of the feasibility of commercial core design with a Monte Carlo code was performed in this study. Simplified core geometry modeling was used for the reactor core surroundings, and the reactor coolant model is based on a two-region model. The reactivities at the HZP ARO condition calculated by the Monte Carlo code and the deterministic code are consistent with each other, and the reactivity difference during depletion could be reduced by adopting a realistic moderator temperature. The reactivity difference calculated at the HFP, BOC, ARO equilibrium condition was 180 ±9 pcm, with the axial moderator temperature of the deterministic code. Computing time will be a significant burden for the application of a Monte Carlo code to commercial core design, even with parallel computing, because numerous core simulations are required for an actual loading pattern search. One remedy will be a combination of the Monte Carlo code and the deterministic code to generate the physics data. The comparison of physics parameters with sophisticated moderator temperature modeling and depletion will be performed in a further study.

  9. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    Science.gov (United States)

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  10. Application of coupled code technique to a safety analysis of a standard MTR research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hamidouche, Tewfik [Division de l' Environnement, de la Surete et des Dechets Radioactifs, Centre de Recherche Nucleaire d' Alger (CRNA), Alger (Algeria); Laboratoire de Mecanique des Fluides Theorique et Appliquee, Faculte de Physique, Universite Des Sciences et de la Technologie Houari Boumediene, (USTHB), Bab-Ezzouar, Alger (Algeria)], E-mail: t.hamidouche@crna.dz; Bousbia-Salah, Anis [Dipartimento di Ingegneria Meccanica, Nucleari e della Produzione-Facolta di Ingegneria, Universita di Pisa, Pisa (Italy)], E-mail: b.salah@ing.unipi.it; Si-Ahmed, El Khider [Laboratoire de Mecanique des Fluides Theorique et Appliquee, Faculte de Physique, Universite Des Sciences et de la Technologie Houari Boumediene, (USTHB), Bab-Ezzouar, Alger (Algeria)], E-mail: esi-ahmed@usthb.dz; Mokeddem, Mohamed Yazid [Division de la Physique et des Applications Nucleaires, Centre de Recherche Nucleaire de Draria (CRND) (Algeria); D' Auria, Franscesco [Dipartimento di Ingegneria Meccanica, Nucleari e della Produzione-Facolta di Ingegneria, Universita di Pisa, Pisa (Italy)

    2009-10-15

    Accident analyses in nuclear research reactors have been performed, up to now, using simple computational tools based on conservative physical models. These codes, developed to focus on specific phenomena in the reactor, were widely used for licensing purposes. Nowadays, the advances in computer technology make it possible to switch to a new generation of computational tools that provides more realistic description of the phenomena occurring in a nuclear research reactor. Recent International Atomic Energy Agency (IAEA) activities have emphasized the maturity in using Best Estimate (BE) Codes in the analysis of accidents in research reactors. Indeed, some assessments have already been performed using BE thermal-hydraulic system codes such as RELAP5/Mod3. The challenge today is oriented to the application of coupled code techniques for research reactors safety analyses. Within the framework of the current study, a Three-Dimensional Neutron Kinetics Thermal-Hydraulic Model (3D-NKTH) based on coupled PARCS and RELAP5/Mod3.3 codes has been developed for the IAEA High Enriched Uranium (HEU) benchmark core. The results of the steady state calculations are sketched by comparison to tabulated results issued from the IAEA TECDOC 643. These data were obtained using conventional diffusion codes as well as Monte Carlo codes. On the other hand, the transient analysis was assessed with conventional coupled point kinetics-thermal-hydraulic channel codes such as RELAP5 stand alone, RETRAC-PC, and PARET codes. Through this study, the applicability of the coupled code technique is emphasized with an outline of some remaining challenges.

  11. New Class of Quantum Error-Correcting Codes for a Bosonic Mode

    Science.gov (United States)

    Michael, Marios H.; Silveri, Matti; Brierley, R. T.; Albert, Victor V.; Salmilehto, Juha; Jiang, Liang; Girvin, S. M.

    2016-07-01

    We construct a new class of quantum error-correcting codes for a bosonic mode, which are advantageous for applications in quantum memories, communication, and scalable computation. These "binomial quantum codes" are formed from a finite superposition of Fock states weighted with binomial coefficients. The binomial codes can exactly correct errors that are polynomial up to a specific degree in bosonic creation and annihilation operators, including amplitude damping and displacement noise as well as boson addition and dephasing errors. For realistic continuous-time dissipative evolution, the codes can perform approximate quantum error correction to any given order in the time step between error detection measurements. We present an explicit approximate quantum error recovery operation based on projective measurements and unitary operations. The binomial codes are tailored for detecting boson loss and gain errors by means of measurements of the generalized number parity. We discuss optimization of the binomial codes and demonstrate that by relaxing the parity structure, codes with even lower unrecoverable error rates can be achieved. The binomial codes are related to existing two-mode bosonic codes, but offer the advantage of requiring only a single bosonic mode to correct amplitude damping as well as the ability to correct other errors. Our codes are similar in spirit to "cat codes" based on superpositions of the coherent states but offer several advantages such as smaller mean boson number, exact rather than approximate orthonormality of the code words, and an explicit unitary operation for repumping energy into the bosonic mode. The binomial quantum codes are realizable with current superconducting circuit technology, and they should prove useful in other quantum technologies, including bosonic quantum memories, photonic quantum communication, and optical-to-microwave up- and down-conversion.
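    For concreteness, the smallest member of this family is commonly written in terms of Fock states as in the sketch below (a standard presentation of the lowest-order binomial code; the general code words carry binomial-coefficient weights over Fock states spaced by S+1):

    ```latex
    % Simplest binomial code protecting against a single photon loss:
    % the logical code words are even-photon-number superpositions with equal mean photon number 2,
    \[
      |\bar{0}\rangle = \frac{|0\rangle + |4\rangle}{\sqrt{2}}, \qquad
      |\bar{1}\rangle = |2\rangle .
    \]
    % A photon-loss event maps the code space onto odd-parity Fock states,
    \[
      \hat{a}\,|\bar{0}\rangle \propto |3\rangle, \qquad \hat{a}\,|\bar{1}\rangle \propto |1\rangle ,
    \]
    % so measuring the photon-number parity detects the loss without revealing the logical state.
    ```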

  12. Consensus Coding as a Tool in Visual Appearance Research

    Directory of Open Access Journals (Sweden)

    D R Simmons

    2011-04-01

    Full Text Available A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image which is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft-Palate Craniofacial Journal, in press). We have argued that “scarriness” is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information which allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Other potential descriptors for other image classes would be “metallic”, “natural”, or “liquid”. Having developed an automatic algorithm to locate scars in medical images, we then tested “ground truth” by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor “scarriness”. A simultaneous analysis of qualitative descriptions of the scarring by the observers revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other areas, such as botany, geology and archaeology.
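    The consensus-coding step described above reduces to a simple computation once each observer's outline has been rasterised into a binary mask: average the masks and threshold at 50% agreement. A minimal NumPy sketch (function names and the Dice-style overlap measure are illustrative choices, not taken from the paper):

    ```python
    import numpy as np

    def consensus_region(observer_masks, threshold=0.5):
        """Combine binary masks (one per observer, same shape) into a consensus region:
        a pixel is kept if more than `threshold` of the observers included it."""
        masks = np.stack(observer_masks).astype(float)   # shape: (n_observers, H, W)
        agreement = masks.mean(axis=0)                    # per-pixel fraction of agreement
        return agreement > threshold

    def overlap_fraction(consensus, algorithm_mask):
        """Dice-style overlap between the observers' consensus and an algorithm's region."""
        inter = np.logical_and(consensus, algorithm_mask).sum()
        return 2.0 * inter / (consensus.sum() + algorithm_mask.sum())
    ```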

  13. Adding Drift Kinetics to a Global MHD Code

    Science.gov (United States)

    Lyon, J.; Merkin, V. G.; Zhang, B.; Ouellette, J.

    2015-12-01

    Global MHD models have generally been successful in describing the behavior of the magnetosphere at large and meso-scales. An exception is the inner magnetosphere where energy dependent particle drifts are essential in the dynamics and evolution of the ring current. Even in the tail particle drifts are a significant perturbation on the MHD behavior of the plasma. The most common drift addition to MHD has been inclusion of the Hall term in Faraday's Law. There have been attempts in the space physics context to include gradient and curvature drifts within a single fluid MHD picture. These have not been terribly successful because the use of a single, Maxwellian distribution does not capture the energy dependent nature of the drifts. The advent of multi-fluid MHD codes leads to a reconsideration of this problem. The Vlasov equation can be used to define individual "species" which cover a specific energy range. Each fluid can then be treated as having a separate evolution. We take the approach of the Rice Convection Model (RCM) that each energy channel can be described by a distribution that is essentially isotropic in the guiding center picture. In the local picture, this gives rise to drifts that can be described in terms of the energy dependent inertial and diamagnetic drifts. By extending the MHD equations with these drifts we can get a system which reduces to the RCM approach in the slow-flow inner magnetosphere but is not restricted to cases where the flow speed is small. The restriction is that the equations can be expanded in the ratio of the Larmor radius to the gradient scale lengths. At scales approaching di, the assumption of gyrotropic (or isotropic) distributions break down. In addition to the drifts, the formalism can also be used to include finite Larmor radius effects on the pressure tensor (gyro-viscosity). We present some initial calculations with this method.

  14. Evolution of the genetic code: partial optimization of a random code for robustness to translation error in a rugged fitness landscape

    National Research Council Canada - National Science Library

    Novozhilov, Artem S; Wolf, Yuri I; Koonin, Eugene V

    2007-01-01

    The standard genetic code table has a distinctly non-random structure, with similar amino acids often encoded by codon series that differ by a single nucleotide substitution, typically, in the third...

  15. Easy as Pi: A Network Coding Raspberry Pi Testbed

    DEFF Research Database (Denmark)

    W. Sørensen, Chres; Hernandez Marcano, Nestor Javier; Cabrera Guerrero, Juan A.

    2016-01-01

    of the hardware, but also due to maintenance challenges. In this paper, we present the required key steps to design, setup and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network coding capabilities. This testbed can be utilized for any applications...

  16. A Normative Code of Conduct for Admissions Officers

    Science.gov (United States)

    Hodum, Robert L.

    2012-01-01

    The increasing competition for the desired quantity and quality of college students, along with the rise of for-profit institutions, has amplified the scrutiny of behavior and ethics among college admissions professionals and has increased the need for meaningful ethical guidelines and codes of conduct. Many other areas of responsibility within…

  17. Entropy, Coding and Data Compression

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 6, Issue 9, September 2001, pp. 35–45. General Article by S Natarajan. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/09/0035-0045

  18. 10 CFR 50.55a - Codes and standards.

    Science.gov (United States)

    2010-01-01

    ... Retaining Welds in Class 1 Components Fabricated with Alloy 600/82/182 Materials, Section XI, Division 1..., tested, and inspected to quality standards commensurate with the importance of the safety function to be... Guide 1.84, Revision 34, “Design, Fabrication, and Materials Code Case Acceptability, ASME Section III...

  19. PCTRAN: a transient analysis code for personal computers

    Energy Technology Data Exchange (ETDEWEB)

    Lichi Cliff Po

    1988-05-01

    The PCTRAN code has been developed to enable analysis and real-time reactor simulation to be carried out on personal computers. It is designed to exploit all the advantages of personal computers, including accessibility, interactive capabilities, convenience, economy and the ability to get certain kinds of analysis performed at short notice.

  20. Verification & Validation Toolkit to Assess Codes: Is it Theory Limitation, Numerical Method Inadequacy, Bug in the Code or a Serious Flaw?

    Science.gov (United States)

    Bombardelli, F. A.; Zamani, K.

    2014-12-01

    We introduce and discuss an open-source, user-friendly, numerical post-processing piece of software to assess the reliability of modeling results from environmental fluid mechanics codes. Verification and Validation, Uncertainty Quantification (VAVUQ) is a toolkit developed in Matlab© for general V&V purposes. In this work, the VAVUQ implementation of V&V techniques and its user interface are discussed. VAVUQ is able to read Excel, Matlab, ASCII, and binary files, and it produces a log of the results in txt format. Next, each capability of the code is discussed through an example: the first example is the code verification of a sediment transport code, developed with the Finite Volume Method, via MES. The second example is a solution verification of a code for groundwater flow, developed with the Boundary Element Method, via MES. The third example is a solution verification of a mixed-order, Compact Difference Method code of heat transfer via MMS. The fourth example is a solution verification of a 2-D, Finite Difference Method code of floodplain analysis via Complete Richardson Extrapolation. In turn, the application of VAVUQ in quantitative model skill assessment studies (validation) of environmental codes is given through two examples: validation of a two-phase flow computational model of air entrainment in a free-surface flow against laboratory measurements, and validation of heat transfer modeling at the earth surface against field measurements. At the end, we discuss practical considerations and common pitfalls in the interpretation of V&V results.
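    One of the checks such a toolkit automates is estimating the observed order of accuracy from solutions on three systematically refined grids, which is the core of Richardson-extrapolation-based solution verification. A minimal sketch of that standard arithmetic for a constant refinement ratio (illustrative only, not the VAVUQ implementation):

    ```python
    import math

    def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
        """Observed order of accuracy p from three grids with constant refinement ratio r:
            p = ln[(f_coarse - f_medium) / (f_medium - f_fine)] / ln(r)
        """
        return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)

    def richardson_extrapolate(f_medium, f_fine, refinement_ratio, p):
        """Estimate the grid-converged value from the two finest solutions."""
        return f_fine + (f_fine - f_medium) / (refinement_ratio ** p - 1.0)

    # Example: a quantity converging at second order toward 1.0
    print(observed_order(1.04, 1.01, 1.0025, refinement_ratio=2.0))   # prints ~2
    print(richardson_extrapolate(1.01, 1.0025, 2.0, p=2.0))           # prints ~1.0
    ```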

  1. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
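
    The deterministic core of such a ray-tracing simulation is the X-ray attenuation law applied along each source-to-pixel ray. The Python sketch below is a minimal illustration of that single step for a polychromatic beam; the energies, spectral weights and attenuation coefficients are hypothetical placeholders, and the CAD-based geometry handling of the actual code is not reproduced.

        import numpy as np

        # Hypothetical polychromatic spectrum: photon energies (keV) and relative weights
        energies = np.array([40.0, 60.0, 80.0, 100.0])
        weights  = np.array([0.2, 0.4, 0.3, 0.1])

        # Hypothetical linear attenuation coefficients mu(E) in 1/cm for two materials
        mu = {
            "aluminium": np.array([0.90, 0.45, 0.30, 0.23]),
            "steel":     np.array([4.50, 1.80, 1.00, 0.70]),
        }

        def transmitted_spectrum(path_lengths, i0=1.0):
            """Beer-Lambert law for one ray: I(E) = I0 w(E) exp(-sum_m mu_m(E) t_m)."""
            total = np.zeros_like(energies)
            for material, thickness in path_lengths.items():
                total += mu[material] * thickness
            return i0 * weights * np.exp(-total)

        # Ray crossing 2 cm of aluminium and 0.5 cm of steel
        spectrum_out = transmitted_spectrum({"aluminium": 2.0, "steel": 0.5})
        print("detected intensity (summed over energies):", spectrum_out.sum())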

  2. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will

  3. A Compressed Sensing-Based Low-Density Parity-Check Real-Number Code

    Directory of Open Access Journals (Sweden)

    Zaixing He

    2013-09-01

    Full Text Available In this paper, we propose a novel low-density parity-check real-number code, based on compressed sensing. A real-valued message is encoded by a coding matrix (with more rows than columns) and transmitted over an erroneous channel, where sparse errors (impulsive noise) corrupt the codeword. In the decoding procedure, we apply a structured sparse (low-density) parity-check matrix, the Permuted Block Diagonal matrix, to the corrupted output, and the errors can be corrected by solving a compressed sensing problem. A compressed sensing algorithm, Cross Low-dimensional Pursuit, is used to decode the code by solving this compressed sensing problem. The proposed code has high error correction performance and decoding efficiency. The comparative experimental results demonstrate both advantages of our code. We also apply our code to cryptography.
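
    As a rough illustration of the decoding idea described above — form a syndrome that depends only on the sparse error and recover that error by sparse reconstruction — the Python sketch below uses a dense random coding matrix and generic L1 minimisation via linear programming, not the Permuted Block Diagonal matrix or the Cross Low-dimensional Pursuit algorithm of the paper; all sizes and values are arbitrary.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n, k, num_err = 120, 80, 5               # codeword length, message length, sparse errors

        G = rng.standard_normal((n, k))          # tall, real-valued coding matrix
        H = np.linalg.svd(G)[0][:, k:].T         # (n-k) x n parity check, H @ G = 0

        x = rng.standard_normal(k)               # message
        e = np.zeros(n)
        e[rng.choice(n, num_err, replace=False)] = 10 * rng.standard_normal(num_err)
        y = G @ x + e                            # corrupted codeword

        s = H @ y                                # syndrome depends only on the error: s = H e
        # Recover the sparse error:  minimise ||e||_1  subject to  H e = s
        c = np.concatenate([np.zeros(n), np.ones(n)])                     # minimise sum(t)
        A_ub = np.block([[np.eye(n), -np.eye(n)], [-np.eye(n), -np.eye(n)]])  # |e_i| <= t_i
        b_ub = np.zeros(2 * n)
        A_eq = np.hstack([H, np.zeros((n - k, n))])
        bounds = [(None, None)] * n + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=s, bounds=bounds)
        e_hat = res.x[:n]                        # assumes the LP solved successfully
        x_hat, *_ = np.linalg.lstsq(G, y - e_hat, rcond=None)
        print("max message reconstruction error:", np.max(np.abs(x_hat - x)))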

  4. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
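
    As a minimal illustration of the default criteria mentioned above, the Python sketch below computes AIC, AICc and BIC for least-squares-calibrated models and converts one criterion into normalised model weights. The formulas are the standard textbook ones with constant terms dropped, and the residuals and parameter counts are hypothetical; this is not the exact implementation used in MMA.

        import numpy as np

        def information_criteria(ssr, n, k):
            """AIC, AICc and BIC for a least-squares model with n observations,
            k estimated parameters and (weighted) sum of squared residuals ssr."""
            aic = n * np.log(ssr / n) + 2 * k
            aicc = aic + 2 * k * (k + 1) / (n - k - 1)
            bic = n * np.log(ssr / n) + k * np.log(n)
            return aic, aicc, bic

        def model_weights(criteria):
            """Posterior-style model weights from any one information criterion."""
            c = np.asarray(criteria, dtype=float)
            delta = c - c.min()
            w = np.exp(-0.5 * delta)
            return w / w.sum()

        # Three hypothetical calibrated models of the same system (same 50 observations)
        models = {"A": (12.4, 4), "B": (10.9, 6), "C": (10.7, 9)}   # (SSR, number of parameters)
        aic_values = [information_criteria(ssr, 50, k)[0] for ssr, k in models.values()]
        print(dict(zip(models, np.round(model_weights(aic_values), 3))))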

  5. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Science.gov (United States)

    Zhang, Ai-bing; Feng, Jie; Ward, Robert D; Wan, Ping; Gao, Qiang; Wu, Jun; Zhao, Wei-zhong

    2012-01-01

    Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95%CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95%CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.

  6. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Directory of Open Access Journals (Sweden)

    Ai-bing Zhang

    Full Text Available Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95% CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95% CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.

  7. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting some pre-defined watermarking bits into the original LDPC code, a more accurate estimate of the noise level in the fiber channel can be obtained. These bits are then used to modify the probability distribution function (PDF) used in the initial step of the belief propagation (BP) decoding algorithm. This algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and results showed that the watermarking LDPC code had a better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% additional redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
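
    The essential step described above — using known, pre-defined bits to estimate the channel noise level before initialising BP decoding — can be sketched in a few lines of Python. The BPSK mapping, Gaussian noise model and all numbers below are assumptions for illustration only; they do not reproduce the authors' algorithm or the optical channel model of the paper.

        import numpy as np

        def estimate_noise_sigma(received_pilots, sent_bits):
            """Estimate the noise level from known (watermark) bits.
            Assumes BPSK mapping 0 -> +1, 1 -> -1 and additive Gaussian-like noise;
            the estimate can then parameterise the PDF used to initialise BP decoding."""
            reference = 1.0 - 2.0 * np.asarray(sent_bits, dtype=float)
            residual = np.asarray(received_pilots, dtype=float) - reference
            return float(np.sqrt(np.mean(residual ** 2)))

        rng = np.random.default_rng(7)
        sent = rng.integers(0, 2, 200)                         # pre-defined watermark bits
        rx = (1.0 - 2.0 * sent) + rng.normal(0.0, 0.35, 200)   # hypothetical channel output
        sigma = estimate_noise_sigma(rx, sent)
        print(f"estimated noise sigma = {sigma:.3f}")
        # Under this model, channel LLRs for BP decoding would be 2 * y / sigma**2.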

  8. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)
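
    The discounting principle described above can be illustrated with a small Python sketch that levelizes lifetime costs against lifetime generation to obtain a unit busbar cost in mills/kWh. The cost figures, plant life and discount rate below are hypothetical, and the sketch ignores the fuel-cycle lead/lag times, taxes and financing structure that PACTOLUS models in detail.

        def levelized_busbar_cost(capital, annual_costs, annual_gwh, discount_rate):
            """Unit power cost (mills/kWh) such that the present worth of revenues
            equals the present worth of all expenses over the plant life.
            capital: overnight cost in $ spent at year 0 (a simplification);
            annual_costs: yearly fuel + O&M expenses in $;
            annual_gwh: yearly net generation in GWh."""
            pv_costs = capital
            pv_energy_kwh = 0.0
            for year, (cost, gwh) in enumerate(zip(annual_costs, annual_gwh), start=1):
                factor = (1.0 + discount_rate) ** (-year)
                pv_costs += cost * factor
                pv_energy_kwh += gwh * 1.0e6 * factor
            return (pv_costs / pv_energy_kwh) * 1000.0   # $/kWh -> mills/kWh

        # Hypothetical 30-year plant
        life = 30
        cost = levelized_busbar_cost(
            capital=2.0e9,
            annual_costs=[1.5e8] * life,
            annual_gwh=[7000.0] * life,
            discount_rate=0.08,
        )
        print(f"levelized busbar cost = {cost:.1f} mills/kWh")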

  9. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    Science.gov (United States)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  10. Mapa - an object-oriented code with a graphical user interface for accelerator design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shasharina, S.G.; Cary, J.R. [Tech-X Corporation 4588 Pussy Willow Court, Boulder, Colorado 80301 (United States)

    1997-02-01

    We developed a code for accelerator modeling which allows users to create and analyze accelerators through a graphical user interface (GUI). The GUI can read an accelerator from files or create it by adding, removing and changing elements. It also creates 4D orbit and lifetime plots. The code includes a set of accelerator element classes, and C++ utility and GUI libraries. Due to the GUI, the code is easy to use and expand. © 1997 American Institute of Physics.

  11. Code flid (dep 051). A code for the two-dimensional analysis of the thermodynamic behaviour of a boiling liquid; Code flid (dep 051). Programme numerique analysant en deux dimensions le comportement thermodynamique d'un liquide bouillant

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M.; Saunier, J.P. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires. Departement des etudes de piles, service de physique mathematique

    1967-01-01

    This two-dimensional code handles the following problems: 1. Analysis of thermal tests on a water loop at high or low pressure, in steady-state or transient conditions. 2. Analysis of the thermal and hydraulic behaviour of the hot channel of a light-water reactor with plate-type fuel elements. The power and pressure-drop variations imposed at the channel boundaries during a transient are obtained from the complementary one-dimensional code CACTUS (CEA report R-3039). (authors)

  12. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Full Text Available Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising because executing reverse code can restore the previous states of a program without state saving. In the literature, there can be found two methods that generate reverse code: (a) static reverse-code generation that pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation that generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter one in our previous work to accommodate non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
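
    As a toy illustration of the underlying idea — restoring earlier program states by executing recorded inverse operations instead of saving whole states — the Python sketch below logs exactly the information needed to undo each assignment. It is a deliberately simplified stand-in, not the authors' reverse-code generator, and it ignores the multi-threading issues that motivate dynamic generation.

        _MISSING = object()

        class BacktrackingStore:
            """Toy variable store that records an inverse operation for every
            destructive assignment, so earlier states can be restored without
            saving the whole program state."""

            def __init__(self):
                self.vars = {}
                self.reverse_log = []                 # stack of (name, previous value)

            def assign(self, name, value):
                self.reverse_log.append((name, self.vars.get(name, _MISSING)))
                self.vars[name] = value

            def backtrack(self, steps=1):
                """Run the recorded reverse operations for the last `steps` assignments."""
                for _ in range(min(steps, len(self.reverse_log))):
                    name, old = self.reverse_log.pop()
                    if old is _MISSING:
                        del self.vars[name]           # the variable did not exist before
                    else:
                        self.vars[name] = old

        store = BacktrackingStore()
        store.assign("x", 1)
        store.assign("x", 2)
        store.assign("y", 5)
        store.backtrack(2)                            # undo y = 5 and x = 2
        print(store.vars)                             # {'x': 1}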

  13. QR code based noise-free optical encryption and decryption of a gray scale image

    Science.gov (United States)

    Jiao, Shuming; Zou, Wenbin; Li, Xia

    2017-03-01

    In optical encryption systems, speckle noise is one major challenge in obtaining high quality decrypted images. This problem can be addressed by employing a QR code based noise-free scheme. Previous works have been conducted for optically encrypting a few characters or a short expression employing QR codes. This paper proposes a practical scheme for optically encrypting and decrypting a gray-scale image based on QR codes for the first time. The proposed scheme is compatible with common QR code generators and readers. Numerical simulation results reveal the proposed method can encrypt and decrypt an input image correctly.

  14. Generating code adapted for interlinking legacy scalar code and extended vector code

    Science.gov (United States)

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  15. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  16. A genetic code alteration is a phenotype diversity generator in the human pathogen Candida albicans.

    Directory of Open Access Journals (Sweden)

    Isabel Miranda

    Full Text Available BACKGROUND: The discovery of genetic code alterations and expansions in both prokaryotes and eukaryotes abolished the hypothesis of a frozen and universal genetic code and exposed unanticipated flexibility in codon and amino acid assignments. It is now clear that codon identity alterations involve sense and non-sense codons and can occur in organisms with complex genomes and proteomes. However, the biological functions, the molecular mechanisms of evolution and the diversity of genetic code alterations remain largely unknown. In various species of the genus Candida, the leucine CUG codon is decoded as serine by a unique serine tRNA that contains a leucine 5'-CAG-3' anticodon (tRNA(CAG)Ser). We are using this codon identity redefinition as a model system to elucidate the evolution of genetic code alterations. METHODOLOGY/PRINCIPAL FINDINGS: We have reconstructed the early stages of the Candida genetic code alteration by engineering tRNAs that partially reverted the identity of serine CUG codons back to their standard leucine meaning. Such genetic code manipulation had profound cellular consequences as it exposed important morphological variation, altered gene expression, re-arranged the karyotype, increased cell-cell adhesion and secretion of hydrolytic enzymes. CONCLUSION/SIGNIFICANCE: Our study provides the first experimental evidence for an important role of genetic code alterations as generators of phenotypic diversity of high selective potential and supports the hypothesis that they speed up evolution of new phenotypes.

  17. Clustering of neural code words revealed by a first-order phase transition.

    Science.gov (United States)

    Huang, Haiping; Toyoizumi, Taro

    2016-06-01

    A network of neurons in the central nervous system collectively represents information by its spiking activity states. Typically observed states, i.e., code words, occupy only a limited portion of the state space due to constraints imposed by network interactions. Geometrical organization of code words in the state space, critical for neural information processing, is poorly understood due to its high dimensionality. Here, we explore the organization of neural code words using retinal data by computing the entropy of code words as a function of Hamming distance from a particular reference codeword. Specifically, we report that the retinal code words in the state space are divided into multiple distinct clusters separated by entropy-gaps, and that this structure is shared with well-known associative memory networks in a recallable phase. Our analysis also elucidates a special nature of the all-silent state. The all-silent state is surrounded by the densest cluster of code words and located within a reachable distance from most code words. This code-word space structure quantitatively predicts typical deviation of a state-trajectory from its initial state. Altogether, our findings reveal a non-trivial heterogeneous structure of the code-word space that shapes information representation in a biological network.

  18. Clustering of neural code words revealed by a first-order phase transition

    Science.gov (United States)

    Huang, Haiping; Toyoizumi, Taro

    2016-06-01

    A network of neurons in the central nervous system collectively represents information by its spiking activity states. Typically observed states, i.e., code words, occupy only a limited portion of the state space due to constraints imposed by network interactions. Geometrical organization of code words in the state space, critical for neural information processing, is poorly understood due to its high dimensionality. Here, we explore the organization of neural code words using retinal data by computing the entropy of code words as a function of Hamming distance from a particular reference codeword. Specifically, we report that the retinal code words in the state space are divided into multiple distinct clusters separated by entropy-gaps, and that this structure is shared with well-known associative memory networks in a recallable phase. Our analysis also elucidates a special nature of the all-silent state. The all-silent state is surrounded by the densest cluster of code words and located within a reachable distance from most code words. This code-word space structure quantitatively predicts typical deviation of a state-trajectory from its initial state. Altogether, our findings reveal a non-trivial heterogeneous structure of the code-word space that shapes information representation in a biological network.
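
    The central quantity in the analysis above — the entropy of observed code words as a function of Hamming distance from a reference code word, for example the all-silent state — can be estimated naively from data as in the Python sketch below. The synthetic binary activity patterns are placeholders, and the plug-in entropy estimate is far cruder than the analysis used in the paper.

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(1)
        # Hypothetical binary population activity: rows are time bins, columns are neurons
        states = (rng.random((5000, 20)) < 0.08).astype(np.uint8)
        reference = np.zeros(20, dtype=np.uint8)          # e.g., the all-silent state

        def entropy_by_distance(states, reference):
            """Empirical entropy (bits) of code words grouped by Hamming distance
            from a reference code word."""
            distances = (states != reference).sum(axis=1)
            result = {}
            for d in np.unique(distances):
                words = [tuple(s) for s in states[distances == d]]
                counts = np.array(list(Counter(words).values()), dtype=float)
                p = counts / counts.sum()
                result[int(d)] = float(-(p * np.log2(p)).sum())
            return result

        for d, h in sorted(entropy_by_distance(states, reference).items()):
            print(f"Hamming distance {d}: entropy = {h:.2f} bits")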

  19. LOLA SYSTEM: A code block for nodal PWR simulation. Part II - MELON-3, CONCON and CONAXI Codes

    Energy Technology Data Exchange (ETDEWEB)

    Aragones, J. M.; Ahnert, C.; Gomez Santamaria, J.; Rodriguez Olabarria, I.

    1985-07-01

    Description of the theory and user's manual of the MELON-3, CONCON and CONAXI codes, which are part of the one-group nodal-theory core calculation system called LOLA SYSTEM. These auxiliary codes provide some of the input data for the main module SIMULA-3; these are the reactivity correlation constants, the albedos and the transport factors. (Author) 7 refs.

  20. A comparison of neutron spectrum unfolding codes used with a miniature NE213 detector

    CERN Document Server

    Koohi-Fayegh, R; Scott, M C

    2001-01-01

    The effects of unfolding technique on neutron spectra measured with a miniature NE-213 spectrometer are investigated. The codes used were FORIST, FERDOR and RADAK, a differential code FLYSPEC and one developed by the authors based on Neural Networks. The characteristics required of experimental test spectra were that they be structured, well known and have a significant component above 10 MeV. Four different test spectra were employed. It is found that all the codes performed well with the test spectra used, producing generally consistent results.

  1. Codes for a priority queue on a parallel data bus. [Deep Space Network

    Science.gov (United States)

    Wallis, D. E.; Taylor, H.

    1979-01-01

    Some codes for arbitration of priorities among subsystem computers or peripheral device controllers connected to a parallel data bus are described. At arbitration time, several subsystems present wire-OR, parallel code words to the bus, and the central computer can identify the subsystem of highest priority and determine which of two or more transmission services the subsystem requires. A mathematical discussion of the optimality of the codes with regard to the number of subsystems that may participate in the scheme for a given number of wires is presented along with the number of services that each subsystem may request.
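
    A stripped-down Python simulation of wired-OR arbitration is given below: each requesting subsystem asserts a code word, the bus carries the bitwise OR of all of them, and the central computer identifies the highest-priority requester. The one-hot encoding used here is only a toy stand-in; the codes discussed in the paper are designed to be more wire-efficient and to also indicate which of several services each subsystem requests.

        def arbitrate(requests, n_wires=8):
            """Toy wired-OR arbitration: each requesting subsystem asserts a one-hot
            code word whose bit position encodes its priority; the bus value is the
            bitwise OR of all words, and the highest set bit identifies the winner."""
            bus = 0
            for priority in requests:          # priorities 0 (lowest) .. n_wires-1 (highest)
                bus |= 1 << priority
            if bus == 0:
                return None, bus
            winner = bus.bit_length() - 1      # highest-priority requester
            return winner, bus

        winner, bus = arbitrate([2, 5, 3])
        print(f"bus word = {bus:08b}, granted to subsystem with priority {winner}")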

  2. A Statistical Analysis of the Robustness of Alternate Genetic Coding Tables

    Directory of Open Access Journals (Sweden)

    Isil Aksan Kurnaz

    2008-05-01

    Full Text Available The rules that specify how the information contained in DNA is translated into amino acid “language” during protein synthesis are called “the genetic code”, commonly called the “Standard” or “Universal” Genetic Code Table. As a matter of fact, this coding table is not at all “universal”: in addition to different genetic code tables used by different organisms, even within the same organism the nuclear and mitochondrial genes may be subject to two different coding tables. Results: In an attempt to understand the advantages and disadvantages these coding tables may bring to an organism, we have decided to analyze various coding tables on genes subject to mutations, and have estimated how these genes “survive” over generations. We have used this as indicative of the “evolutionary” success of that particular coding table. We find that the “standard” genetic code is not actually the most robust of all coding tables, and interestingly, the Flatworm Mitochondrial Code (FMC) appears to be the highest-ranking coding table given our assumptions. Conclusions: It is commonly hypothesized that the more robust a genetic code, the better suited it is for maintenance of the genome. Our study shows that, given the assumptions in our model, the Standard Genetic Code is quite poor when compared to other alternate code tables in terms of robustness. This brings about the question of why the Standard Code has been so widely accepted by a wider variety of organisms instead of FMC, which needs to be addressed for a thorough understanding of genetic code evolution.

  3. Development and testing of a Monte Carlo code system for analysis of ionization chamber responses

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J.O.; Gabriel, T.A.

    1986-01-01

    To predict the perturbation of interactions between radiation and material by the presence of a detector, a differential Monte Carlo computer code system entitled MICAP was developed and tested. This code system determines the neutron, photon, and total response of an ionization chamber to mixed field radiation environments. To demonstrate the ability of MICAP in calculating an ionization chamber response function, a comparison was made to 05S, an established Monte Carlo code extensively used to accurately calibrate liquid organic scintillators. Both code systems modeled an organic scintillator with a parallel beam of monoenergetic neutrons incident on the scintillator. (LEW)

  4. Static Code Analysis: A Systematic Literature Review and an Industrial Survey

    OpenAIRE

    Ilyas, Bilal; Elkhalifa, Islam

    2016-01-01

    Context: Static code analysis is a software verification technique that refers to the process of examining code without executing it in order to capture defects in the code early, avoiding later costly fixations. The lack of realistic empirical evaluations in software engineering has been identified as a major issue limiting the ability of research to impact industry and in turn preventing feedback from industry that can improve, guide and orient research. Studies emphasized rigor and relevan...

  5. A Novel Microstrip Frequency Discriminator for IFM Based on Balanced Gray-code

    OpenAIRE

    de Oliveira, Elias M.F.; Pedrosa, Túlio L.; de Souza, S.R.O.; Melo, M. T. de; Oliveira, B. G. M. de; Llamas-Garro, Ignacio

    2017-01-01

    This work presents the design, simulation, fabrication and measurement of a novel set of microstrip filters to perform the task of frequency discriminators. These filters’ frequency responses are based on the balanced Gray-code. Results show that the use of the balanced Gray-code, as opposed to the traditional Gray-code, allowed 20% circuit size reduction by using 60% less resonators due to a change in the resonators’ orientation.
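
    For reference, the "traditional" Gray code against which the balanced code is compared is the binary-reflected code, in which successive words differ in exactly one bit; a short Python sketch of the standard conversions is shown below. Constructing a balanced Gray code, which spreads bit transitions evenly across bit positions, is more involved and is not shown here.

        def binary_to_gray(b: int) -> int:
            """Binary-reflected Gray code: adjacent integers differ in one bit."""
            return b ^ (b >> 1)

        def gray_to_binary(g: int) -> int:
            """Inverse mapping: XOR of all right shifts of the Gray word."""
            b = 0
            while g:
                b ^= g
                g >>= 1
            return b

        codes = [format(binary_to_gray(i), "04b") for i in range(16)]
        print(codes)   # successive 4-bit words differ in exactly one bit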

  6. Expanding and engineering the genetic code in a single expression experiment.

    Science.gov (United States)

    Hoesl, Michael G; Budisa, Nediljko

    2011-03-07

    Expanding and engineering the code simultaneously: This concept was experimentally realized in a single in vivo expression experiment whereby residue-specific, sense codon reassignments Met→Nle/Pro→(4S-F)Pro (code engineering) were combined with position-specific STOP→Bpa read-through by an amber suppressor tRNA (code expansion). Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. SITA version 0. A simulation and code testing assistant for TOUGH2 and MARNIE

    Energy Technology Data Exchange (ETDEWEB)

    Seher, Holger; Navarro, Martin

    2016-06-15

    High quality standards have to be met by those numerical codes that are applied in long-term safety assessments for deep geological repositories for radioactive waste. The software environment SITA ("a simulation and code testing assistant for TOUGH2 and MARNIE") has been developed by GRS in order to perform automated regression testing for the flow and transport simulators TOUGH2 and MARNIE. GRS uses the codes TOUGH2 and MARNIE in order to assess the performance of deep geological repositories for radioactive waste. With SITA, simulation results of TOUGH2 and MARNIE can be compared to analytical solutions and simulation results of other code versions. SITA uses data interfaces to operate with codes whose input and output depend on the code version. The present report is part of a wider GRS programme to assure and improve the quality of TOUGH2 and MARNIE. It addresses users as well as administrators of SITA.

  8. The weight hierarchies and chain condition of a class of codes from varieties over finite fields

    Science.gov (United States)

    Wu, Xinen; Feng, Gui-Liang; Rao, T. R. N.

    1996-01-01

    The generalized Hamming weights of linear codes were first introduced by Wei. These are fundamental parameters related to the minimal overlap structures of the subcodes and very useful in several fields. It was found that the chain condition of a linear code is convenient in studying the generalized Hamming weights of the product codes. In this paper we consider a class of codes defined over some varieties in projective spaces over finite fields, whose generalized Hamming weights can be determined by studying the orbits of subspaces of the projective spaces under the actions of classical groups over finite fields, i.e., the symplectic groups, the unitary groups and orthogonal groups. We give the weight hierarchies and generalized weight spectra of the codes from Hermitian varieties and prove that the codes satisfy the chain condition.

  9. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  10. Source Term Code Package: a user's guide (Mod 1)

    Energy Technology Data Exchange (ETDEWEB)

    Gieseke, J.A.; Cybulskis, P.; Jordan, H.; Lee, K.W.; Schumacher, P.M.; Curtis, L.A.; Wooton, R.O.; Quayle, S.F.; Kogan, V.

    1986-07-01

    As part of a major reassessment of the release of radioactive materials to the environment (source terms) in severe reactor accidents, a group of state-of-the-art computer codes was utilized to perform extensive analyses. A major product of this source term reassessment effort was a demonstrated methodology for analyzing specific accident situations to provide source term predictions. The computer codes forming this methodology have been upgraded and modified for release and further use. This system of codes has been named the Source Term Code Package (STCP) and is the subject of this user's guide. The guide is intended to provide an understanding of the STCP structure and to facilitate STCP use. The STCP was prepared for operation on a CDC system but is written in FORTRAN-77 to permit transportability. In the current version (Mod 1) of the STCP, the various calculational elements fall into four major categories represented by the codes MARCH3, TRAP-MELT3, VANESA, and NAUA/SPARC/ICEDF. The MARCH3 code is a combination of the MARCH2, CORSOR-M, and CORCON-Mod 2 codes. The TRAP-MELT3 code is a combination of the TRAP-MELT2.0 and MERGE codes.

  11. DOGS: a collection of graphics for support of discrete ordinates codes

    Energy Technology Data Exchange (ETDEWEB)

    Ingersoll, D.T.; Slater, C.O.

    1980-03-01

    A collection of computer codes called DOGS (Discrete Ordinates Graphics Support) has been developed to assist in the display and presentation of data generated by commonly used discrete ordinates transport codes. The DOGS codes include: EGAD for plotting two-dimensional geometries, ISOPLOT4 for plotting 2-D fluxes in a contour line fashion, FORM for plotting 2-D fluxes in a 3-D surface fashion, ACTUAL for calculating 2-D activities, TOOTH for calculating and plotting space-energy contributon fluxes, and ASPECT for plotting energy spectra. All of the codes use FIDO input formats and DISSPLA graphics software including the DISSPOP post processors.

  12. ABAREX -- A neutron spherical optical-statistical-model code -- A user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Smith, A.B. [ed.]; Lawson, R.D.

    1998-06-01

    The contemporary version of the neutron spherical optical-statistical-model code ABAREX is summarized with the objective of providing detailed operational guidance for the user. The physical concepts involved are very briefly outlined. The code is described in some detail and a number of explicit examples are given. With this document one should very quickly become fluent with the use of ABAREX. While the code has operated on a number of computing systems, this version is specifically tailored for the VAX/VMS work station and/or the IBM-compatible personal computer.

  13. A Silent Revolution: From Sketching to Coding--A Case Study on Code-Based Design Tool Learning

    Science.gov (United States)

    Xu, Song; Fan, Kuo-Kuang

    2017-01-01

    With the rise of information technology, Computer Aided Design activities are becoming more modern and more complex. However, learning how to operate these new design tools has become the main problem facing each designer. The purpose of this study was to find problems encountered during the code-based design tool learning period of…

  14. Ducted-Fan Engine Acoustic Predictions using a Navier-Stokes Code

    Science.gov (United States)

    Rumsey, C. L.; Biedron, R. T.; Farassat, F.; Spence, P. L.

    1998-01-01

    A Navier-Stokes computer code is used to predict one of the ducted-fan engine acoustic modes that results from rotor-wake/stator-blade interaction. A patched sliding-zone interface is employed to pass information between the moving rotor row and the stationary stator row. The code produces averaged aerodynamic results downstream of the rotor that agree well with a widely used average-passage code. The acoustic mode of interest is generated successfully by the code and is propagated well upstream of the rotor; temporal and spatial numerical resolution are fine enough such that attenuation of the signal is small. Two acoustic codes are used to find the far-field noise. Near-field propagation is computed by using Eversman's wave envelope code, which is based on a finite-element model. Propagation to the far field is accomplished by using the Kirchhoff formula for moving surfaces with the results of the wave envelope code as input data. Comparison of measured and computed far-field noise levels show fair agreement in the range of directivity angles where the peak radiation lobes from the inlet are observed. Although only a single acoustic mode is targeted in this study, the main conclusion is a proof-of-concept: Navier-Stokes codes can be used both to generate and propagate rotor/stator acoustic modes forward through an engine, where the results can be coupled to other far-field noise prediction codes.

  15. Exosomal non-coding RNAs: a promising cancer biomarker.

    Science.gov (United States)

    Yang, Huan; Fu, Hailong; Xu, Wenrong; Zhang, Xu

    2016-12-01

    Novel and non-invasive biomarkers are urgently needed for early detection of cancer. Exosomes are nano-sized particles released by cells and contain various bioactive molecules including proteins, DNA, mRNAs, and non-coding RNAs. Increasing evidence suggests that exosomes play critical roles in tumorigenesis, tumor growth, metastasis, and therapy resistance. Exosomes could be readily accessible in nearly all the body fluids. The altered production of exosomes and aberrant expression of exosomal contents could reflect the pathological state of the body, indicating that exosomes and exosomal contents can be utilized as novel cancer biomarkers. Herein, we review the basic properties of exosomes, the functional roles of exosomes in cancer, and the methods of detecting exosomes and exosomal contents. In particular, we highlight the clinical values of exosomal non-coding RNAs in cancer diagnosis and prognosis.

  16. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming.

  17. Occupational self-coding and automatic recording (OSCAR): a novel web-based tool to collect and code lifetime job histories in large population-based studies.

    Science.gov (United States)

    De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul

    2017-03-01

    Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
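
    The agreement statistic quoted above, Cohen's kappa, corrects raw agreement for agreement expected by chance. A small Python sketch of the calculation is given below; the example code assignments are hypothetical and are not taken from the UK Biobank data.

        from collections import Counter

        def cohens_kappa(codes_a, codes_b):
            """Cohen's kappa for two raters assigning one categorical code per item."""
            assert len(codes_a) == len(codes_b)
            n = len(codes_a)
            po = sum(a == b for a, b in zip(codes_a, codes_b)) / n            # observed agreement
            freq_a, freq_b = Counter(codes_a), Counter(codes_b)
            categories = set(freq_a) | set(freq_b)
            pe = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)   # chance agreement
            return (po - pe) / (1 - pe)

        # Hypothetical 1-digit SOC codes from the automatic tool and from a manual coder
        automatic = ["2", "2", "5", "3", "9", "2", "5", "1"]
        manual    = ["2", "3", "5", "3", "9", "2", "4", "1"]
        print(f"kappa = {cohens_kappa(automatic, manual):.2f}")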

  18. The Code of the Street and Romantic Relationships: A dyadic analysis

    Science.gov (United States)

    Barr, Ashley B.; Simons, Ronald L.; Stewart, Eric A.

    2012-01-01

    Since its publication, Elijah Anderson’s (1999) code of the street thesis has found support in studies connecting disadvantage to the internalization of street-oriented values and an associated lifestyle of violent/deviant behavior. This primary emphasis on deviance in public arenas has precluded researchers from examining the implications of the code of the street for less public arenas, like intimate relationships. In an effort to understand if and how the endorsement of the street code may infiltrate such relationships, the present study examines the associations between the code of the street and relationship satisfaction and commitment among young adults involved in heterosexual romantic relationships. Using a dyadic approach, we find that street code orientation, in general, negatively predicts satisfaction and commitment, in part due to increased relationship hostility/conflict associated with the internalization of the code. Gender differences in these associations are considered and discussed at length. PMID:23504000

  19. A proto-code of ethics and conduct for European nurse directors.

    Science.gov (United States)

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  20. A Pragmatic Approach to the Application of the Code of Ethics in Nursing Education.

    Science.gov (United States)

    Tinnon, Elizabeth; Masters, Kathleen; Butts, Janie

    The code of ethics for nurses was written for nurses in all settings. However, the language focuses primarily on the nurse in context of the patient relationship, which may make it difficult for nurse educators to internalize the code to inform practice. The purpose of this article is to explore the code of ethics, establish that it can be used to guide nurse educators' practice, and provide a pragmatic approach to application of the provisions.

  1. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    OpenAIRE

    Tracy Waize; Sobanna Anandarajah; Neil Dhoul; Simon de Lusignan

    2007-01-01

    Background Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. Objective To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. Method General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultatio...

  2. De-randomizing Shannon: The Design and Analysis of a Capacity-Achieving Rateless Code

    CERN Document Server

    Balakrishnan, Hari; Perry, Jonathan; Shah, Devavrat

    2012-01-01

    This paper presents an analysis of spinal codes, a class of rateless codes proposed recently. We prove that spinal codes achieve Shannon capacity for the binary symmetric channel (BSC) and the additive white Gaussian noise (AWGN) channel with an efficient polynomial-time encoder and decoder. They are the first rateless codes with proofs of these properties for BSC and AWGN. The key idea in the spinal code is the sequential application of a hash function over the message bits. The sequential structure of the code turns out to be crucial for efficient decoding. Moreover, counter to the wisdom of having an expander structure in good codes, we show that the spinal code, despite its sequential structure, achieves capacity. The pseudo-randomness provided by a hash function suffices for this purpose. Our proof introduces a variant of Gallager's result characterizing the error exponent of random codes for any memoryless channel. We present a novel application of these error-exponent results within the framework of an...
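
    The "sequential application of a hash function over the message bits" can be sketched in a few lines of Python: each spine value hashes the previous spine value together with the next few message bits. The sketch below uses SHA-256 and a 4-bit chunk size purely for illustration; real spinal encoders use a lightweight hash and map each spine value, via a pseudo-random generator, to a stream of channel symbols, none of which is shown here.

        import hashlib

        def spine_values(message_bits, k=4):
            """Sequential hashing over k-bit message chunks: each spine value mixes
            the previous spine value with the next chunk (illustrative only)."""
            spine = b"\x00"
            spines = []
            for i in range(0, len(message_bits), k):
                chunk = "".join(str(b) for b in message_bits[i:i + k]).encode()
                spine = hashlib.sha256(spine + chunk).digest()
                spines.append(spine)
            return spines

        message = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
        for i, s in enumerate(spine_values(message)):
            print(f"spine {i}: {s.hex()[:16]}...")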

  3. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Son, Han Seong; Song, Deok Yong [ENESYS, Taejon (Korea, Republic of); Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of)

    2006-07-01

    An accident prevention system is essential to the industrial security of the nuclear industry. Thus, a more effective accident prevention system will help to promote a safety culture as well as to gain public acceptance of the nuclear power industry. The FADAS (Following Accident Dose Assessment System), which is a part of the Computerized Advisory System for a Radiological Emergency (CARE) system in KINS, is used for protection against nuclear accidents. In order to make the FADAS system more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of a coupled interface system between FADAS and a source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors.

  4. Implementation of a 3D mixing layer code on parallel computers

    Science.gov (United States)

    Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.

    1995-01-01

    This paper summarizes our progress and experience in the development of a Computational-Fluid-Dynamics code on parallel computers to simulate three-dimensional spatially-developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers. The code was then converted for use on parallel computers using the conventional message-passing technique, while we have not been able to compile the code with the present version of HPF compilers.

  5. Performance Analysis of a CDMA VSAT System With Convolutional and Reed-Solomon Coding

    Science.gov (United States)

    Yigit, Ugur

    2002-09-01

    The purpose of this thesis is to model a satellite communication system with VSATs, using Spread Spectrum CDMA methods and Forward Error Correction (FEC). Walsh codes and PN sequences are used to generate a CDMA system, and FEC is used to further improve the performance. Convolutional and block coding methods are examined and the results are obtained for each different case, including concatenated use of the codes. The performance of the system is given in terms of Bit Error Rate (BER). As observed from the results, the performance is mainly affected by the number of users and the code rates.
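
    The Walsh-code component of such a system can be illustrated with the standard Sylvester-Hadamard construction, as in the Python sketch below: the rows are mutually orthogonal spreading codes, so each user's BPSK symbol can be recovered by correlation. The PN scrambling, convolutional/Reed-Solomon coding and channel model of the thesis are not represented, and the code length and user symbols are arbitrary.

        import numpy as np

        def walsh_codes(order):
            """Length-2^order Walsh codes via the Sylvester-Hadamard construction;
            rows are mutually orthogonal +/-1 spreading sequences."""
            h = np.array([[1]])
            for _ in range(order):
                h = np.block([[h, h], [h, -h]])
            return h

        codes = walsh_codes(3)                  # eight length-8 codes
        print(codes @ codes.T)                  # 8 * identity: the codes are orthogonal

        # Spreading two users' BPSK symbols and recovering them by correlation
        symbols = np.array([+1, -1])            # user 0 sends +1, user 1 sends -1
        tx = symbols[0] * codes[0] + symbols[1] * codes[1]
        print("user 0:", int(np.sign(tx @ codes[0])), " user 1:", int(np.sign(tx @ codes[1])))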

  6. Ideas for Advancing Code Sharing: A Different Kind of Hack Day

    Science.gov (United States)

    Teuben, P.; Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Wallin, J. F.

    2014-05-01

    How do we as a community encourage the reuse of software for telescope operations, data processing, and ? How can we support making codes used in research available for others to examine? Continuing the discussion from last year's Bring out your codes! BoF session, participants separated into groups to brainstorm ideas to mitigate factors which inhibit code sharing and nurture those which encourage code sharing. The BoF concluded with the sharing of ideas that arose from the brainstorming sessions and a brief summary by the moderator.

  7. Development of a model and computer code to describe solar grade silicon production processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gould, R K; Srivastava, R

    1979-12-01

    Models and computer codes which may be used to describe flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides are described. A prominent example of the type of process which may be studied using the codes developed in this program is the SiCl4/Na reactor currently being developed by the Westinghouse Electric Corp. During this program two large computer codes were developed. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. This code, based on the AeroChem LAPP (Low Altitude Plume Program) code, can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail than can be afforded using CHEMPART. These two codes have been used in this program to predict particle formation characteristics and wall collection efficiencies for SiCl4/Na flow reactors. Results are described.

  8. Monomial-like codes

    CERN Document Server

    Martinez-Moro, Edgar; Ozbudak, Ferruh; Szabo, Steve

    2010-01-01

    As a generalization of cyclic codes of length p^s over F_{p^a}, we study n-dimensional cyclic codes of length p^{s_1} x ... x p^{s_n} over F_{p^a} generated by a single "monomial". Namely, we study multi-variable cyclic codes of the form ⟨(x_1 - 1)^{i_1} ... (x_n - 1)^{i_n}⟩ in F_{p^a}[x_1,...,x_n] / ⟨x_1^{p^{s_1}} - 1, ..., x_n^{p^{s_n}} - 1⟩. We call such codes monomial-like codes. We show that these codes arise from the product of certain single variable codes and we determine their minimum Hamming distance. We determine the dual of monomial-like codes, yielding a parity-check matrix. We also present an alternative way of constructing a parity-check matrix using the Hasse derivative. We study the weight hierarchy of certain monomial-like codes. We simplify an expression that gives us the weight hierarchy of these codes.

  9. LAVENDER: A steady-state core analysis code for design studies of accelerator driven subcritical reactors

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shengcheng; Wu, Hongchun; Cao, Liangzhi; Zheng, Youqi, E-mail: yqzheng@mail.xjtu.edu.cn; Huang, Kai; He, Mingtao; Li, Xunzhao

    2014-10-15

    Highlights: • A new code system for design studies of accelerator driven subcritical reactors (ADSRs) is developed. • An S_N transport solver in triangular-z meshes, fine depletion analysis and multi-channel thermal-hydraulics analysis are coupled in the code. • Numerical results indicate that the code is reliable and efficient for design studies of ADSRs. - Abstract: Accelerator driven subcritical reactors (ADSRs) have been proposed and widely investigated for the transmutation of transuranics (TRUs). ADSRs have several special characteristics, such as the subcritical core driven by spallation neutrons, anisotropic neutron flux distribution and complex geometry, etc. These bring up requirements for the development or extension of analysis codes to perform design studies. A code system named LAVENDER has been developed in this paper. It couples the modules for spallation target simulation and subcritical core analysis. The neutron transport-depletion calculation scheme is used, based on homogenized cross sections from assembly calculations. A three-dimensional S_N nodal transport code based on triangular-z meshes is employed and a multi-channel thermal-hydraulics analysis model is integrated. In the depletion calculation, the evolution of the isotopic composition in the core is evaluated using the transmutation trajectory analysis (TTA) algorithm and fine depletion chains. The new code is verified by several benchmarks and code-to-code comparisons. Numerical results indicate that LAVENDER is reliable and efficient to be applied for the steady-state analysis and reactor core design of ADSRs.
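
    To illustrate the transmutation trajectory idea mentioned above (and only the idea; this is not the LAVENDER implementation), the sketch below evaluates the analytical Bateman solution for one linearized chain with distinct removal constants; the chain length and constants are invented for the example.

      import numpy as np

      def bateman_chain(n0, lambdas, t):
          """End-of-step atom counts for a linear chain A1 -> A2 -> ... -> Ak,
          starting from n0 atoms of A1, with distinct removal constants lambdas[i] (1/s)."""
          lam = np.asarray(lambdas, dtype=float)
          n = np.zeros(lam.size)
          for k in range(lam.size):
              coeff = np.prod(lam[:k])          # product of production rates along the trajectory
              total = 0.0
              for j in range(k + 1):
                  denom = np.prod(np.delete(lam[:k + 1], j) - lam[j])
                  total += np.exp(-lam[j] * t) / denom
              n[k] = n0 * coeff * total
          return n

      # Hypothetical 3-member chain over a 30-day depletion step.
      print(bateman_chain(1.0e20, [1e-6, 5e-7, 2e-8], t=30 * 86400.0))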

  10. Program WALKMAN: A code designed to perform electron single collision elastic scattering Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, D.E.

    1994-08-01

    The computer code WALKMAN performs electron single collision elastic scattering Monte Carlo calculations in spherical or planar geometry. It is intended as a research tool to obtain results that can be compared to the results of condensed history calculations. This code is designed to be self documenting, in the sense that the latest documentation is included as comment lines at the beginning of the code. Printed documentation, such as this document, is periodically published and consists mostly of a copy of the comment lines from the code. The user should be aware that the comment lines within the code are continually updated to reflect the most recent status of the code and these comments should always be considered to be the most recent documentation for the code and may supersede published documentation, such as this document. Therefore, the user is advised to always read the documentation within the actual code. The remainder of this report consists of example results and a listing of the documentation which appears at the beginning of the code.
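
    As a flavour of what an analog single-collision random walk looks like, here is a minimal sketch with a made-up constant mean free path and an isotropic scattering law in place of a real elastic cross section; it is not the WALKMAN algorithm itself.

      import numpy as np

      rng = np.random.default_rng(1)
      RADIUS, MFP, N = 1.0, 0.2, 10_000     # sphere radius, mean free path, histories (arbitrary units)

      def iso_dir():
          """Sample an isotropic unit direction vector."""
          mu, phi = 2 * rng.random() - 1, 2 * np.pi * rng.random()
          s = np.sqrt(1 - mu * mu)
          return np.array([s * np.cos(phi), s * np.sin(phi), mu])

      collisions = 0
      for _ in range(N):                    # one history = one electron started at the centre
          pos, dirn = np.zeros(3), iso_dir()
          while True:
              pos = pos + dirn * rng.exponential(MFP)    # free flight to the next collision site
              if pos @ pos > RADIUS**2:                  # left the sphere (elastic only, so all escape)
                  break
              collisions += 1
              dirn = iso_dir()                           # toy "elastic" scatter: isotropic angular law

      print(f"mean collisions before escape: {collisions / N:.2f}")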

  11. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    Science.gov (United States)

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  12. TEMP: a computer code to calculate fuel pin temperatures during a transient. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Bard, F E; Christensen, B Y; Gneiting, B C

    1980-04-01

    The computer code TEMP calculates fuel pin temperatures during a transient. It was developed to accommodate temperature calculations in any system of axi-symmetric concentric cylinders. When used to calculate fuel pin temperatures, the code will handle a fuel pin as simple as a solid cylinder or as complex as a central void surrounded by fuel that is broken into three regions by two circumferential cracks. Any fuel situation between these two extremes can be analyzed along with additional cladding, heat sink, coolant or capsule regions surrounding the fuel. The one-region version of the code accurately calculates the solution to two problems having closed-form solutions. The code uses an implicit method, an explicit method and a Crank-Nicolson (implicit-explicit) method.
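
    For readers unfamiliar with the implicit-explicit blend mentioned last, the sketch below applies Crank-Nicolson time stepping to a one-dimensional slab conduction problem with fixed surface temperatures; the geometry, properties and boundary conditions are placeholders and are far simpler than the concentric-cylinder treatment in TEMP.

      import numpy as np

      # Toy 1D heat equation dT/dt = alpha * d2T/dx2 on a slab, Crank-Nicolson in time.
      nx, alpha, dx, dt, steps = 21, 1e-5, 1e-3, 0.05, 200   # placeholder values

      r = alpha * dt / (2 * dx**2)
      # Interior Laplacian as a tridiagonal matrix; end rows hold the prescribed temperatures.
      L = np.diag(-2 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1) + np.diag(np.ones(nx - 1), -1)
      L[0, :] = L[-1, :] = 0.0
      A = np.eye(nx) - r * L                       # implicit half of the update
      B = np.eye(nx) + r * L                       # explicit half of the update

      T = np.full(nx, 300.0)                       # initial temperature (K)
      T[0], T[-1] = 300.0, 600.0                   # fixed surface temperatures
      for _ in range(steps):
          T = np.linalg.solve(A, B @ T)            # one Crank-Nicolson step

      print(T.round(1))                            # approaches the linear steady-state profile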

  13. Initial Results from a Cartesian Three-Dimensional Parabolic Equation Acoustical Propagation Code

    Science.gov (United States)

    2006-12-01

    The code has been run successfully. It is written in the MATLAB language and runs in the MATLAB environment, and has been implemented in two versions. It originates from 3D propagation codes used for optical studies [3,4]; a trial MATLAB version was written by John Colosi, from which this code was derived. [Figure 6 caption (plot data not recoverable): the field resulting from propagation through internal waves of 10-m amplitude.]

  14. Cat Codes with Optimal Decoherence Suppression for a Lossy Bosonic Channel

    Science.gov (United States)

    Li, Linshu; Zou, Chang-Ling; Albert, Victor V.; Muralidharan, Sreraman; Girvin, S. M.; Jiang, Liang

    2017-07-01

    We investigate cat codes that can correct multiple excitation losses and identify two types of logical errors: bit-flip errors due to excessive excitation loss and dephasing errors due to quantum backaction from the environment. We show that selected choices of logical subspace and coherent amplitude significantly reduce dephasing errors. The trade-off between the two major errors enables optimized performance of cat codes in terms of minimized decoherence. With high coupling efficiency, we show that one-way quantum repeaters with cat codes feature a boosted secure communication rate per mode when compared to conventional encoding schemes, showcasing the promising potential of quantum information processing with continuous variable quantum codes.

  15. The TOUGH codes - a family of simulation tools for multiphase flowand transport processes in permeable media

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, Karsten

    2003-08-08

    Numerical simulation has become a widely practiced and accepted technique for studying flow and transport processes in the vadose zone and other subsurface flow systems. This article discusses a suite of codes, developed primarily at Lawrence Berkeley National Laboratory (LBNL), with the capability to model multiphase flows with phase change. We summarize history and goals in the development of the TOUGH codes, and present the governing equations for multiphase, multicomponent flow. Special emphasis is given to space discretization by means of integral finite differences (IFD). Issues of code implementation and architecture are addressed, as well as code applications, maintenance, and future developments.
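
    As background on the integral finite difference discretization mentioned above (the generic form of the method, not a quotation from the article), the balance of a mass or energy component over a grid block n of volume V_n reads d/dt ∫_{V_n} M dV = ∫_{Γ_n} F · n dΓ + ∫_{V_n} q dV, which the IFD scheme turns into dM_n/dt = (1/V_n) Σ_m A_{nm} F_{nm} + q_n, with A_{nm} the interface area shared with neighbour block m and F_{nm} the flux across it. Because only block volumes, interface areas and nodal distances enter, the same discretization applies to regular and fully unstructured grids.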

  16. PEBBLES: A COMPUTER CODE FOR MODELING PACKING, FLOW AND RECIRCULATION OF PEBBLES IN A PEBBLE BED REACTOR

    Energy Technology Data Exchange (ETDEWEB)

    Joshua J. Cogliati; Abderrafi M. Ougouag

    2006-10-01

    A comprehensive, high fidelity model for pebble flow has been developed and embodied in the PEBBLES computer code. In this paper, a description of the physical artifacts included in the model is presented and some results from using the computer code for predicting the features of pebble flow and packing in a realistic pebble bed reactor design are shown. The sensitivity of models to various physical parameters is also discussed.

  17. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance in documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting and implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before the intervention (FY2012) with prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management (E&M) codes as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05. In summary, we implemented a series of educational and workflow interventions that improved resident coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher-complexity E&M coding in a stable patient population, based on improved documentation and billing awareness by the residents.

  18. An upper bound on the number of errors corrected by a convolutional code

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2000-01-01

    The number of errors that a convolutional code can correct in a segment of the encoded sequence is upper bounded by the number of distinct syndrome sequences of the relevant length.

  19. A HARDWARE IMPLEMENTATION OF PUNCTURED CONVOLUTIONAL CODES TO COMPLETE A VITERBI DECODER CORE

    OpenAIRE

    E. García; Torres, D.; Guzmán, M.

    2005-01-01

    This paper presents a VLSI (Very Large Scale Integration) implementation of high-rate punctured convolutional codes. We present a new circuit architecture that is capable of processing up to 10 convolutional codes of rate (n-1)/n with constraint length 7, derived by the puncturing technique from the basic rate-1/2 code. The present circuit was designed in order to complete an existing Viterbi decoder core, adding some extra functionality such as a convolutional encoder, differential encoder/decoder, punctu...
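
    To make the puncturing step concrete, here is a minimal sketch, independent of the paper's circuit, that deletes coded bits from a rate-1/2 stream according to a puncturing pattern to obtain rate 2/3; the pattern shown is illustrative.

      # Puncturing: periodically delete coded bits from a rate-1/2 stream to raise the code rate.
      # Pattern rows correspond to the two encoder outputs, columns to the puncturing period.
      PATTERN = [[1, 1],   # keep both bits of output 0
                 [1, 0]]   # keep only the first bit of output 1 -> 3 bits kept per 2 info bits (rate 2/3)

      def puncture(c0, c1, pattern):
          """Interleave two coded streams, dropping positions where the pattern has a 0."""
          out = []
          for i, (a, b) in enumerate(zip(c0, c1)):
              col = i % len(pattern[0])
              if pattern[0][col]:
                  out.append(a)
              if pattern[1][col]:
                  out.append(b)
          return out

      # Toy coded streams (as if produced by a rate-1/2 convolutional encoder for 6 information bits).
      c0 = [1, 0, 1, 1, 0, 0]
      c1 = [0, 1, 1, 0, 1, 1]
      print(puncture(c0, c1, PATTERN))   # 9 bits sent for 6 information bits -> rate 6/9 = 2/3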

  20. Code-Switching and Competition: An Examination of a Situational Response

    Science.gov (United States)

    Bernstein, Eve; Herman, Ariela

    2014-01-01

    Code switching is primarily a linguistic term that refers to the use of two or more languages within the same conversation, or same sentence, to convey a single message. One field of linguistics, sociocultural linguistics, is broad and interdisciplinary, a mixture of language, culture, and society. In sociocultural linguistics, the code, or…

  1. A Novel Code System for Revealing Sources of Students' Difficulties with Stoichiometry

    Science.gov (United States)

    Gulacar, Ozcan; Overton, Tina L.; Bowman, Charles R.; Fynewever, Herb

    2013-01-01

    A coding scheme is presented and used to evaluate solutions of seventeen students working on twenty five stoichiometry problems in a think-aloud protocol. The stoichiometry problems are evaluated as a series of sub-problems (e.g., empirical formulas, mass percent, or balancing chemical equations), and the coding scheme was used to categorize each…

  2. A four-column theory for the origin of the genetic code: tracing the evolutionary pathways that gave rise to an optimized code

    Directory of Open Access Journals (Sweden)

    Higgs Paul G

    2009-04-01

    Full Text Available Abstract Background The arrangement of the amino acids in the genetic code is such that neighbouring codons are assigned to amino acids with similar physical properties. Hence, the effects of translational error are minimized with respect to randomly reshuffled codes. Further inspection reveals that it is amino acids in the same column of the code (i.e. same second base) that are similar, whereas those in the same row show no particular similarity. We propose a 'four-column' theory for the origin of the code that explains how the action of selection during the build-up of the code leads to a final code that has the observed properties. Results The theory makes the following propositions. (i) The earliest amino acids in the code were those that are easiest to synthesize non-biologically, namely Gly, Ala, Asp, Glu and Val. (ii) These amino acids are assigned to codons with G at first position. Therefore the first code may have used only these codons. (iii) The code rapidly developed into a four-column code where all codons in the same column coded for the same amino acid: NUN = Val, NCN = Ala, NAN = Asp and/or Glu, and NGN = Gly. (iv) Later amino acids were added sequentially to the code by a process of subdivision of codon blocks in which a subset of the codons assigned to an early amino acid were reassigned to a later amino acid. (v) Later amino acids were added into positions formerly occupied by amino acids with similar properties because this can occur with minimal disruption to the proteins already encoded by the earlier code. As a result, the properties of the amino acids in the final code retain a four-column pattern that is a relic of the earliest stages of code evolution. Conclusion The driving force during this process is not the minimization of translational error, but positive selection for the increased diversity and functionality of the proteins that can be made with a larger amino acid alphabet. Nevertheless, the code that results is one

  3. Development of a thermal-hydraulic code for reflood analysis in a PWR experimental loop

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Sabrina P.; Mesquita, Amir Z.; Rezende, Hugo C., E-mail: sabrinapral@gmail.com, E-mail: amir@cdtn.brm, E-mail: hcr@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Palma, Daniel A.P., E-mail: dapalma@cnen.gov.br [Comissão Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    A process of fundamental importance in the event of a Loss of Coolant Accident (LOCA) in Pressurized Water Reactors (PWR) is the reflood of the core, or rewetting of the nuclear fuel. The Nuclear Technology Development Center (CDTN) has been developing programs since the 1970s to allow Brazil to become independent in the field of reactor safety analysis. To that end, a Rewetting Test Facility (ITR, in its Portuguese acronym) was designed, assembled and commissioned in the 1980s. This facility aims to investigate the thermal-hydraulic phenomena involved in the reflood phase of a Loss of Coolant Accident in a PWR nuclear reactor. The aim of this work is the analysis of the physical and mathematical models governing the rewetting phenomenon, and the development of a thermal-hydraulic simulation code for an experimental circuit representative of the core cooling channels of PWR reactors. A code called REWET was elaborated and developed. The results obtained with REWET were compared with the experimental results from the ITR and with the results of the Hydroflut code, the program previously used. The evolution of the wall temperature of the test section and of the rewetting front was analysed for two typical tests, using both codes and the experimental results. The rewetting time simulated by the REWET code was also closer to the experimental results than that calculated by the Hydroflut code. (author)

  4. ITP.FOR: A code to calculate thermal transients in High Level Waste Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Kielpinski, A.L.

    1992-10-01

    A variety of processing operations for high level radioactive waste occur in the High Level Waste Tanks in the H-Area of the Savannah River Site. Thermal design constraints exist on these processes, principally to limit the amount of corrosion inhibitor which must be added to protect the tank and cooling coil materials. The required amount of corrosion inhibitor, which must subsequently be removed prior to trapping the waste in borosilicate glass, increases exponentially with temperature over a fairly narrow range (some tens of degrees Celsius). For this reason, there is a need to model the thermal-hydraulic processes occurring in the waste tanks. A FORTRAN computer code, called ITP.FOR, was written to provide a simple but reasonably accurate analysis tool for plant operation design. The code was specifically written to model Tank 48, in which the In-Tank Precipitation (ITP) process of precipitating radioactive cesium will be initiated. Although the ITP.FOR code was written as personal-use software for scoping design calculations for Tank 48, the current intent is to extend the code's applicability to other H-Area waste tanks, and to certify the code in accordance with the NRTSC Quality Assurance requirements for critical-use software (1Q-34, 1991). Since the code's capabilities have generated some interest to date, the present report is presented as interim documentation of the code's mathematical models. This documentation will eventually be supplanted by the formal documentation of the expanded and benchmarked code.

  5. New Class of Quantum Error-Correcting Codes for a Bosonic Mode

    Directory of Open Access Journals (Sweden)

    Marios H. Michael

    2016-07-01

    Full Text Available We construct a new class of quantum error-correcting codes for a bosonic mode, which are advantageous for applications in quantum memories, communication, and scalable computation. These “binomial quantum codes” are formed from a finite superposition of Fock states weighted with binomial coefficients. The binomial codes can exactly correct errors that are polynomial up to a specific degree in bosonic creation and annihilation operators, including amplitude damping and displacement noise as well as boson addition and dephasing errors. For realistic continuous-time dissipative evolution, the codes can perform approximate quantum error correction to any given order in the time step between error detection measurements. We present an explicit approximate quantum error recovery operation based on projective measurements and unitary operations. The binomial codes are tailored for detecting boson loss and gain errors by means of measurements of the generalized number parity. We discuss optimization of the binomial codes and demonstrate that by relaxing the parity structure, codes with even lower unrecoverable error rates can be achieved. The binomial codes are related to existing two-mode bosonic codes, but offer the advantage of requiring only a single bosonic mode to correct amplitude damping as well as the ability to correct other errors. Our codes are similar in spirit to “cat codes” based on superpositions of the coherent states but offer several advantages such as smaller mean boson number, exact rather than approximate orthonormality of the code words, and an explicit unitary operation for repumping energy into the bosonic mode. The binomial quantum codes are realizable with current superconducting circuit technology, and they should prove useful in other quantum technologies, including bosonic quantum memories, photonic quantum communication, and optical-to-microwave up- and down-conversion.
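
    As a concrete anchor, the lowest-order member of this family that protects against a single boson loss is often written out as follows (reproduced here from the general binomial-coefficient recipe rather than from the article's wording): the code words |0_L> = (|0> + |4>)/sqrt(2) and |1_L> = |2> share the same mean boson number (two) and even generalized number parity, while a single loss sends them to the odd-parity states |3> and |1>; a parity measurement therefore flags the loss without revealing the logical state, which is what makes recovery possible.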

  6. A New Phenomenon in Saudi Females’ Code-switching: A Morphemic Analysis

    Directory of Open Access Journals (Sweden)

    Mona O. Turjoman

    2016-12-01

    Full Text Available This sociolinguistic study investigates a new phenomenon that has recently surfaced in the field of code-switching among Saudi females residing in the Western region of Saudi Arabia. This phenomenon basically combines bound Arabic pronouns, tense markers or the definite article with English free morphemes, or bound English affixes with Arabic morphemes. Moreover, the study examines the factors that affect this type of code-switching. The results of the study indicate that this phenomenon provides data that invalidate Poplack's (1980) universality of the 'Free Morpheme Constraint'. It is also concluded that the main factors that influence this type of code-switching are solidarity and group identity, among other factors. Keywords: Code-switching, Saudi females, sociolinguistics, CS factors, morphemic analysis

  7. A hybrid N-body code incorporating algorithmic regularization and post-Newtonian forces

    NARCIS (Netherlands)

    Harfst, S.; Gualandris, A.; Merritt, D.; Mikkola, S.

    2008-01-01

    We describe a novel N-body code designed for simulations of the central regions of galaxies containing massive black holes. The code incorporates Mikkola's 'algorithmic' chain regularization scheme including post-Newtonian terms up to PN2.5 order. Stars moving beyond the chain are advanced using a

  8. Minimizing The Completion Time Of A Wireless Cooperative Network Using Network Coding

    DEFF Research Database (Denmark)

    Roetter, Daniel Enrique Lucani; Khamfroush, Hana; Barros, João

    2013-01-01

    In terms of completion time, the network coding solution outperforms broadcasting with network coding by a factor of 2.13 and forwarding mechanisms by a factor of 6.1. Beyond computing the optimal completion time, we identify the critical decision policies derived from the MDP solution.

  9. Implementation of the critical points model in a SFM-FDTD code working in oblique incidence

    Energy Technology Data Exchange (ETDEWEB)

    Hamidi, M; Belkhir, A; Lamrous, O [Laboratoire de Physique et Chimie Quantique, Universite Mouloud Mammeri, Tizi-Ouzou (Algeria); Baida, F I, E-mail: omarlamrous@mail.ummto.dz [Departement d' Optique P.M. Duffieux, Institut FEMTO-ST UMR 6174 CNRS Universite de Franche-Comte, 25030 Besancon Cedex (France)

    2011-06-22

    We describe the implementation of the critical points model in a finite-difference-time-domain code working in oblique incidence and dealing with dispersive media through the split field method. Some tests are presented to validate our code in addition to an application devoted to plasmon resonance of a gold nanoparticles grating.

  10. A Construction of Multisender Authentication Codes with Sequential Model from Symplectic Geometry over Finite Fields

    Directory of Open Access Journals (Sweden)

    Shangdi Chen

    2014-01-01

    Full Text Available Multisender authentication codes allow a group of senders to construct an authenticated message for a receiver such that the receiver can verify authenticity of the received message. In this paper, we construct multisender authentication codes with sequential model from symplectic geometry over finite fields, and the parameters and the maximum probabilities of deceptions are also calculated.

  11. WYSIWIB: A Declarative Approach to Finding Protocols and Bugs in Linux Code

    DEFF Research Database (Denmark)

    Lawall, Julia Laetitia; Brunel, Julien Pierre Manuel; Hansen, Rene Rydhof

    2008-01-01

    Although a number of approaches to finding bugs in systems code have been proposed, bugs still remain to be found. Current approaches have emphasized scalability more than usability, and as a result it is difficult to relate the results to particular patterns found in the source code and to contr...

  12. "BLAST": A compilation of codes for the numerical simulation of the gas dynamics of explosions

    NARCIS (Netherlands)

    Berg, A.C. van den

    2009-01-01

    The availability of powerful computers these days increasingly enables the use of CFD for the numerical simulation of explosion phenomena. The BLAST software consists of a compilation of codes for the numerical simulation of the gas dynamics of explosions. Each individual code has been tailored to a

  13. WYSIWYB: A Declarative Approach to Finding API Protocols and Bugs in Linux Code

    DEFF Research Database (Denmark)

    Lawall, Julia; Palix, Nicolas

    2009-01-01

    Although a number of approaches to finding bugs in systems code have been proposed, bugs still remain to be found. Current approaches have emphasized scalability more than usability, and as a result it is difficult to relate the results to particular patterns found in the source code and to contr...

  14. A fast code for channel limb radiances with gas absorption and scattering in a spherical atmosphere

    Science.gov (United States)

    Eluszkiewicz, Janusz; Uymin, Gennady; Flittner, David; Cady-Pereira, Karen; Mlawer, Eli; Henderson, John; Moncet, Jean-Luc; Nehrkorn, Thomas; Wolff, Michael

    2017-05-01

    We present a radiative transfer code capable of accurately and rapidly computing channel limb radiances in the presence of gaseous absorption and scattering in a spherical atmosphere. The code has been prototyped for the Mars Climate Sounder measuring limb radiances in the thermal part of the spectrum (200-900 cm-1) where absorption by carbon dioxide and water vapor and absorption and scattering by dust and water ice particles are important. The code relies on three main components: 1) The Gauss Seidel Spherical Radiative Transfer Model (GSSRTM) for scattering, 2) The Planetary Line-By-Line Radiative Transfer Model (P-LBLRTM) for gas opacity, and 3) The Optimal Spectral Sampling (OSS) for selecting a limited number of spectral points to simulate channel radiances and thus achieving a substantial increase in speed. The accuracy of the code has been evaluated against brute-force line-by-line calculations performed on the NASA Pleiades supercomputer, with satisfactory results. Additional improvements in both accuracy and speed are attainable through incremental changes to the basic approach presented in this paper, which would further support the use of this code for real-time retrievals and data assimilation. Both newly developed codes, GSSRTM/OSS for MCS and P-LBLRTM, are available for additional testing and user feedback.

  15. ICF-CY code set for infants with early delay and disabilities (EDD Code Set) for interdisciplinary assessment: a global experts survey.

    Science.gov (United States)

    Pan, Yi-Ling; Hwang, Ai-Wen; Simeonsson, Rune J; Lu, Lu; Liao, Hua-Fang

    2015-01-01

    Comprehensive description of functioning is important in providing early intervention services for infants with developmental delay/disabilities (DD). A code set of the International Classification of Functioning, Disability and Health: Children and Youth Version (ICF-CY) could facilitate the practical use of the ICF-CY in team evaluation. The purpose of this study was to derive an ICF-CY code set for infants under three years of age with early delay and disabilities (EDD Code Set) for initial team evaluation. The EDD Code Set based on the ICF-CY was developed on the basis of a Delphi survey of international professionals experienced in implementing the ICF-CY and professionals in early intervention service system in Taiwan. Twenty-five professionals completed the Delphi survey. A total of 82 ICF-CY second-level categories were identified for the EDD Code Set, including 28 categories from the domain Activities and Participation, 29 from body functions, 10 from body structures and 15 from environmental factors. The EDD Code Set of 82 ICF-CY categories could be useful in multidisciplinary team evaluations to describe functioning of infants younger than three years of age with DD, in a holistic manner. Future validation of the EDD Code Set and examination of its clinical utility are needed. The EDD Code Set with 82 essential ICF-CY categories could be useful in the initial team evaluation as a common language to describe functioning of infants less than three years of age with developmental delay/disabilities, with a more holistic view. The EDD Code Set including essential categories in activities and participation, body functions, body structures and environmental factors could be used to create a functional profile for each infant with special needs and to clarify the interaction of child and environment accounting for the child's functioning.

  16. Measuring the implementation of codes of conduct. An assessment method based on a process approach of the responsible organisation

    NARCIS (Netherlands)

    Nijhof, A.H.J.; Cludts, Stephan; Fisscher, O.A.M.; Laan, Albertus

    2003-01-01

    More and more organisations formulate a code of conduct in order to stimulate responsible behaviour among their members. Much time and energy is usually spent fixing the content of the code but many organisations get stuck in the challenge of implementing and maintaining the code. The code then

  17. The Development of a Portable Hard Disk Encryption/Decryption System with a MEMS Coded Lock

    Directory of Open Access Journals (Sweden)

    Shengyong Li

    2009-11-01

    Full Text Available In this paper, a novel portable hard-disk encryption/decryption system with a MEMS coded lock is presented, which can authenticate the user and provide the key for the AES encryption/decryption module. The portable hard-disk encryption/decryption system is composed of the authentication module, the USB portable hard-disk interface card, the ATA protocol command decoder module, the data encryption/decryption module, the cipher key management module, the MEMS coded lock controlling circuit module, the MEMS coded lock and the hard disk. The ATA protocol circuit, the MEMS control circuit and AES encryption/decryption circuit are designed and realized by FPGA(Field Programmable Gate Array. The MEMS coded lock with two couplers and two groups of counter-meshing-gears (CMGs are fabricated by a LIGA-like process and precision engineering method. The whole prototype was fabricated and tested. The test results show that the user’s password could be correctly discriminated by the MEMS coded lock, and the AES encryption module could get the key from the MEMS coded lock. Moreover, the data in the hard-disk could be encrypted or decrypted, and the read-write speed of the dataflow could reach 17 MB/s in Ultra DMA mode.

  18. The Development of a Portable Hard Disk Encryption/Decryption System with a MEMS Coded Lock.

    Science.gov (United States)

    Zhang, Weiping; Chen, Wenyuan; Tang, Jian; Xu, Peng; Li, Yibin; Li, Shengyong

    2009-01-01

    In this paper, a novel portable hard-disk encryption/decryption system with a MEMS coded lock is presented, which can authenticate the user and provide the key for the AES encryption/decryption module. The portable hard-disk encryption/decryption system is composed of the authentication module, the USB portable hard-disk interface card, the ATA protocol command decoder module, the data encryption/decryption module, the cipher key management module, the MEMS coded lock controlling circuit module, the MEMS coded lock and the hard disk. The ATA protocol circuit, the MEMS control circuit and AES encryption/decryption circuit are designed and realized by FPGA(Field Programmable Gate Array). The MEMS coded lock with two couplers and two groups of counter-meshing-gears (CMGs) are fabricated by a LIGA-like process and precision engineering method. The whole prototype was fabricated and tested. The test results show that the user's password could be correctly discriminated by the MEMS coded lock, and the AES encryption module could get the key from the MEMS coded lock. Moreover, the data in the hard-disk could be encrypted or decrypted, and the read-write speed of the dataflow could reach 17 MB/s in Ultra DMA mode.

  19. DEMOCRITUS code: A kinetic approach to the simulation of complex plasmas

    Science.gov (United States)

    Arinaminpat, Nimlan; Fichtl, Chris; Patacchini, Leonardo; Lapenta, Giovanni; Delzanno, Gian Luca

    2006-10-01

    The DEMOCRITUS code is a particle-based code for plasma-material interaction simulation. The code makes use of particle in cell (PIC) methods to simulate each plasma species, the material, and their interaction. In this study, we concentrate on a dust particle immersed in a plasma. We start with the simplest case, in which the dust particle is not allowed to emit. From here, we expand the DEMOCRITUS code to include thermionic and photo emission algorithms and obtain our data. Next we expand the physics processes present to include the presence of magnetic fields and collisional processes with a neutral gas. Finally we describe new improvements of the code including a new mover that allows for particle subcycling and a new grid adaptation approach.
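
    Since the abstract mentions a new particle mover with subcycling, it may help to show what a basic PIC-style mover looks like. The sketch below is the standard Boris rotation for a single particle in uniform E and B fields, not the DEMOCRITUS mover, and all numbers are arbitrary.

      import numpy as np

      def boris_push(v, E, B, q_over_m, dt):
          """Advance a particle velocity by one time step with the Boris scheme."""
          v_minus = v + 0.5 * q_over_m * dt * E          # first half electric kick
          t = 0.5 * q_over_m * dt * B                    # rotation vector
          s = 2 * t / (1 + t @ t)
          v_prime = v_minus + np.cross(v_minus, t)
          v_plus = v_minus + np.cross(v_prime, s)        # magnetic rotation
          return v_plus + 0.5 * q_over_m * dt * E        # second half electric kick

      # Arbitrary illustrative values (SI-like units).
      v = np.array([1.0e5, 0.0, 0.0])
      E = np.array([0.0, 0.0, 0.0])
      B = np.array([0.0, 0.0, 0.01])
      x, dt = np.zeros(3), 1.0e-9
      for _ in range(1000):                              # leapfrog: kick the velocity, then drift
          v = boris_push(v, E, B, q_over_m=-1.76e11, dt=dt)   # electron charge-to-mass ratio
          x = x + v * dt
      print(x, np.linalg.norm(v))                        # |v| is conserved in a pure magnetic field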

  20. A toolchain for the automatic generation of computer codes for correlated wavefunction calculations.

    Science.gov (United States)

    Krupička, Martin; Sivalingam, Kantharuban; Huntington, Lee; Auer, Alexander A; Neese, Frank

    2017-06-05

    In this work, the automated generator environment for ORCA (ORCA-AGE) is described. It is a powerful toolchain for the automatic implementation of wavefunction-based quantum chemical methods. ORCA-AGE consists of three main modules: (1) generation of "raw" equations from a second quantized Ansatz for the wavefunction, (2) factorization and optimization of equations, and (3) generation of actual computer code. We generate code for the ORCA package, making use of the powerful functionality for wavefunction-based correlation calculations that is already present in the code. The equation generation makes use of the most elementary commutation relations and hence is extremely general. Consequently, code can be generated for single reference as well as multireference approaches and spin-independent as well as spin-dependent operators. The performance of the generated code is demonstrated through comparison with efficient hand-optimized code for some well-understood standard configuration interaction and coupled cluster methods. In general, the speed of the generated code is no more than 30% slower than the hand-optimized code, thus allowing for routine application of canonical ab initio methods to molecules with about 500-1000 basis functions. Using the toolchain, complicated methods, especially those surpassing human ability for handling complexity, can be efficiently and reliably implemented in very short times. This enables the developer to shift the attention from debugging code to the physical content of the chosen wavefunction Ansatz. Automatic code generation also has the desirable property that any improvement in the toolchain immediately applies to all generated code. © 2017 Wiley Periodicals, Inc.

  1. A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage Scheme

    Directory of Open Access Journals (Sweden)

    Winch Peter J

    2011-04-01

    Full Text Available Abstract Background In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, and to describe the roles of key hospital staff and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Methods Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Results Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: (1) Discharge Summarization, (2) Completeness Checking, (3) Diagnosis and Procedure Coding, (4) Code Checking, (5) Relative Weight Challenging, (6) Coding Report, and (7) Internal Audit. The hospital coding practice can be affected by at least five main factors: (1) Internal Dynamics, (2) Management Context, (3) Financial Dependency, (4) Resource and Capacity, and (5) External Factors. Conclusions Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors.

  2. A Design Method of Four-phase-coded OFDM Radar Signal Based on Bernoulli Chaos

    Directory of Open Access Journals (Sweden)

    Huo Kai

    2016-08-01

    Full Text Available Orthogonal Frequency-Division Multiplexing (OFDM) radar is receiving increasing attention in the radar field in recent years and is showing excellent performance. However, for practical applications, there are several problems with phase-coded OFDM radar, such as the existence of few good codes, limited length capability, and a high Peak-to-Mean-Envelope Power Ratio (PMEPR). To address those problems, in this paper, we propose a design method for a four-phase-coded OFDM radar signal based on Bernoulli chaos, which can construct codes of arbitrary amounts and lengths and demonstrate more agility and flexibility. By adopting original phase weighting, this method can obtain a chaotic four-phase-coded OFDM signal with a PMEPR less than two. This signal has excellent performance with respect to high resolution and Doppler radar application.
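
    To illustrate the route from a chaotic sequence to a four-phase OFDM code and its PMEPR, here is a minimal sketch; the Bernoulli shift map, the phase quantization and all lengths are illustrative choices rather than the scheme of the paper.

      import numpy as np

      def bernoulli_sequence(x0, n):
          # Bernoulli shift map x -> 2x mod 1; note that in double precision it collapses to 0
          # after ~50 iterations, so keep n small (or detune the factor) in this toy demo.
          x, out = x0, []
          for _ in range(n):
              x = (2.0 * x) % 1.0
              out.append(x)
          return np.array(out)

      N = 32                                       # number of OFDM subcarriers (illustrative)
      chaos = bernoulli_sequence(0.137, N)
      phases = np.floor(chaos * 4) * (np.pi / 2)   # quantize to the four phases {0, pi/2, pi, 3pi/2}
      weights = np.exp(1j * phases)                # four-phase code across the subcarriers

      symbol = np.fft.ifft(weights) * np.sqrt(N)   # time-domain OFDM symbol, unit average power
      pmepr = np.max(np.abs(symbol) ** 2) / np.mean(np.abs(symbol) ** 2)
      print(f"PMEPR = {pmepr:.2f} ({10 * np.log10(pmepr):.2f} dB)")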

  3. How to write application code even a security auditor could love

    Energy Technology Data Exchange (ETDEWEB)

    Barlich, G.L.

    1989-01-01

    In the past the application programmer was frequently isolated from the computer security professional. The target machine might have various access controls and security plans, but when the programmer delivered a new application, it was rarely scrutinized from a security standpoint. Security reviews of application code are now being used to overcome this apparent oversight, but these reviews are often hampered by a lack of knowledge among programmers of techniques that make code secure and facilitate security analysis of the code. This paper informally describes fifteen general principles for producing good code that is easily reviewed. This paper is not a formal guideline, but is intended as an inside view of how one reviewer looks at code from a security standpoint.

  4. Inclusion of pressure and flow in a new 3D MHD equilibrium code

    Science.gov (United States)

    Raburn, Daniel; Fukuyama, Atsushi

    2012-10-01

    Flow and nonsymmetric effects can play a large role in plasma equilibria and energy confinement. A concept for such a 3D equilibrium code was developed and presented in 2011. The code is called the Kyoto ITerative Equilibrium Solver (KITES) [1], and the concept is based largely on the PIES code [2]. More recently, the work-in-progress KITES code was used to calculate force-free equilibria. Here, progress and results on the inclusion of pressure and flow in the code are presented. [4pt] [1] Daniel Raburn and Atsushi Fukuyama, Plasma and Fusion Research: Regular Articles, 7:240381 (2012).[0pt] [2] H. S. Greenside, A. H. Reiman, and A. Salas, J. Comput. Phys, 81(1):102-136 (1989).

  5. Complex phylogenetic distribution of a non-canonical genetic code in green algae

    Directory of Open Access Journals (Sweden)

    Keeling Patrick J

    2010-10-01

    Full Text Available Abstract Background A non-canonical nuclear genetic code, in which TAG and TAA have been reassigned from stop codons to glutamine, has evolved independently in several eukaryotic lineages, including the ulvophycean green algal orders Dasycladales and Cladophorales. To study the phylogenetic distribution of the standard and non-canonical genetic codes, we generated sequence data of a representative set of ulvophycean green algae and used a robust green algal phylogeny to evaluate different evolutionary scenarios that may account for the origin of the non-canonical code. Results This study demonstrates that the Dasycladales and Cladophorales share this alternative genetic code with the related order Trentepohliales and the genus Blastophysa, but not with the Bryopsidales, which is sister to the Dasycladales. This complex phylogenetic distribution whereby all but one representative of a single natural lineage possesses an identical deviant genetic code is unique. Conclusions We compare different evolutionary scenarios for the complex phylogenetic distribution of this non-canonical genetic code. A single transition to the non-canonical code followed by a reversal to the canonical code in the Bryopsidales is highly improbable due to the profound genetic changes that coincide with codon reassignment. Multiple independent gains of the non-canonical code, as hypothesized for ciliates, are also unlikely because the same deviant code has evolved in all lineages. Instead we favor a stepwise acquisition model, congruent with the ambiguous intermediate model, whereby the non-canonical code observed in these green algal orders has a single origin. We suggest that the final steps from an ambiguous intermediate situation to a non-canonical code have been completed in the Trentepohliales, Dasycladales, Cladophorales and Blastophysa but not in the Bryopsidales. We hypothesize that in the latter lineage an initial stage characterized by translational ambiguity was
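
    To make the codon reassignment concrete, the toy sketch below translates a short made-up reading frame under the standard code and under a code in which TAG and TAA are read as glutamine (Q), as in the lineages discussed above, with TGA remaining the only stop.

      # Minimal illustration of reassigning the stop codons TAG/TAA to glutamine (Q).
      STANDARD = {"TAA": "*", "TAG": "*", "TGA": "*", "ATG": "M", "CAA": "Q",
                  "GGT": "G", "GCA": "A", "GAT": "D", "GAA": "E", "GTT": "V"}
      NON_CANONICAL = dict(STANDARD, TAA="Q", TAG="Q")   # the reassignment discussed above

      def translate(seq, table):
          protein = []
          for i in range(0, len(seq) - 2, 3):
              aa = table.get(seq[i:i + 3], "X")          # X = codon not in this toy table
              if aa == "*":
                  break                                  # stop translation
              protein.append(aa)
          return "".join(protein)

      orf = "ATGGGTTAGGCAGAAGATTGAGTT"                   # hypothetical sequence, not from the study
      print(translate(orf, STANDARD))        # -> MG      (TAG read as stop)
      print(translate(orf, NON_CANONICAL))   # -> MGQAED  (TAG read as Gln; TGA still stops)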

  6. TreePM: A Code for Cosmological N-Body Simulations

    Indian Academy of Sciences (India)

    We describe the TreePM method for carrying out large N-Body simulations to study formation and evolution of the large scale structure in the Universe. This method is a combination of Barnes and Hut tree code and Particle-Mesh code. It combines the automatic inclusion of periodic boundary conditions of PM simulations ...
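
    As background for the force-splitting idea (a common choice in TreePM implementations, stated here as background rather than taken from this article), the Newtonian potential of a particle of mass m is divided at a hand-over scale r_s into a long-range part evaluated on the mesh in Fourier space, phi_k^{long} = phi_k exp(-k^2 r_s^2), and a short-range part summed by the tree in real space, phi^{short}(r) = -(G m / r) erfc(r / (2 r_s)). The Gaussian cutoff keeps the PM force smooth on the grid, while the erfc screening makes the tree sum rapidly convergent.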

  7. Lightweight Detection of Android-specific Code Smells : The aDoctor Project

    NARCIS (Netherlands)

    Palomba, F.; Di Nucci, D.; Panichella, Annibale; Zaidman, A.E.; De Lucia, Andrea; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Code smells are symptoms of poor design solutions applied by programmers during the development of software systems. While the research community devoted a lot of effort to studying and devising approaches for detecting the traditional code smells defined by Fowler, little knowledge and support

  8. BGRID: A block-structured grid generation code for wing sections

    Science.gov (United States)

    Chen, H. C.; Lee, K. D.

    1981-01-01

    The operation of the BGRID computer program is described for generating block-structured grids. Examples are provided to illustrate the code input and output. The application of a fully implicit AF (approximation factorization)-based computer code, called TWINGB (Transonic WING), for solving the 3D transonic full potential equation in conservation form on block-structured grids is also discussed.

  9. Implementing the Netherlands Code of Conduct for Scientific Practice : A Case Study

    NARCIS (Netherlands)

    Schuurbiers, D.; Osseweijer, P.; Kinderlerer, J.

    2009-01-01

    Widespread enthusiasm for establishing scientific codes of conduct notwithstanding, the utility of such codes in influencing scientific practice is not self-evident. It largely depends on the implementation phase following their establishment—a phase which often receives little attention. The aim of

  10. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  11. Proposing a Web-Based Tutorial System to Teach Malay Language Braille Code to the Sighted

    Science.gov (United States)

    Wah, Lee Lay; Keong, Foo Kok

    2010-01-01

    The "e-KodBrailleBM Tutorial System" is a web-based tutorial system which is specially designed to teach, facilitate and support the learning of Malay Language Braille Code to individuals who are sighted. The targeted group includes special education teachers, pre-service teachers, and parents. Learning Braille code involves memorisation…

  12. Arguments in favour and against a code of ethics for historians

    NARCIS (Netherlands)

    De Baets, A.

    2005-01-01

    Traditional attitudes of historians towards professional ethics are laconic. Should they encourage the drafting of codes that explicitly detail ethical principles? I look into the advantages and disadvantages of such professional codes. A first question that arises is whether allied branches of

  13. Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project

    Science.gov (United States)

    Bolstad, Rachel

    2016-01-01

    This report evaluates a game coding workshop offered to young people and adults in seven public libraries round New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…

  14. EMPIRE-II code-system with RIPL database as a tool for nuclear spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Sin, Mihaela E-mail: msin@pcnet.ro

    2004-07-21

    The paper presents the modular system of nuclear reaction codes EMPIRE-II in conjunction with RIPL-2 database as a valuable tool in the spin-parity assignment for discrete levels of residual nuclei populated in reactions evolving through the compound nucleus mechanism. Several applications illustrating the method and code's predictive power are presented.

  15. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of

  16. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Energy Technology Data Exchange (ETDEWEB)

    Aeschliman, D.P.; Oberkampf, W.L.; Blottner, F.G.

    1995-07-01

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  17. A QR code identification technology in package auto-sorting system

    Science.gov (United States)

    di, Yi-Juan; Shi, Jian-Ping; Mao, Guo-Yong

    2017-07-01

    Traditional manual sorting operations are not suited to the growth of Chinese logistics. To sort packages more effectively, a QR code recognition technique is proposed to identify the QR code labels on packages in a package auto-sorting system. The experimental results, compared with other algorithms in the literature, demonstrate that the proposed method is valid and that its performance is superior to the other algorithms.
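
    As a minimal illustration of the recognition step, OpenCV's built-in detector can locate and decode a QR label in a package image; this is not the algorithm proposed in the paper, and the image path is hypothetical.

      import cv2

      image = cv2.imread("parcel.jpg")              # hypothetical photo of a package label
      if image is None:
          raise SystemExit("image not found")

      detector = cv2.QRCodeDetector()
      text, corners, _ = detector.detectAndDecode(image)

      if text:
          print("decoded sorting code:", text)      # e.g. the destination identifier on the label
          print("label corners:", corners.reshape(-1, 2))
      else:
          print("no QR label found or decoding failed")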

  18. A Preliminary Study of Teacher Code-Switching in Secondary English and Science in Malaysia

    Science.gov (United States)

    Then, David Chen-On; Ting, Su-Hie

    2009-01-01

    This study examines functions of teacher code-switching in secondary school English and science classrooms in Malaysia, where English has recently been implemented as the language of instruction for science. Classroom interaction data were obtained from two English lessons and a science lesson. Analysis of the teachers' code-switching using…

  19. Factors of Code Switching among Bilingual English Students in the University Classroom: A Survey

    Science.gov (United States)

    Bista, Krishna

    2010-01-01

    This study proposes to identify and evaluate the factors that affect code switching in the university classroom among 15 bilingual international students. The findings from the study conducted in a southern American university revealed that the primary factor of code switching in international classroom is incompetence in the second language.…

  20. The grammar of English-Afrikaans code switching : a feature checking account

    NARCIS (Netherlands)

    Dulm, Ondene van

    2007-01-01

    This dissertation focuses on structural aspects of code switching between South African English and Afrikaans. Specifically, the main aim is to investigate the merit of an account of intrasentential code switching in terms of feature checking theory, a theory associated with minimalist syntax. The

  1. How Color Coding Formulaic Writing Enhances Organization: A Qualitative Approach for Measuring Student Affect

    Science.gov (United States)

    Geigle, Bryce A.

    2014-01-01

    The aim of this thesis is to investigate and present the status of student synthesis with color coded formula writing for grade level six through twelve, and to make recommendations for educators to teach writing structure through a color coded formula system in order to increase classroom engagement and lower students' affect. The thesis first…

  2. Thermo-mechanical description of a nuclear pin, BACO code version 2.20

    Energy Technology Data Exchange (ETDEWEB)

    Marino, A.C. [Comision Nacional de Energia Atomica, San Carlos de Bariloche (Argentina). Gerencia de Area Ciclo de Combustible; Savino, E.; Harriague, S. [Comision Nacional de Energia Atomica, Buenos Aires (Argentina). Gerencia de Area Investigacion y Desarrollo

    1995-12-31

    The BACO code, version 2.20, and some of its applications are presented. BACO (BArra COmbustible) is a code for the simulation of the thermo-mechanical and fission gas behavior of a cylindrical fuel rod under operation. The new version was developed in connection with the CRP FUMEX of the IAEA (Coordinated Research Project on Fuel Modelling at Extended Burnup). The project prompted a conceptual revision of the original code, covering convergence criteria, mathematical treatments and fuel behavior modelling. The code's primary domain of use is PHWR fuel, but it may be extended to other applications. The BACO code performs well in the low-to-intermediate burnup range. We include the study of pore migration and restructuring, relocation of pellet fragments and gap heat conductance, MOX fuel rod analysis, and an example FUMEX case. (author). 19 refs., 9 figs., 1 tab.

  3. A Convolutional Code-Based Sequence Analysis Model and Its Application

    Directory of Open Access Journals (Sweden)

    Xiaoli Geng

    2013-04-01

    Full Text Available A new approach for encoding DNA sequences as input for DNA sequence analysis is proposed using the error correction coding theory of communication engineering. The encoder was designed as a convolutional code model whose generator matrix is designed based on the degeneracy of codons, with a codon treated in the model as an informational unit. The utility of the proposed model was demonstrated through the analysis of twelve prokaryote and nine eukaryote DNA sequences having different GC contents. Distinct differences in code distances were observed near the initiation and termination sites in the open reading frame, which provided a well-regulated characterization of the DNA sequences. Clearly distinguished period-3 features appeared in the coding regions, and the characteristic average code distances of the analyzed sequences were approximately proportional to their GC contents, particularly in the selected prokaryotic organisms, presenting the potential utility as an added taxonomic characteristic for use in studying the relationships of living organisms.

  4. A sparse octree gravitational N-body code that runs entirely on the GPU processor

    Science.gov (United States)

    Bédorf, Jeroen; Gaburov, Evghenii; Portegies Zwart, Simon

    2012-04-01

    We present the implementation and performance of a new gravitational N-body tree-code that is specifically designed for the graphics processing unit (GPU). The code is publicly available at: http://castle.strw.leidenuniv.nl/software.html. All parts of the tree-code algorithm are executed on the GPU. We present algorithms for parallel construction and traversing of sparse octrees. These algorithms are implemented in CUDA and tested on NVIDIA GPUs, but they are portable to OpenCL and can easily be used on many-core devices from other manufacturers. This portability is achieved by using general parallel-scan and sort methods. The gravitational tree-code outperforms tuned CPU code during the tree-construction and shows a performance improvement of more than a factor 20 overall, resulting in a processing rate of more than 2.8 million particles per second.

  5. A neutron spectrum unfolding computer code based on artificial neural networks

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2014-02-01

The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where the measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to irradiate the spheres sequentially, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches, and novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural network technology is presented. The code, called Neutron Spectrometry and Dosimetry with Artificial Neural Networks, was designed with a graphical interface. Its core is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology, and it is intended to be easy, friendly and intuitive to use. The code was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. Its main feature is that only seven count rates, measured with the seven Bonner spheres, are required as input for unfolding the neutron spectrum; simultaneously the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes. This code generates a full report with all information of the unfolding in
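The unfolding idea, a network that maps seven count rates to a 60-bin spectrum, can be sketched as follows. The response matrix, the training spectra, the hidden-layer size and the training loop are synthetic placeholders, not the IAEA response matrix or the optimized architecture embedded in the actual code.

```python
# Sketch: a small feed-forward net maps 7 Bonner-sphere count rates to a 60-bin spectrum.
import numpy as np

rng = np.random.default_rng(0)
R = rng.random((7, 60))                       # placeholder response matrix (7 spheres x 60 bins)

spectra = rng.random((500, 60))               # synthetic "true" spectra
counts = spectra @ R.T                        # simulated count rates for the 7 spheres
counts /= counts.max()                        # crude normalization

W1 = rng.normal(0, 0.1, (7, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 60)); b2 = np.zeros(60)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

lr, n = 0.05, len(counts)
for _ in range(2000):                         # plain batch gradient descent on the MSE
    y, h = forward(counts)
    err = y - spectra
    gW2, gb2 = h.T @ err / n, err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1, gb1 = counts.T @ dh / n, dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

unfolded, _ = forward(counts[:1])             # "unfold" one measured set of count rates
print(unfolded.shape)                         # -> (1, 60)
```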

  6. The potential adoption benefits and challenges of LOINC codes in a laboratory department: a case study.

    Science.gov (United States)

    Uchegbu, Chukwuemeka; Jing, Xia

    2017-12-01

Logical Observation Identifiers Names and Codes (LOINC) is a standard for identifying and reporting laboratory investigations that was developed and is maintained by the Regenstrief Institute. LOINC codes have been adopted globally by hospitals, government agencies, laboratories, and research institutions. There are still many healthcare organizations, however, that have not adopted LOINC codes, including rural hospitals in low- and middle-income countries. Hence, organizations in these areas do not receive the benefits that accrue with the adoption of LOINC codes. We conducted a literature search using PubMed, CINAHL, Google Scholar, the ACM Digital Library, and the BioMed Central database to look for existing publications on the benefits and challenges of adopting LOINC. We selected and reviewed 16 publications and then conducted a case study via the following steps: (1) we iteratively brainstormed, discussed, analyzed, created, and revised the clinical encounter process (outpatient or ambulatory settings) within a laboratory department, using a hypothetical patient; (2) we incorporated the work experience of one of the authors (CU) in a rural hospital laboratory department in Nigeria to break down the clinical encounter process into simpler, discrete steps and created a series of use cases for the process; (3) we then analyzed and summarized the potential usage of LOINC codes (clinically, administratively, and operationally) and the benefits and challenges of adopting LOINC codes in such settings by examining the use cases one by one. Based on the literature review, we noted that LOINC codes' ability to improve the interoperability of laboratory results has been broadly recognized. LOINC-coded laboratory results can improve patient safety due to their consistent meaning, as well as the related reduction of duplicate lab tests, easier assessment of workloads in laboratory departments, and accurate auditing of laboratory accounts. Further
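As a rough illustration of the interoperability benefit discussed above, the sketch below maps local laboratory test codes to LOINC identifiers before a result is reported. The local codes are invented; the two LOINC codes shown are commonly cited examples and should be verified against the current LOINC release before any real use.

```python
# Sketch: attach LOINC identities to locally coded laboratory results so that
# exchanged results carry an unambiguous test identity. Mappings are illustrative.
LOCAL_TO_LOINC = {
    "FBS": {"loinc": "2345-7", "name": "Glucose [Mass/volume] in Serum or Plasma"},
    "HB":  {"loinc": "718-7",  "name": "Hemoglobin [Mass/volume] in Blood"},
}

def encode_result(local_code, value, unit):
    """Return a LOINC-tagged version of a locally coded laboratory result."""
    entry = LOCAL_TO_LOINC.get(local_code)
    if entry is None:
        raise KeyError(f"no LOINC mapping for local code {local_code!r}")
    return {"loinc": entry["loinc"], "test": entry["name"], "value": value, "unit": unit}

print(encode_result("FBS", 92, "mg/dL"))
```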

  7. A Hierarchical Predictive Coding Model of Object Recognition in Natural Images.

    Science.gov (United States)

    Spratling, M W

    2017-01-01

    Predictive coding has been proposed as a model of the hierarchical perceptual inference process performed in the cortex. However, results demonstrating that predictive coding is capable of performing the complex inference required to recognise objects in natural images have not previously been presented. This article proposes a hierarchical neural network based on predictive coding for performing visual object recognition. This network is applied to the tasks of categorising hand-written digits, identifying faces, and locating cars in images of street scenes. It is shown that image recognition can be performed with tolerance to position, illumination, size, partial occlusion, and within-category variation. The current results, therefore, provide the first practical demonstration that predictive coding (at least the particular implementation of predictive coding used here; the PC/BC-DIM algorithm) is capable of performing accurate visual object recognition.
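A minimal sketch of the divisive-input-modulation (DIM) update at the heart of PC/BC-DIM is given below, following the commonly published form of the error and prediction equations. The weights and dimensions are random placeholders rather than a trained object-recognition hierarchy, so this only illustrates the iteration, not the reported recognition results.

```python
# Sketch of the PC/BC-DIM inference loop: prediction neurons y reconstruct the
# input x through feedback weights V, and the divisive error e re-weights y.
import numpy as np

def dim_inference(x, W, V, steps=50, eps1=1e-6, eps2=1e-3):
    """Iterate the error/prediction updates for a single input vector x."""
    y = np.zeros(W.shape[0])
    for _ in range(steps):
        r = V.T @ y                    # reconstruction of the input
        e = x / (eps2 + r)             # divisive prediction error
        y = (eps1 + y) * (W @ e)       # multiplicative update of the predictions
    return y, e

rng = np.random.default_rng(0)
W = rng.random((10, 64)); W /= W.sum(axis=1, keepdims=True)   # normalized feedforward weights
V = W / W.max(axis=1, keepdims=True)                          # feedback weights (rescaled copy)
y, e = dim_inference(rng.random(64), W, V)
print(y.round(3))
```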

  8. Newtonian CAFE: a new ideal MHD code to study the solar atmosphere

    Science.gov (United States)

    González, J. J.; Guzmán, F.

    2015-12-01

In this work we present a new independent code designed to solve the equations of classical ideal magnetohydrodynamics (MHD) in three dimensions, subject to a constant gravitational field. The purpose of the code is the analysis of solar phenomena within the photosphere-corona region; in particular, it can simulate the propagation of impulsively generated linear and non-linear MHD waves in the non-isothermal solar atmosphere. We present 1D and 2D standard tests to demonstrate the quality of the numerical results obtained with our code, and as 3D tests we present the propagation of MHD-gravity waves and vortices in the solar atmosphere. The code is based on high-resolution shock-capturing methods and uses the HLLE flux formula combined with Minmod, MC and WENO5 reconstructors. The divergence-free magnetic field constraint is controlled using the flux constrained transport method.
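Two of the building blocks named above, minmod-limited reconstruction and the HLLE flux, can be illustrated on a 1D scalar problem. The sketch below applies them to Burgers' equation; it is a stand-in under simplified assumptions, not the 3D ideal-MHD solver with constrained transport described in the abstract.

```python
# Sketch: minmod-limited MUSCL reconstruction + HLLE flux for 1D Burgers' equation.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def hlle_flux(uL, uR):
    fL, fR = 0.5 * uL**2, 0.5 * uR**2             # Burgers flux f(u) = u^2 / 2
    sL = np.minimum(np.minimum(uL, uR), 0.0)      # left/right wave-speed estimates
    sR = np.maximum(np.maximum(uL, uR), 0.0)
    return np.where(sR - sL > 1e-12,
                    (sR * fL - sL * fR + sL * sR * (uR - uL)) / (sR - sL + 1e-12),
                    0.0)

def step(u, dx, dt):
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    uL = u + 0.5 * slope                          # state just left of interface i+1/2
    uR = np.roll(u - 0.5 * slope, -1)             # state just right of interface i+1/2
    F = hlle_flux(uL, uR)                         # flux through interface i+1/2
    return u - dt / dx * (F - np.roll(F, 1))

x = np.linspace(0, 1, 200, endpoint=False)
u = np.where(x < 0.5, 1.0, 0.0)                   # a right-moving shock
for _ in range(100):
    u = step(u, dx=1 / 200, dt=0.002)
print(u[110:130].round(2))                        # shock has moved from x=0.5 to about x=0.6
```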

  9. Validation of a transonic analysis code for use in preliminary design of advanced transport configurations

    Science.gov (United States)

    Waggoner, E. G.

    1984-01-01

    The WBPPW code has the capability of analyzing flow-field effects about configurations which include wing pylons and engine nacelles or pods in addition to the basic wing/fuselage combination. Using the concept of grid embedding, the code solves the extended small disturbance transonic flow equation for complex flow interactions of the various configuration components. A general description of the code and solution algorithm is included. Results are presented and compared with experiment for various configurations which encompass the code capabilities. These include wing planform and wing contour modifications and variations in nacelle position beneath a high-aspect-ratio wing. Results are analyzed in the light of preliminary design, where the capability to accurately compute flow-field effects resulting from various configuration perturbations is important. The comparisons show that the computational results are sensitive to subtle design modifications and that the code could be used as an effective guide during the design process for transport configurations.

  10. Development of a system of computer codes for severe accident analysis and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, S. H.; Chun, S. W.; Jang, H. S. and others [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1993-01-15

Continuing last year's work on the development of a system of computer codes for severe accident analysis, this study focused mainly on applying the developed code system. As a first step, the two most commonly used code packages other than STCP, i.e., NRC's MELCOR and IDCOR's MAAP, were reviewed to compare the models they use. Next, the important heat transfer phenomena arising as a severe accident progresses were surveyed. In particular, debris bed coolability and molten core-concrete interaction were selected as sample models and studied extensively. Recent theoretical work and experiments on these phenomena were surveyed, and the relevant models adopted by the major code packages were compared and assessed. The results of this study are expected to help account for these phenomenological uncertainties when the severe accident code packages are used for probabilistic safety assessments or accident management programs.

  11. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is propose...... including an outer BCH code correcting a few bit errors.......The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  12. A two-locus global DNA barcode for land plants: the coding rbcL gene complements the non-coding trnH-psbA spacer region.

    Science.gov (United States)

    Kress, W John; Erickson, David L

    2007-06-06

A useful DNA barcode requires sufficient sequence variation to distinguish between species and ease of application across a broad range of taxa. Discovery of a DNA barcode for land plants has been limited by intrinsically lower rates of sequence evolution in plant genomes than that observed in animals. This low rate has complicated the trade-off in finding a locus that is universal and readily sequenced and has sufficiently high sequence divergence at the species level. Here, a global plant DNA barcode system is evaluated by comparing universal application and degree of sequence divergence for nine putative barcode loci, including coding and non-coding regions, singly and in pairs across a phylogenetically diverse set of 48 genera (two species per genus). No single locus could discriminate among species in a pair in more than 79% of genera, whereas discrimination increased to nearly 88% when the non-coding trnH-psbA spacer was paired with one of three coding loci, including rbcL. In silico trials were conducted in which DNA sequences from GenBank were used to further evaluate the discriminatory power of a subset of these loci. These trials supported the earlier observation that trnH-psbA coupled with rbcL can correctly identify and discriminate among related species. A combination of the non-coding trnH-psbA spacer region and a portion of the coding rbcL gene is recommended as a two-locus global land plant barcode that provides the necessary universality and species discrimination.
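The two-locus idea can be illustrated with a toy computation: concatenate the rbcL and trnH-psbA sequences of each specimen and test whether congeneric species differ at any aligned site. The short sequences below are invented stand-ins, and the simple p-distance criterion is an assumption, not the discrimination test used in the study.

```python
# Toy sketch: combine a coding locus (rbcL) and a non-coding spacer (trnH-psbA)
# and check whether two congeneric species can be told apart.
def p_distance(a, b):
    """Uncorrected proportion of differing sites (sequences assumed aligned)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def discriminates(rbcl, trnh_psba, sp1, sp2):
    combined1 = rbcl[sp1] + trnh_psba[sp1]
    combined2 = rbcl[sp2] + trnh_psba[sp2]
    return p_distance(combined1, combined2) > 0.0

rbcl      = {"sp1": "ATGGCTTACG", "sp2": "ATGGCTTACG"}   # identical at rbcL
trnh_psba = {"sp1": "TTACGGATCA", "sp2": "TTACGGTTCA"}   # spacer differs
print(discriminates(rbcl, trnh_psba, "sp1", "sp2"))      # -> True
```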

  13. The non-coding oncogene: a case of missing DNA evidence?

    Directory of Open Access Journals (Sweden)

Puja Shahrouki

    2012-09-01

The evidence that links classical protein-coding proto-oncogenes and tumor suppressors, such as MYC, RAS, P53, and RB, to carcinogenesis is indisputable. Multiple lines of proof show how random somatic genomic alteration of such genes (e.g. mutation, deletion or amplification), followed by selection and clonal expansion, forms the main molecular basis of tumor development. Many important cancer genes were discovered using low-throughput approaches in the pre-genomic era, and this knowledge is today solidified and expanded upon by modern genome-scale methodologies. In several recent studies, non-coding RNAs (ncRNAs), such as microRNAs and long non-coding RNAs (lncRNAs), have been shown to contribute to tumor development. However, in comparison with coding cancer genes, the genomic (DNA-level) evidence is sparse for ncRNAs. The coding proto-oncogenes and tumor suppressors that we know of today are major molecular hubs in both normal and malignant cells. The search for non-coding RNAs with tumor driver or suppressor roles therefore holds the additional promise of pinpointing important, biologically active ncRNAs in a vast and largely uncharacterized non-coding transcriptome. Here, we assess the available DNA-level data that links non-coding genes to tumor development. We further consider historical, methodological and biological aspects, and discuss future prospects of ncRNAs in cancer.

  14. Numerical Simulation of Two-grid Ion Optics Using a 3D Code

    Science.gov (United States)

    Anderson, John R.; Katz, Ira; Goebel, Dan

    2004-01-01

    A three-dimensional ion optics code has been developed under NASA's Project Prometheus to model two grid ion optics systems. The code computes the flow of positive ions from the discharge chamber through the ion optics and into the beam downstream of the thruster. The rate at which beam ions interact with background neutral gas to form charge exchange ions is also computed. Charge exchange ion trajectories are computed to determine where they strike the ion optics grid surfaces and to determine the extent of sputter erosion they cause. The code has been used to compute predictions of the erosion pattern and wear rate on the NSTAR ion optics system; the code predicts the shape of the eroded pattern but overestimates the initial wear rate by about 50%. An example of use of the code to estimate the NEXIS thruster accelerator grid life is also presented.
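The charge-exchange source term mentioned above can be sketched with a back-of-the-envelope estimate: CEX ions are produced at a volumetric rate n_beam × n_neutral × σ_cex × v_beam. The numbers below are rough illustrative values for a xenon thruster, not NSTAR or NEXIS design data.

```python
# Sketch: volumetric charge-exchange (CEX) ion production rate near the grids.
E_CHARGE = 1.602e-19          # elementary charge, C
XE_MASS  = 2.18e-25           # xenon atomic mass, kg

def beam_velocity(net_voltage):
    """Xenon ion speed after acceleration through a net voltage (m/s)."""
    return (2 * E_CHARGE * net_voltage / XE_MASS) ** 0.5

def cex_production_rate(j_beam, v_beam, n_neutral, sigma_cex):
    """CEX ions produced per unit volume per second (1/m^3/s)."""
    n_beam = j_beam / (E_CHARGE * v_beam)    # beam ion density from current density
    return n_beam * n_neutral * sigma_cex * v_beam

v = beam_velocity(1000.0)                    # ~1 kV net accelerating voltage
rate = cex_production_rate(j_beam=30.0,      # A/m^2, illustrative beam current density
                           v_beam=v,
                           n_neutral=1e18,   # 1/m^3, illustrative neutral density near the grids
                           sigma_cex=5e-19)  # m^2, order of magnitude for Xe+/Xe charge exchange
print(f"beam speed {v:.2e} m/s, CEX production {rate:.2e} ions/m^3/s")
```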

  15. A Multiple Sphere T-Matrix Fortran Code for Use on Parallel Computer Clusters

    Science.gov (United States)

    Mackowski, D. W.; Mishchenko, M. I.

    2011-01-01

A general-purpose Fortran-90 code for calculation of the electromagnetic scattering and absorption properties of multiple sphere clusters is described. The code can calculate the efficiency factors and scattering matrix elements of the cluster for either fixed or random orientation with respect to the incident beam and for plane wave or localized-approximation Gaussian incident fields. In addition, the code can calculate maps of the electric field both interior and exterior to the spheres. The code is written with message passing interface instructions to enable use on distributed memory compute clusters, and for such platforms the code can make feasible the calculation of absorption, scattering, and general EM characteristics of systems containing several thousand spheres.

  16. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked according to how well their implemented feature sets match the physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.
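The ranking approach described above amounts to a weighted feature-match score. The sketch below shows the general idea; the capability list, weights and candidate feature tables are invented placeholders, not the actual Hanford selection criteria or the ARES-CT feature set.

```python
# Sketch: rank candidate codes by summing the weights of needed capabilities they implement.
NEEDED = {"glass_corrosion_kinetics": 3, "coupled_flow_transport": 3,
          "radionuclide_decay_chains": 2, "2d_geometry": 1}

CANDIDATES = {
    "code_A": {"glass_corrosion_kinetics", "coupled_flow_transport", "2d_geometry"},
    "code_B": {"coupled_flow_transport", "radionuclide_decay_chains"},
}

def score(features):
    return sum(w for cap, w in NEEDED.items() if cap in features)

ranking = sorted(CANDIDATES, key=lambda c: score(CANDIDATES[c]), reverse=True)
print([(c, score(CANDIDATES[c])) for c in ranking])
```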

  17. Population-Level Neural Codes Are Robust to Single-Neuron Variability from a Multidimensional Coding Perspective.

    Science.gov (United States)

    Montijn, Jorrit S; Meijer, Guido T; Lansink, Carien S; Pennartz, Cyriel M A

    2016-08-30

    Sensory neurons are often tuned to particular stimulus features, but their responses to repeated presentation of the same stimulus can vary over subsequent trials. This presents a problem for understanding the functioning of the brain, because downstream neuronal populations ought to construct accurate stimulus representations, even upon singular exposure. To study how trial-by-trial fluctuations (i.e., noise) in activity influence cortical representations of sensory input, we performed chronic calcium imaging of GCaMP6-expressing populations in mouse V1. We observed that high-dimensional response correlations, i.e., dependencies in activation strength among multiple neurons, can be used to predict single-trial, single-neuron noise. These multidimensional correlations are structured such that variability in the response of single neurons is relatively harmless to population representations of visual stimuli. We propose that multidimensional coding may represent a canonical principle of cortical circuits, explaining why the apparent noisiness of neuronal responses is compatible with accurate neural representations of stimulus features. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
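The core analysis, predicting a neuron's single-trial fluctuation from the simultaneous activity of the rest of the population, can be sketched with synthetic data as follows. The shared-gain noise model and the plain least-squares regression are illustrative assumptions, not the imaging data or the exact estimator used in the study.

```python
# Sketch: predict one neuron's trial-to-trial residual from the rest of the population.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 50
tuning = rng.random(n_neurons) * 5                       # mean response per neuron
shared = rng.normal(0, 1, (n_trials, 1))                 # population-wide fluctuation
resp = tuning + 0.8 * shared * tuning + rng.normal(0, 0.5, (n_trials, n_neurons))

residual = resp - resp.mean(axis=0)                      # single-trial "noise"

def predict_noise(residual, i):
    """Predict neuron i's residual from all other neurons (least squares)."""
    X = np.delete(residual, i, axis=1)
    beta, *_ = np.linalg.lstsq(X, residual[:, i], rcond=None)
    return X @ beta

pred = predict_noise(residual, i=0)
corr = np.corrcoef(pred, residual[:, 0])[0, 1]
print(f"prediction accuracy for neuron 0: r = {corr:.2f}")
```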

  18. Population-Level Neural Codes Are Robust to Single-Neuron Variability from a Multidimensional Coding Perspective

    Directory of Open Access Journals (Sweden)

    Jorrit S. Montijn

    2016-08-01

Sensory neurons are often tuned to particular stimulus features, but their responses to repeated presentation of the same stimulus can vary over subsequent trials. This presents a problem for understanding the functioning of the brain, because downstream neuronal populations ought to construct accurate stimulus representations, even upon singular exposure. To study how trial-by-trial fluctuations (i.e., noise) in activity influence cortical representations of sensory input, we performed chronic calcium imaging of GCaMP6-expressing populations in mouse V1. We observed that high-dimensional response correlations, i.e., dependencies in activation strength among multiple neurons, can be used to predict single-trial, single-neuron noise. These multidimensional correlations are structured such that variability in the response of single neurons is relatively harmless to population representations of visual stimuli. We propose that multidimensional coding may represent a canonical principle of cortical circuits, explaining why the apparent noisiness of neuronal responses is compatible with accurate neural representations of stimulus features.

  19. Entracking as a Brain Stem Code for Pitch: The Butte Hypothesis.

    Science.gov (United States)

    Joris, Philip X

    2016-01-01

    The basic nature of pitch is much debated. A robust code for pitch exists in the auditory nerve in the form of an across-fiber pooled interspike interval (ISI) distribution, which resembles the stimulus autocorrelation. An unsolved question is how this representation can be "read out" by the brain. A new view is proposed in which a known brain-stem property plays a key role in the coding of periodicity, which I refer to as "entracking", a contraction of "entrained phase-locking". It is proposed that a scalar rather than vector code of periodicity exists by virtue of coincidence detectors that code the dominant ISI directly into spike rate through entracking. Perfect entracking means that a neuron fires one spike per stimulus-waveform repetition period, so that firing rate equals the repetition frequency. Key properties are invariance with SPL and generalization across stimuli. The main limitation in this code is the upper limit of firing (~ 500 Hz). It is proposed that entracking provides a periodicity tag which is superimposed on a tonotopic analysis: at low SPLs and fundamental frequencies > 500 Hz, a spectral or place mechanism codes for pitch. With increasing SPL the place code degrades but entracking improves and first occurs in neurons with low thresholds for the spectral components present. The prediction is that populations of entracking neurons, extended across characteristic frequency, form plateaus ("buttes") of firing rate tied to periodicity.
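The auditory-nerve representation that the hypothesis starts from, a pooled all-order interspike-interval distribution peaking at the stimulus period, can be sketched with toy Poisson fibers as shown below. The fiber model, rates and the 1-8 ms search range are arbitrary illustrative choices, not a model of the entracking brain-stem neurons themselves.

```python
# Sketch: pooled all-order ISI histogram of phase-locked model fibers peaks at the
# stimulus period, resembling the stimulus autocorrelation.
import numpy as np

rng = np.random.default_rng(0)
fs, dur, f0 = 10000.0, 2.0, 200.0                  # sample rate (Hz), duration (s), pitch (Hz)
t = np.arange(0, dur, 1 / fs)
drive = np.maximum(np.sin(2 * np.pi * f0 * t), 0)  # half-wave-rectified periodic drive

def fiber_spikes(rate_scale=200.0):
    """Spike times (s) of one model fiber: inhomogeneous Poisson, phase-locked to the drive."""
    return t[rng.random(t.size) < rate_scale * drive / fs]

intervals = []
for _ in range(30):                                # pool 30 "auditory-nerve fibers"
    s = fiber_spikes()
    d = s[None, :] - s[:, None]                    # all-order interspike intervals
    intervals.append(d[(d > 0.001) & (d < 0.008)]) # search 1-8 ms, skipping near-zero lags
isi = np.concatenate(intervals)

hist, edges = np.histogram(isi, bins=35, range=(0.001, 0.008))
peak = 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1])
print(f"dominant pooled interval: {1000 * peak:.2f} ms (stimulus period is 5 ms)")
```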

  20. Electrical safety code manual: a plain language guide to the National Electrical Code, OSHA and NFPA 70E

    CERN Document Server

    Keller, Kimberley

    2010-01-01

    Safety in any workplace is extremely important. In the case of the electrical industry, safety is critical and the codes and regulations which determine safe practices are both diverse and complicated. Employers, electricians, electrical system designers, inspectors, engineers and architects must comply with safety standards listed in the National Electrical Code, OSHA and NFPA 70E. Unfortunately, the publications that list these safety requirements are written in highly technical terms, and the average person has an extremely difficult time understanding exactly what they need to