WorldWideScience

Sample records for rda code emphasizes

  1. Practical cataloguing AACR, RDA and MARC 21

    CERN Document Server

    Welsh, Anne

    2012-01-01

    Written at a time of transition in international cataloguing, this book provides cataloguers and students with a background in general cataloguing principles, the code (AACR2), the format (MARC 21) and the new standard (RDA). It provides library managers with an overview of the development of RDA in order to equip them to make the transition.

  2. The Making of RDA

    Directory of Open Access Journals (Sweden)

    Tom Delsey

    2016-05-01

    Full Text Available The author traces the development of RDA from its beginning in 2005 to its first publication in 2010. The development effort is set in the context of an evolving digital environment that is transforming both the production and the dissemination of information resources, as well as the resources used to create, store, and access the data describing those resources. The author examines the interplay between the strategic commitment to align RDA with new conceptual models, emerging database structures, and metadata developments in allied communities, on the one hand, and compatibility with the legacy of AACR2 and existing databases on the other. The aspects examined include the structuring of RDA as a resource description language, the organization of the new standard as a working tool, and the refinement of the guidelines and instructions for recording data under RDA.

  3. RDA implementation in the URBE Network

    Directory of Open Access Journals (Sweden)

    Stefano Bargioni

    2018-01-01

    Full Text Available URBE (Unione Romana Biblioteche Ecclesiastiche) is composed of 18 academic libraries and adopted RDA starting from March 2017. The decision process, the training of cataloguers, and the modifications made to 4 different ILSs, as well as the goals URBE hopes to achieve through RDA adoption, are presented. The adoption of the RDA Toolkit and the problems related to local variants, currently being examined in collaboration with the Vatican Library, are also discussed.

  4. Erste Erfahrungen mit RDA an wissenschaftlichen Universalbibliotheken in Deutschland - Ergebnisse aus Fokusgruppengesprächen mit Katalogisierenden

    Directory of Open Access Journals (Sweden)

    Heidrun Wiesenmüller

    2017-04-01

    Full Text Available Several months after the introduction of the new cataloging standard "Resource Description and Access" (RDA), focus-group interviews with catalogers were conducted at 18 large academic and state libraries in Germany. Among other things, the catalogers were asked how confident they feel in applying RDA, which aspects of the new cataloging code they like or do not like, how they estimate the expenditure of time in comparison to the former cataloging code RAK, which sources they use to get help or information, and what they think about the frequent changes to the new standard. The paper presents the results of these interviews.

  5. RDA Implementation at the National Autonomous University of Mexico

    Directory of Open Access Journals (Sweden)

    Filiberto Felipe Martínez Arellano

    2017-04-01

    Full Text Available The Universidad Nacional Autónoma de México (UNAM) [National Autonomous University of Mexico] is the largest and most important educational institution in Mexico and Latin America. It serves more than 340,000 graduate, bachelor's and high school students and offers 41 graduate programs and 118 undergraduate ones. The Biblioteca Nacional de México (BNM) [National Library of Mexico] is part of the UNAM structure, playing the role of safeguarding and preserving the bibliographical memory of Mexico, made up of more than 1,250,000 books and documents held in this library. In addition, the Sistema Bibliotecario y de Información de la UNAM (SIBIUNAM) [UNAM Library and Information System] is composed of 135 libraries with a collection of approximately 1,700,000 book titles and 6,000,000 volumes, coordinated by the Dirección General de Bibliotecas de la UNAM [UNAM General Directorate of Libraries]. The UNAM organization also includes the Instituto de Investigaciones Bibliotecológicas y de la Información (IIBI) [Institute of Research in Library and Information Sciences], where theoretical and applied research projects are carried out in several research areas, among them information organization, within which cataloging standards are addressed. This article addresses the challenges that the adoption of the new RDA cataloging standard has represented for these three UNAM organizations, as well as the actions developed in each one of them. The BNM's experiences with RDA implementation are shared, and some problematic situations, as well as their solutions, are presented. The BNM's participation in forums to communicate the cataloging work done with the new code is also described, as are the updating and training of staff, participation in cooperative cataloging programs and the RDA analysis in work teams, as well as the …

  6. Umfrage zur RDA-Einführung in der Universitätsbibliothek Trier

    Directory of Open Access Journals (Sweden)

    Birgit Unkhoff-Giske

    2018-03-01

    Full Text Available After almost a year of practical experience with the new cataloguing standard, a survey on the implementation of "Resource Description and Access" (RDA) was carried out at the University Library of Trier. It focused on everyday cataloguing: the confidence in dealing with RDA and the Toolkit, the supply of information, changes in the amount of work, the evaluation of the RDA rules, in particular the new principle of "cataloguer's judgement", and the personal attitudes towards the change of the cataloguing code. The results of the 20 questions are presented and analysed in the following article. The survey led to surprisingly positive findings, but also uncovered problem areas that require follow-up work.

  7. RDA: analisi, riflessioni e attività all'ICCU

    Directory of Open Access Journals (Sweden)

    Simonetta Buttò

    2016-05-01

    Full Text Available The report aims to analyze the applicability of Resource Description and Access (RDA) within Italian public libraries, as well as in archives and museums, in order to contribute to the discussion at the international level. The Central Institute for the Union Catalogue of Italian Libraries (ICCU) manages the online catalogue of Italian libraries and the network of bibliographic services, and has the institutional task of coordinating cataloguing and documentation activities for Italian libraries. On March 31st, 2014, the Institute signed an agreement with ALA Publishing (American Library Association) for the Italian translation rights of RDA; the translation is now available and published in the RDA Toolkit. The Italian translation was carried out by the Technical Working Group, made up of the main national and academic libraries, cultural institutions and bibliographic agencies. The Group started from the need to study the new code in its textual detail, in order to better understand its principles, purposes and applicability, and finally its sustainability within the national context in relation to bibliographic control. At the international level, starting from the publication of the Italian version of RDA and through the research carried out by ICCU and by the national working groups, the aim is a more direct comparison with the experiences of other European countries, also within the international context of EURIG, for an exchange of experiences aimed at strengthening the informational content of cataloguing data, with respect to the history, cultural traditions and national identities of the different countries.

  8. RDA resource, description and access, 2013 revision

    CERN Document Server

    2013-01-01

    This full-text print version of RDA offers a snapshot that serves as an offline access point to help solo and part-time cataloguers evaluate RDA, as well as to support training and classroom use in any size institution. An index is included.

  9. Sacherschliessung nach RDA

    Directory of Open Access Journals (Sweden)

    Hans Schürmann

    2015-05-01

    Full Text Available Resource Description and Access (RDA) will be the new standard for descriptive cataloguing in the German-speaking countries. Under RDA, subject indexing will also be redefined, although, at present, the relevant pages remain blank. This article ponders the question of what these new rules could look like. As yet no clear criteria have emerged – the field of subject indexing has undergone too much change in recent years for that. Nor does subject heading theory appear to offer any answers at the moment. There is therefore intensive and ongoing debate in the various forums about what a sustainable form of subject indexing could look like. What is the framework of these discussions and what are the key criteria? The article weighs the importance and role of a subject indexing code in the light of both the history of the subject catalogue and modern data management on the web.

  10. RDA, ovvero, Il lungo viaggio del catalogo verso l'era digitale

    Directory of Open Access Journals (Sweden)

    Barbara B. Tillett

    2016-05-01

    Full Text Available RDA was created in response to complaints about the Anglo-American Cataloguing Rules, especially the call for a more international, principle-based content standard that takes the perspective of the conceptual models of FRBR (Functional Requirements for Bibliographic Records) and FRAD (Functional Requirements for Authority Data). The past and ongoing process of continuous improvement to RDA runs through the Joint Steering Committee for Development of RDA (known as the JSC, recently renamed the RDA Steering Committee, RSC), with the aim of making RDA even more international and principle-based.

  11. RDA: a content standard to ensure the quality of data

    Directory of Open Access Journals (Sweden)

    Carlo Bianchini

    2016-05-01

    Full Text Available RDA (Resource Description and Access) is a set of guidelines for the description of and access to resources, designed for the digital environment and released in its first version in 2010. RDA is based on FRBR and its derived models, which focus on users' needs and on resources of any kind of content, medium and carrier. The paper discusses the relevance of the main features of RDA for the future role of libraries in the context of the semantic web and of metadata creation and exchange. The paper aims to highlight the many consequences of RDA being a content standard, in particular the change from record management to data management, and the differences between the two functions realized by RDA (to identify and to relate entities) and the functions realized by other standards such as MARC 21 (to archive data) and ISBD (to visualize data). It shows how, since all these functions are necessary for the catalogue, RDA needs to be complemented by other rules and standards, and how these tools together allow the fulfilment of the variation principle defined by S.R. Ranganathan.

  12. RDA in Turchia: percezioni e aspettative sull'implementazione

    Directory of Open Access Journals (Sweden)

    Doğan Atılgan

    2015-05-01

    Full Text Available The integration of user-generated content with library catalogs is a remarkable development enabled by advances in web technologies and semantic networks. In the light of these developments, library catalogs are being linked with open data resources such as VIAF, DBpedia and amazon.com for bibliographic description via URI-based structures. At the same time, "Resource Description and Access" (RDA), as a new cataloging standard, supports libraries in their bibliographic description work by increasing access points. Furthermore, many initiatives have been launched by countries that want to keep their library catalogs up to date by implementing RDA. In this context, understanding catalogers' opinions and perceptions regarding RDA implementation is of great importance. This study aims to reveal the requirements, awareness and perceptions of catalogers in academic libraries in Turkey regarding RDA implementation. The contents of this paper were originally presented at the international conference FSR 2014 "Faster, Smarter, Richer: Reshaping the Library Catalogue", held in Rome, 27-28 February 2014.

  13. Roles and Delegation of Authority (R/DA) System

    International Nuclear Information System (INIS)

    ABBOTT, JOHN P.; HUTCHINS, JAMES C.; SCHOCH, DAVID G.

    1999-01-01

    The processes of defining managerial roles and providing for delegation of authority are essential to any enterprise. At most large organizations, these processes are defined in policy manuals and through sets of standard operating procedures for many, if not all, business and administrative functions. Many of these staff-initiated, administrative functions require the routing of documents for approval to one or more levels of management. These employee-oriented, back-office types of workflows tend to require more flexibility in determining to whom these documents should go, while, at the same time, providing the responsible parties with the flexibility to delegate their approval authority or allow others to review their work. Although this practice is commonplace in the manual, paper-based processes that exist in many organizations, it is difficult to provide the same flexibility in more structured, electronic workflow systems. The purpose of this report is to present a framework or architecture for creating an R/DA system and to provide some insights associated with its design and utilization. To improve understanding and clarify subsequent discussion, the goals and requirements for the major R/DA system components, namely the database and interface modules, are presented first, along with the identification of important concepts and the definition of critical terms. Next, high-level functions relating the types of inputs to the outputs of the R/DA interface module are introduced and discussed. Then the relationships between the major R/DA modules and the primary components associated with their creation and maintenance are presented and analyzed. Finally, some conclusions are drawn on the advantages of developing an R/DA system for use in implementing an enterprise-wide, work-facilitating information system.

  14. Organizing for access with FRBR, RDA, linked data, and beyond

    CERN Document Server

    Hsieh-Yee, Ingrid P

    2018-01-01

    "Organizing for Access with FRBR, RDA, Linked Data, and Beyond" provides a current, insightful discussion of the new opportunities and challenges posed by the management of big datasets. It explains how to apply Functional Requirements for Bibliographic Records (FRBR) and (Resource Description and Access) RDA to create cataloging records for various types of resources--skills that will help systems librarians, reference librarians, and digital resources librarians to transform their professional practices and give them an invaluable competitive edge. This book is ideal as a primary textbook for a range of LIS courses, such as courses on cataloging and/or managing audiovisual collections and continuing education workshops on cataloging audiovisual materials (including music or film collections). It presents a critical assessment of the advantages and limitations of using RDA for managing information resources and data, explains how to increase the benefit and relevancy of library data and records, and discuss...

  15. Using the Resource Description and Access (RDA) in the creation of persons, families and corporate bodies authority records

    Directory of Open Access Journals (Sweden)

    Fabrício Silva Assumpção

    2013-08-01

    Full Text Available Considering the development of Resource Description and Access (RDA) and the importance of authority control for catalogs, this paper aims to present RDA, its origin and development, to contextualize the creation of authority records in descriptive cataloging, and to present the use of RDA in the recording of attributes and relationships of the person, family and corporate body entities. It also presents RDA in relation to the FRBR and FRAD conceptual models and the sections, chapters, attributes and relationships defined for persons, families and corporate bodies. Lastly, this paper highlights some differences between RDA and AACR2r and offers some considerations about RDA implementation.

  16. RDA: Resource Description and Access: The new standard for metadata and resource discovery in the digital age

    Directory of Open Access Journals (Sweden)

    Carlo Bianchini

    2015-01-01

    Full Text Available RDA (Resource Description and Access) is going to bring about a great change. Its guidelines – rather than rules – are addressed to anyone who wishes to describe and make accessible a cultural heritage collection, or simply a collection: librarians, archivists, curators and professionals in any other branch of knowledge. The work is organized in two parts: the former contains the theoretical foundations of cataloguing (FRBR, ICP, semantic web and linked data), the latter a critical presentation of the RDA guidelines. RDA aims to make possible the creation of well-structured metadata for any kind of resource, reusable in any context and technological environment. RDA offers a "set of guidelines and instructions to create data for discovery of resources". The guidelines stress four actions – to identify, to relate (from the FRBR/FRAD user tasks and ICP), to represent and to discover – and a noun: resource. To identify the entities of Group 1 and Group 2 of FRBR (Work, Expression, Manifestation, Item, Person, Family, Corporate Body); to relate the entities of Group 1 and Group 2 of FRBR by means of relationships; and to enable users to represent and discover the entities of Group 1 and Group 2 by means of their attributes and relationships. These last two actions are the reason for users' searches, and users are the focal point of the process. RDA enables the discovery of recorded knowledge, that is, any resource conveying information, any resource transmitting intellectual or artistic content by means of any kind of carrier and media. RDA is a content standard, not a display standard nor an encoding standard: it gives instructions to identify data and does not prescribe how to display or encode the data produced by the guidelines. RDA requires an original approach, a metanoia, a deep change in the way we think about cataloguing. The innovations in RDA are many: it promotes interoperability between catalogs and other search tools, it adopts terminology and concepts of the Semantic Web, it …

  17. El objeto de la catalogación en el marco de las RDA / The cataloging object in RDA context

    Directory of Open Access Journals (Sweden)

    Paola Picco

    2009-01-01

    Full Text Available This article addresses the changes brought about in cataloging by the incorporation of Information and Communication Technologies (ICT). It is in this context that the new cataloging code, Resource Description and Access (RDA), emerges as a standardizing instrument of international scope. The code is based on the conceptual model Functional Requirements for Bibliographic Records (FRBR), which introduces a new terminology defining four entities (work, expression, manifestation and item) that must be considered at the moment of cataloging. The aim of this contribution is to analyze the object of cataloging within the framework of the changes proposed by this new model and in the light of the forthcoming publication of the new cataloging code, which is theoretically grounded in that model. The identification of the work is highlighted as a fundamental element in the cataloging process and in establishing the relationships between the content and presentation entities defined in the FRBR model. The need is stressed for catalogers to work in an interdisciplinary way with other professionals in the process of identifying works, with an emphasis on research that goes beyond the traditional function of mere data transcription.

  18. Las últimas elecciones de la RDA. La puerta abierta hacia la reunificación alemana

    Directory of Open Access Journals (Sweden)

    Félix Gil Feito

    2012-10-01

    Full Text Available The last elections, held on 18 March 1990 in the German Democratic Republic (GDR), were without doubt one of the fundamental preconditions for the reunification that took place a few months later. Fundamental because, as the Chancellor of the Federal Republic of Germany (FRG), Helmut Kohl, himself acknowledged, it was a sine qua non condition that the GDR have a fully democratic regime, with elections in which all political parties started with the same chances. Without this, the negotiations involving the economic aid needed for the welfare of East German citizens, as well as the decisive advance towards reunification through the formation of a new government in the GDR to replace the SED after forty years in power, would never have come to fruition, and the reunification of the two Germanies would not have been achieved. Keywords: German reunification, elections, RDA (GDR), Helmut Kohl.

  19. RDA, el nuevo código de catalogación: cambios y desafíos para su aplicación

    Directory of Open Access Journals (Sweden)

    Picco, Paola

    2012-03-01

    Full Text Available An in-depth study presenting the new cataloging code, RDA (Resource Description and Access), and its relationship to technology. Its origin, what it represents and its implications for cataloging are analyzed. The state of bibliographic control and the foundations of the new cataloging code are reviewed. Details are presented concerning its structure, the changes it proposes and the advantages of its being based on the FRBR model. Reference is also made to RDA's adoption by the international community. Finally, the article sets forth the changes and doubts that RDA prompts. An exhaustive bibliographic review permits the identification of the most significant changes as well as the pros and cons. Likewise, a detailed study of the Functional Requirements for Bibliographic Records (FRBR) and the Functional Requirements for Authority Data (FRAD) enables a better understanding of the new code's instructions, connections and relationships to them.

  20. Implementing RDA Data Citation Recommendations: Case Study in South Africa

    Science.gov (United States)

    Hugo, Wim

    2016-04-01

    SAEON operates a shared research data infrastructure for its own data sets and for clients and end users in the Earth and Environmental Sciences domain. SAEON has a license to issue Digital Object Identifiers via DataCite on behalf of third parties, and has recently concluded development work to make a universal data deposit, description, and DOI minting facility available. This facility will be used to develop a number of end-user gateways, including DataCite South Africa (in collaboration with the National Research Foundation and addressing all grant-funded research in the country), DIRISA (Data-Intensive Research Infrastructure for South Africa, in collaboration with the Meraka Institute and the Department of Science and Technology), and SASDI (South African Spatial Data Infrastructure). The RDA recently published Data Citation Recommendations [1], and these were used as a basis for the specification of the Digital Object Identifier implementation, raising two significant challenges: 1. synchronisation of frequently harvested meta-data sets where version management practice did not align with the RDA recommendations, and 2. handling sub-sets of, and queries on, large, continuously updated data sets. In the first case, we have developed a set of tests that determine the logical course of action when discrepancies are found during synchronisation, and we have incorporated these into meta-data harvester configurations. Additionally, we have developed a state diagram and attendant workflow for meta-data that includes problem states emanating from DOI management, reporting services for data depositors, and feedback to end users in respect of synchronisation issues. In the second case, in the absence of firm guidelines from DataCite, we are seeking community consensus and feedback on an approach that caches all queries performed and subsets derived from data, and provides these with anchor-style extensions linked to the dataset's original DOI. This allows extended DOIs to resolve to a meta…
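
    The query-subset approach described above can be illustrated with a short sketch. The Python fragment below is hypothetical (the function name, cache layout and example DOI are invented and are not SAEON's or DataCite's actual API): it caches the query that defines a subset, derives a stable hash for it, and appends that hash as an anchor-style extension to the dataset's original DOI, so that the subset citation resolves through the parent identifier.

      import hashlib
      import json
      from datetime import datetime, timezone

      # Hypothetical sketch of the "cached query + anchor-style extension" idea
      # described in the abstract; not SAEON's actual implementation.
      def cite_subset(parent_doi: str, query: dict, cache: dict) -> str:
          """Return an anchor-extended identifier for a query-defined subset."""
          # Normalise the query so that logically identical queries hash alike.
          canonical = json.dumps(query, sort_keys=True, separators=(",", ":"))
          digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]
          # Cache the query itself so the subset can be re-derived later.
          cache[digest] = {
              "query": query,
              "cached_at": datetime.now(timezone.utc).isoformat(),
          }
          # Anchor-style extension linked to the dataset's original DOI.
          return f"https://doi.org/{parent_doi}#subset={digest}"

      subset_id = cite_subset(
          "10.0000/EXAMPLE.12345",   # placeholder DOI, not a real one
          {"variable": "rainfall", "station": "K7", "from": "2010-01", "to": "2015-12"},
          cache={},
      )
      print(subset_id)   # https://doi.org/10.0000/EXAMPLE.12345#subset=<12-char hash>

    Resolving such an identifier would then require the landing service to look up the cached query and re-derive the subset from the versioned parent data set.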

  1. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...
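
    As a toy illustration of codes defined on bipartite graphs (a generic example of the idea, not the specific constructions studied in the paper), the sketch below treats the parity-check matrix of a small code as a bipartite graph between variable nodes and check nodes and corrects a single bit error with a simple bit-flipping form of message passing.

      import numpy as np

      # Rows of H are check nodes, columns are variable nodes; H[i, j] = 1 means
      # check node i is adjacent to variable node j in the bipartite graph.
      H = np.array([[1, 1, 0, 1, 1, 0, 0],
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]], dtype=int)   # (7,4) Hamming checks

      def bit_flip_decode(received, H, max_iters=10):
          """Repeatedly flip the variable node that sits on the most
          unsatisfied check nodes (a very simple message-passing decoder)."""
          word = received.copy()
          for _ in range(max_iters):
              syndrome = H @ word % 2          # which check nodes are unsatisfied
              if not syndrome.any():
                  return word                  # all checks satisfied: a codeword
              counts = syndrome @ H            # unsatisfied-check count per bit
              word[np.argmax(counts)] ^= 1     # flip the most suspicious bit
          return word

      codeword = np.zeros(7, dtype=int)        # the all-zero word is a codeword
      received = codeword.copy()
      received[3] ^= 1                         # the channel flips one bit
      print(bit_flip_decode(received, H))      # -> [0 0 0 0 0 0 0]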

  2. Council Adopts New AERA Code of Ethics: Ethics Committee to Emphasize Ethics Education

    Science.gov (United States)

    Herrington, Carolyn D.

    2011-01-01

    At its February 2011 meeting, the AERA Council adopted unanimously a new Code of Ethics. The Code articulates a set of standards for education researchers in education and provides principles and guidance by which they can build ethical practices in professional, scholarly, and scientific activities. The Code reflects the Association's strong…

  3. Vārda brīvības un intelektuālā īpašuma aizsardzības līdzsvarošana ACTA

    OpenAIRE

    Bulzgis, Raivo

    2012-01-01

    With the development of intellectual property rights and their legal regulation, individuals' right to freedom of speech is being restricted. Given that a new international treaty is being adopted in this field, the topic of this work is relevant worldwide. The aim of the work is to clarify the boundaries between intellectual property and freedom of speech. To achieve this aim, the author set several tasks: to examine the concepts of "freedom of speech" and "intellectual property", to analyse the boundaries of these concepts, as well as to become familiar with ...

  4. Produção científica acerca do novo código de catalogação RDA: análise bibliométrica de 2010 a 2014

    Directory of Open Access Journals (Sweden)

    Raquel Bernadete Machado

    2015-05-01

    Full Text Available Resource Description and Access (RDA) is the new cataloging code developed to replace the Anglo-American Cataloguing Rules, second edition (AACR2), with the purpose of improving the retrieval of the knowledge recorded in libraries. From this perspective, this article aims to identify trends in the cataloging literature on the subject of RDA and to analyze quantitatively the growth of publications, the main authors, the languages of publication, the authors' institutional affiliations and the collaborating countries. This bibliometric study analyzed journal articles published between 2010 and 2014 in two databases. The data presented in this study show that the scientific production on the new RDA code is currently quite dispersed with regard to authors and journals. On the other hand, English was identified as the most frequent language of publication, and the largest concentration of papers comes from the United States. Since that country took the initiative of testing RDA, it is natural that American authors publish more on the subject. This research presents a time slice showing some trends for the field of cataloging, with emphasis on the scientific production on RDA.

  5. Is Meeting the Recommended Dietary Allowance (RDA) for Protein Related to Body Composition among Older Adults?: Results from the Cardiovascular Health of Seniors and Built Environment Study.

    Science.gov (United States)

    Beasley, J M; Deierlein, A L; Morland, K B; Granieri, E C; Spark, A

    2016-01-01

    Studies suggest protein intake may be associated with lower body weight, but protein has also been associated with preservation of lean body mass. Understanding the role of protein in maintaining health for older adults is important for disease prevention among this population. Design: Cross-sectional study of the relationship of dietary protein to body composition. Setting: New York City community centers. Participants: 1,011 Black, White, and Latino urban men and women 60-99 years of age. Measurements: Protein intake was assessed using two interviewer-administered 24-hour recalls, and body composition was assessed using bioelectrical impedance analysis (BIA) of fat mass (kg) (FM), fat free mass (kg) (FFM), and impedance resistance (Ohms). Indices of FM and FFM were calculated by dividing the BIA measurements by height squared (m2), and percent FFM was calculated by dividing FFM by the sum of FM and FFM. Analyses used log linear models adjusting for age (continuous), race/ethnicity, education, physical activity (dichotomized at the median), hypertension, diabetes, and total calories (continuous). Results: Just 33% of women and 50% of men reported meeting the RDA for protein. Both fat free mass index (FFMI) and fat mass index (FMI) were negatively associated with meeting the RDA for protein (Women: FFMI -1.78, 95% CI [-2.24, -1.33]; FMI -4.12, 95% CI [-4.82, -3.42]; Men: FFMI -1.62, 95% CI [-2.32, -0.93]; FMI -1.80, 95% CI [-2.70, -0.89]). After accounting for confounders, women and men consuming at least 0.8 g/kg/day had a 6.2% (95% CI: 5.0%, 7.4%) and a 3.2% (95% CI: 1.1%, 5.3%) higher percent fat free mass, respectively. Conclusion: FFM, FFMI, FM, and FMI were inversely related to meeting the RDA for protein. Meeting the RDA for protein of at least 0.8 g/kg/day was associated with a higher percentage of fat free mass among older adults. These results suggest that meeting the protein recommendation of at least 0.8 g/kg/day may help to promote lower overall body mass, primarily through loss of fat mass rather than lean mass.
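
    The indices used in the study are simple ratios, as the following sketch shows. The numbers below are invented example values, not study data; the formulas follow the definitions given in the abstract (index = mass divided by height squared, percent FFM = FFM / (FFM + FM), and the 0.8 g/kg/day RDA threshold).

      # Illustrative calculation of the body-composition indices described in
      # the abstract; all input values are made up for the example.
      def body_composition_indices(ffm_kg, fm_kg, height_m,
                                   protein_g_per_day, weight_kg):
          ffmi = ffm_kg / height_m ** 2                # fat free mass index (kg/m2)
          fmi = fm_kg / height_m ** 2                  # fat mass index (kg/m2)
          pct_ffm = 100.0 * ffm_kg / (ffm_kg + fm_kg)  # percent fat free mass
          protein_per_kg = protein_g_per_day / weight_kg
          meets_rda = protein_per_kg >= 0.8            # RDA: at least 0.8 g/kg/day
          return ffmi, fmi, pct_ffm, protein_per_kg, meets_rda

      ffmi, fmi, pct_ffm, p_per_kg, ok = body_composition_indices(
          ffm_kg=45.0, fm_kg=25.0, height_m=1.62,
          protein_g_per_day=52.0, weight_kg=70.0)
      print(f"FFMI={ffmi:.1f} FMI={fmi:.1f} %FFM={pct_ffm:.1f} "
            f"protein={p_per_kg:.2f} g/kg/day meets RDA={ok}")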

  6. Zusammenführen was zusammengehört – Intellektuelle und automatische Erfassung von Werken nach RDA

    Directory of Open Access Journals (Sweden)

    Barbara Pfeifer

    2016-12-01

    Full Text Available This article explains how works are catalogued at the Deutsche Nationalbibliothek (German National Library) in the first implementation step of RDA, discusses experiences with the general approach as well as some special rules, and points out future prospects. When resources are catalogued at the Deutsche Nationalbibliothek, the work entity is always taken into account, as the work elements are core elements of the new standard. When monographs are catalogued intellectually, the work is recorded either as a link to an authority record or as a textual string within the description of the resource. However, current practice also builds on future automatic clustering of works. The state of the art of the clustering algorithms and of the test runs carried out for work clustering is described, as are the resulting decisions and the further perspective. An essential question is whether, and under what conditions, authority records for the Gemeinsame Normdatei (GND, Integrated Authority File) can be generated on the basis of the work clusters, so that they can be used cooperatively in the German-speaking area and linked in the Linked Data cloud.

  7. Recent trends in coding theory and its applications

    CERN Document Server

    Li, Wen-Ching Winnie

    2007-01-01

    Coding theory draws on a remarkable selection of mathematical topics, both pure and applied. The various contributions in this volume introduce coding theory and its most recent developments and applications, emphasizing both mathematical and engineering perspectives on the subject. This volume covers four important areas in coding theory: algebraic geometry codes, graph-based codes, space-time codes, and quantum codes. Both students and seasoned researchers will benefit from the extensive and self-contained discussions of the development and recent progress in these areas.

  8. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
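
    As a small, self-contained illustration of the source-coding ideas such a text covers (a generic example, not taken from the book), the snippet below computes the Shannon entropy of a toy source and builds a Huffman code for it, so that the average code length can be compared against the entropy lower bound.

      import heapq
      import math
      from collections import Counter

      def shannon_entropy(probs):
          """Entropy in bits per symbol of a discrete distribution."""
          return -sum(p * math.log2(p) for p in probs if p > 0)

      def huffman_code(freqs):
          """Build a prefix-free code {symbol: bitstring} from symbol frequencies."""
          heap = [[w, i, {sym: ""}] for i, (sym, w) in enumerate(freqs.items())]
          heapq.heapify(heap)
          counter = len(heap)
          while len(heap) > 1:
              w1, _, code1 = heapq.heappop(heap)
              w2, _, code2 = heapq.heappop(heap)
              merged = {s: "0" + c for s, c in code1.items()}
              merged.update({s: "1" + c for s, c in code2.items()})
              heapq.heappush(heap, [w1 + w2, counter, merged])
              counter += 1
          return heap[0][2]

      text = "abracadabra"
      freqs = Counter(text)
      total = sum(freqs.values())
      code = huffman_code(freqs)
      avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / total
      entropy = shannon_entropy([n / total for n in freqs.values()])
      print(code)
      print(f"entropy = {entropy:.3f} bits/symbol, Huffman average = {avg_len:.3f}")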

  9. Focus Group Research on the Implications of Adopting the Unified English Braille Code

    Science.gov (United States)

    Wetzel, Robin; Knowlton, Marie

    2006-01-01

    Five focus groups explored concerns about adopting the Unified English Braille Code. The consensus was that while the proposed changes to the literary braille code would be minor, those to the mathematics braille code would be much more extensive. The participants emphasized that "any code that reduces the number of individuals who can access…

  10. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.
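
    The following toy Monte Carlo sketch is only meant to illustrate the sampling idea behind such codes; it is deliberately oversimplified (a point source at the centre of a homogeneous sphere, a first-collision estimate only, no scattering or energy deposition, and invented parameter values) and is not the phantom code described above.

      import math
      import random

      def first_collision_fraction(mu_per_cm, radius_cm, n_samples=100_000, seed=1):
          """Fraction of photons from a central point source whose first
          interaction occurs inside a homogeneous sphere of given radius."""
          rng = random.Random(seed)
          inside = 0
          for _ in range(n_samples):
              # Free path length sampled from an exponential distribution.
              path = -math.log(1.0 - rng.random()) / mu_per_cm
              if path <= radius_cm:
                  inside += 1
          return inside / n_samples

      estimate = first_collision_fraction(mu_per_cm=0.1, radius_cm=10.0)
      exact = 1.0 - math.exp(-0.1 * 10.0)       # analytic value: 1 - exp(-mu*R)
      print(f"Monte Carlo: {estimate:.3f}   analytic: {exact:.3f}")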

  11. El internacionalismo, la solidaridad y el interés mutuo: encuentros entre cubanos, africanos, y alemanes de la RDA

    Directory of Open Access Journals (Sweden)

    Berthold Unfried

    Full Text Available Cuban military assistance in Angola and Ethiopia is well known internationally. However, the history of civilian assistance has not yet been written in its full scope. This contribution sketches a comparison between Cuban "internationalism" and the "solidarity" or "socialist aid" sector of the German Democratic Republic (GDR). What did these mean, and what were the corresponding terminologies and practices of the Germans? What types of internationalism and solidarity in practice can be identified? What were the relations of Cubans and Germans with their Ethiopian and Angolan counterparts and with the people of those countries, and what was transferred in this triangular encounter? The contribution is based on material from Cuban and German archives, as well as on interviews.

  12. Coding and Comprehension in Skilled Reading and Implications for Reading Instruction.

    Science.gov (United States)

    Perfetti, Charles A.; Lesgold, Alan M.

    A view of skilled reading is suggested that emphasizes an intimate connection between coding and comprehension. It is suggested that skilled comprehension depends on a highly refined facility for generating and manipulating language codes, especially at the phonetic/articulatory level. The argument is developed that decoding expertise should be a…

  13. Enhancement of Unequal Error Protection Properties of LDPC Codes

    Directory of Open Access Journals (Sweden)

    Poulliat Charly

    2007-01-01

    Full Text Available It has been widely recognized in the literature that irregular low-density parity-check (LDPC) codes naturally exhibit an unequal error protection (UEP) behavior. In this paper, we propose a general method to emphasize and control the UEP properties of LDPC codes. The method is based on a hierarchical optimization of the bit-node irregularity profile for each sensitivity class within the codeword, maximizing the average bit-node degree while guaranteeing a minimum degree as high as possible. We show that this optimization strategy is efficient, since the codes that we optimize show better UEP capabilities than codes optimized for the additive white Gaussian noise channel.

  14. Evaluation of Code Blue Implementation Outcomes

    Directory of Open Access Journals (Sweden)

    Bengü Özütürk

    2015-09-01

    Full Text Available Aim: In this study, we aimed to emphasize the importance of Code Blue implementation and to determine deficiencies in this regard. Methods: After obtaining ethics committee approval, data from 225 Code Blue calls between 2012 and January 2014 were retrospectively analyzed. The age and gender of the patients, the date and time of the call, the units giving the Code Blue, the time needed for the Code Blue team to arrive, the rates of false Code Blue calls, the reasons for Code Blue calls and patient outcomes were investigated. Results: A total of 225 patients (149 male, 76 female) were evaluated in the study. The mean age of the patients was 54.1 years. 142 (67.2%) Code Blue calls occurred after hours and were given by the emergency unit. The mean time for the Code Blue team to arrive was 1.10 minutes. Spontaneous circulation was achieved in 137 patients (60.8%); 88 (39.1%) died. The most commonly identified possible causes were of cardiac origin. Conclusion: This study showed that Code Blue implementation by a professional team within an efficient and targeted time increases the survival rate. Therefore, we conclude that the application of Code Blue carried out by a trained team is an essential standard in hospitals. (The Medical Bulletin of Haseki 2015; 53:204-8)

  15. Flexible digital modulation and coding synthesis for satellite communications

    Science.gov (United States)

    Vanderaar, Mark; Budinger, James; Hoerig, Craig; Tague, John

    1991-01-01

    An architecture and a hardware prototype of a flexible trellis modem/codec (FTMC) transmitter are presented. The theory of operation is built upon a pragmatic approach to trellis-coded modulation that emphasizes power and spectral efficiency. The system incorporates programmable modulation formats, variations of trellis coding, digital baseband pulse-shaping, and digital channel precompensation. The modulation formats examined include (uncoded and coded) binary phase shift keying (BPSK), quaternary phase shift keying (QPSK), octal phase shift keying (8PSK), 16-ary quadrature amplitude modulation (16-QAM), and quadrature quadrature phase shift keying (Q squared PSK) at programmable rates up to 20 megabits per second (Mbps). The FTMC is part of the developing test bed to quantify modulation and coding concepts.
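
    As a minimal, generic illustration of mapping bits onto one of the listed formats (a sketch of the general technique, unrelated to the FTMC hardware design described above), the snippet below Gray-maps pairs of bits onto unit-energy QPSK symbols.

      import numpy as np

      # Gray-coded QPSK: each pair of bits selects one of four unit-energy points.
      GRAY_QPSK = {
          (0, 0): ( 1 + 1j) / np.sqrt(2),
          (0, 1): (-1 + 1j) / np.sqrt(2),
          (1, 1): (-1 - 1j) / np.sqrt(2),
          (1, 0): ( 1 - 1j) / np.sqrt(2),
      }

      def qpsk_modulate(bits):
          """Map an even-length bit sequence to complex QPSK symbols."""
          assert len(bits) % 2 == 0, "QPSK consumes bits two at a time"
          return np.array([GRAY_QPSK[(bits[i], bits[i + 1])]
                           for i in range(0, len(bits), 2)])

      print(qpsk_modulate([0, 0, 1, 1, 0, 1, 1, 0]))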

  16. The Pierre Auger Research and Development Array (RDA in southeastern Colorado – R&D for a giant ground array

    Directory of Open Access Journals (Sweden)

    Thompson J.

    2013-06-01

    Full Text Available The Pierre Auger Research and Development Array (RDA) was originally designed to be the precursor of the northern Auger observatory, a hybrid array of 4400 surface detector stations and 39 fluorescence telescopes deployed over 20,000 square kilometers. It is conceived as a test bed aiming at validating an improved and more cost-effective 1-PMT surface detector design and a new peer-to-peer communication system. The array of ten surface detector stations and ten communication-only stations is currently being deployed in southeastern Colorado and will be operated at least until late 2013. It is configured in such a way that it allows testing of a new peer-to-peer communication protocol, as well as a new surface detector electronics design with a larger dynamic range aiming at reducing the distance from the shower core where saturation is observed. All these developments are expected in the short term to improve the performance of the Pierre Auger Observatory and enable future enhancements. In the longer term, it is hoped that some of these new developments may contribute to the design of a next-generation giant ground array.

  17. Cooperation of experts' opinion, experiment and computer code development

    International Nuclear Information System (INIS)

    Wolfert, K.; Hicken, E.

    The connection between code development, code assessment and confidence in the analysis of transients will be discussed. In this manner, the major sources of errors in the codes and errors in applications of the codes will be shown. Standard problem results emphasize that, in order to have confidence in licensing statements, the codes must be physically realistic and the code user must be qualified and experienced. We will discuss why there is disagreement between the licensing authority and the vendor concerning assessment of the fulfillment of safety goal requirements. The answer to the question lies in the different confidence levels of the assessment of transient analysis. It is expected that a decrease in the disagreement will result from an increased confidence level. Strong efforts will be made to increase this confidence level through improvements in the codes, experiments and related organizational structures. Because of the low probability of loss-of-coolant accidents in the nuclear industry, assessment must rely on analytical techniques and experimental investigations.

  18. The changing emphases in health physics

    International Nuclear Information System (INIS)

    Denham, D.H.; Kathren, R.L.

    1987-11-01

    This paper explores the changing emphases in health physics as evidenced by the subject matter of published papers in four primary English-language journals of interest to health physicists. Articles from each journal were first grouped by subject and date of publication and were then compiled according to the list of professional domains practiced by health physicists. Five domains of practice were examined: measurements, including dosimetry and environmental monitoring; regulations and standards; facilities and equipment, including shielding, ventilation, and instrumentation; operations and procedures; and education and training.

  19. Implementing a modular system of computer codes

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.

    1983-07-01

    A modular computation system has been developed for nuclear reactor core analysis. The codes can be applied repeatedly in blocks without extensive user input data, as needed for reactor history calculations. The primary control options over the calculational paths and task assignments within the codes are blocked separately from other instructions, admitting ready access by user input instruction or directions from automated procedures and promoting flexible and diverse applications at minimum application cost. Data interfacing is done under formal specifications, with data files manipulated by an informed manager. This report emphasizes the system aspects and the development of useful capability; it is intended to be informative and useful to anyone developing a modular code system of much sophistication. Overall, this report summarizes in a general way the many factors and difficulties that are faced in making reactor core calculations, based on the experience of the authors. It provides the background against which work on HTGR reactor physics is being carried out.

  20. Advertising energy saving programs: The potential environmental cost of emphasizing monetary savings.

    Science.gov (United States)

    Schwartz, Daniel; Bruine de Bruin, Wändi; Fischhoff, Baruch; Lave, Lester

    2015-06-01

    Many consumers have monetary or environmental motivations for saving energy. Indeed, saving energy produces both monetary benefits, by reducing energy bills, and environmental benefits, by reducing carbon footprints. We examined how consumers' willingness and reasons to enroll in energy-savings programs are affected by whether advertisements emphasize monetary benefits, environmental benefits, or both. From a normative perspective, having two noteworthy kinds of benefit should not decrease a program's attractiveness. In contrast, psychological research suggests that adding external incentives to an intrinsically motivating task may backfire. To date, however, it remains unclear whether this is the case when both extrinsic and intrinsic motivations are inherent to the task, as with energy savings, and whether removing explicit mention of extrinsic motivation will reduce its importance. We found that emphasizing a program's monetary benefits reduced participants' willingness to enroll. In addition, participants' explanations about enrollment revealed less attention to environmental concerns when programs emphasized monetary savings, even when environmental savings were also emphasized. We found equal attention to monetary motivations in all conditions, revealing an asymmetric attention to monetary and environmental motives. These results also provide practical guidance regarding the positioning of energy-saving programs: emphasize intrinsic benefits; the extrinsic ones may speak for themselves.

  1. The Protection of the Right to Life in the New Criminal Code

    Directory of Open Access Journals (Sweden)

    Lavinia Mihaela Vladila

    2011-05-01

    Full Text Available The present article illustrates general aspects of the current regulation on the protection of the right to life, in comparison with the new regulation, which is analyzed more carefully. The paper is based on a study of the new Criminal Code, emphasizing the most important differences between the current regulation and the new Criminal Code, and on the few texts elaborated in this area. The approach to the subject is a more practical one: because few texts have been written about the new Criminal Code, legal scholars will need to know, at the moment of its entry into force, the differences and innovations it brings. The result is meant to increase the understanding of the new text and to enrich the analysis and synthesis of the new Criminal Code.

  2. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    Glaeser, H.

    1995-01-01

    The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond-design-basis accidents. A systematic validation of ATHLET is based on a well-balanced set of integral and separate-effects tests emphasizing the German combined Emergency Core Cooling (ECC) injection system. When best-estimate codes are used to predict reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS in which the computational effort is independent of the number of uncertain parameters.
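
    The abstract notes that the computational effort of the GRS uncertainty method is independent of the number of uncertain parameters. A property of this kind is usually obtained with order-statistics (Wilks-type) tolerance limits; the sketch below assumes that interpretation for illustration only (the abstract itself does not spell out the formula) and computes the smallest number of code runs n such that the largest observed output bounds a chosen quantile of the result with a chosen confidence, using the one-sided, first-order criterion 1 - coverage**n >= confidence.

      import math

      # Wilks-type sample-size calculation (an assumption for illustration; the
      # abstract does not state the formula used in the GRS method).
      def wilks_one_sided_runs(coverage=0.95, confidence=0.95):
          """Smallest n such that the maximum of n runs bounds the `coverage`
          quantile of the output with probability `confidence` (first order)."""
          # Criterion: 1 - coverage**n >= confidence
          return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

      print(wilks_one_sided_runs())            # 59 runs for 95% / 95%
      print(wilks_one_sided_runs(0.95, 0.99))  # 90 runs for 95% coverage, 99% confidence

    Because the required run count depends only on the requested coverage and confidence, it stays the same whether two or two hundred input parameters are treated as uncertain, which is the property highlighted in the abstract.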

  3. Design criteria and pressure vessel codes - an American view

    International Nuclear Information System (INIS)

    Tuppeny, W.H.

    1975-01-01

    To the pressure vessel designer, codes and criteria represent the common ground where the stress analyst and the metallurgist must interact and evolve rules and procedures which will ensure safety and open-ended responsiveness to technological, economic, and environmental change. The paper briefly discusses the evolution and rationale behind the current ASME code sections, emphasizing those portions applicable to designs operating in the creep range. The author then proposes a plan of action so that the analysts and materials people can make optimum use of time and resources, and evolve data and design criteria which will be responsive to changing technology and the economic and safety requirements of the future.

  4. Fire-accident analysis code (FIRAC) verification

    International Nuclear Information System (INIS)

    Nichols, B.D.; Gregory, W.S.; Fenton, D.L.; Smith, P.R.

    1986-01-01

    The FIRAC computer code predicts fire-induced transients in nuclear fuel cycle facility ventilation systems. FIRAC calculates simultaneously the gas-dynamic, material transport, and heat transport transients that occur in any arbitrarily connected network system subjected to a fire. The network system may include ventilation components such as filters, dampers, ducts, and blowers. These components are connected to rooms and corridors to complete the network for moving air through the facility. An experimental ventilation system has been constructed to verify FIRAC and other accident analysis codes. The design emphasizes network system characteristics and includes multiple chambers, ducts, blowers, dampers, and filters. A large industrial heater and a commercial dust feeder are used to inject thermal energy and aerosol mass. The facility is instrumented to measure volumetric flow rate, temperature, pressure, and aerosol concentration throughout the system. Aerosol release rates and mass accumulation on filters also are measured. We have performed a series of experiments in which a known rate of thermal energy is injected into the system. We then simulated this experiment with the FIRAC code. This paper compares and discusses the gas-dynamic and heat transport data obtained from the ventilation system experiments with those predicted by the FIRAC code. The numerically predicted data generally are within 10% of the experimental data.

  5. Runtime Detection of C-Style Errors in UPC Code

    Energy Technology Data Exchange (ETDEWEB)

    Pirkelbauer, P; Liao, C; Panas, T; Quinlan, D

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  6. PREREM: an interactive data preprocessing code for INREM II. Part I: user's manual. Part II: code structure

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, M.T.; Fields, D.E.

    1981-05-01

    PREREM is an interactive computer code developed as a data preprocessor for the INREM-II (Killough, Dunning, and Pleasant, 1978a) internal dose program. PREREM is intended to provide easy access to current and self-consistent nuclear decay and radionuclide-specific metabolic data sets. Provision is made for revision of metabolic data, and the code is intended for both production and research applications. Documentation for the code is in two parts. Part I is a user's manual which emphasizes interpretation of program prompts and choice of user input. Part II stresses internal structure and flow of program control and is intended to assist the researcher who wishes to revise or modify the code or add to its capabilities. PREREM is written for execution on a Digital Equipment Corporation PDP-10 System and much of the code will require revision before it can be run on other machines. The source program length is 950 lines (116 blocks) and computer core required for execution is 212 K bytes. The user must also have sufficient file space for metabolic and S-factor data sets. Further, 64 100 K byte blocks of computer storage space are required for the nuclear decay data file. Computer storage space must also be available for any output files produced during the PREREM execution. 9 refs., 8 tabs.

  7. Iterative Decoding of Concatenated Codes: A Tutorial

    Directory of Open Access Journals (Sweden)

    Phillip A. Regalia

    2005-05-01

    Full Text Available The turbo decoding algorithm of a decade ago constituted a milestone in error-correction coding for digital communications, and has inspired extensions to generalized receiver topologies, including turbo equalization, turbo synchronization, and turbo CDMA, among others. Despite an accrued understanding of iterative decoding over the years, the “turbo principle” remains elusive to master analytically, thereby inciting interest from researchers outside the communications domain. In this spirit, we develop a tutorial presentation of iterative decoding for parallel and serial concatenated codes, in terms hopefully accessible to a broader audience. We motivate iterative decoding as a computationally tractable attempt to approach maximum-likelihood decoding, and characterize fixed points in terms of a “consensus” property between constituent decoders. We review how the decoding algorithm for both parallel and serial concatenated codes coincides with an alternating projection algorithm, which allows one to identify conditions under which the algorithm indeed converges to a maximum-likelihood solution, in terms of particular likelihood functions factoring into the product of their marginals. The presentation emphasizes a common framework applicable to both parallel and serial concatenated codes.
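
    As a concrete companion to the tutorial's description of extrinsic-information exchange between constituent decoders, the following minimal Python sketch iteratively decodes a toy product code whose rows and columns are single-parity-check constraints. It is an illustration written for this summary, not code from the paper; the channel model, block size, and iteration count are arbitrary assumptions.

        import numpy as np

        def spc_extrinsic(llrs):
            """Extrinsic LLRs for a single-parity-check constraint (even parity).
            For each bit, combine the tanh of all *other* bits in the check."""
            t = np.tanh(llrs / 2.0)
            ext = np.empty_like(llrs)
            for i in range(len(llrs)):
                prod = np.prod(np.delete(t, i))
                # clip to avoid atanh(+/-1)
                ext[i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
            return ext

        # Toy product code: 2x2 information bits d, one parity per row and per column.
        rng = np.random.default_rng(0)
        d = rng.integers(0, 2, size=(2, 2))
        row_parity = d.sum(axis=1) % 2          # one parity bit per row
        col_parity = d.sum(axis=0) % 2          # one parity bit per column

        # BPSK over an AWGN channel: bit 0 -> +1, bit 1 -> -1, noise std sigma.
        sigma = 0.8
        tx = lambda bits: 1.0 - 2.0 * bits
        noisy = lambda bits: tx(bits) + sigma * rng.normal(size=np.shape(bits))
        llr = lambda y: 2.0 * y / sigma**2      # channel LLRs (positive favours bit 0)

        L_d, L_rp, L_cp = llr(noisy(d)), llr(noisy(row_parity)), llr(noisy(col_parity))

        ext_from_cols = np.zeros_like(L_d)      # a-priori information handed to the row decoder
        for it in range(5):
            # Row decoder: each row plus its parity bit forms one SPC check.
            ext_from_rows = np.zeros_like(L_d)
            for r in range(2):
                check = np.append(L_d[r, :] + ext_from_cols[r, :], L_rp[r])
                ext_from_rows[r, :] = spc_extrinsic(check)[:2]
            # Column decoder: each column plus its parity bit forms one SPC check.
            ext_from_cols = np.zeros_like(L_d)
            for c in range(2):
                check = np.append(L_d[:, c] + ext_from_rows[:, c], L_cp[c])
                ext_from_cols[:, c] = spc_extrinsic(check)[:2]

        posterior = L_d + ext_from_rows + ext_from_cols
        decoded = (posterior < 0).astype(int)
        print("sent:\n", d, "\ndecoded:\n", decoded)

    Each constituent decoder only passes on extrinsic information, never its own input, which is the "consensus" mechanism the tutorial analyzes.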

  8. CONSIDERATIONS CONCERNING THE RECOGNITION OF FOREIGN JUDGMENTS IN THE NEW BRAZILIAN CIVIL PROCEDURE CODE

    Directory of Open Access Journals (Sweden)

    Humberto Dalla Bernardina de Pinho

    2016-06-01

    Full Text Available The present study aims to critically analyze the recognition of foreign judgments in the new Brazilian Civil Procedure Code, highlighting the changes brought by the new legislation. International judicial cooperation was emphasized in the new Code, which was made clear by the provision of the institute called “direct aid” (“auxílio direto”), the detailed regulation of the recognition of foreign judgments, including the possibility of concession of interim measures, and the recognition of foreign interim measures.

  9. The Cost of Enforcing Building Energy Codes: Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Vine, Ed [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sturges, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rosenquist, Greg [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-04-01

    The purpose of this literature review is to summarize key findings regarding the costs associated with enforcing building energy code compliance—primarily focusing on costs borne by local government. The review takes into consideration over 150 documents that discuss, to some extent, code enforcement. This review emphasizes those documents that specifically focus on costs associated with energy code enforcement. Given the low rates of building energy code compliance that have been reported in existing studies, as well as the many barriers to both energy code compliance and enforcement, this study seeks to identify the costs of initiatives to improve compliance and enforcement. Costs are reported primarily as presented in the original source. Some costs are given on a per home or per building basis, and others are provided for jurisdictions of a certain size. This literature review gives an overview of state-based compliance rates, barriers to code enforcement, and U.S. Department of Energy (DOE) and key stakeholder involvement in improving compliance with building energy codes. In addition, the processes and costs associated with compliance and enforcement of building energy codes are presented. The second phase of this study, which will be presented in a different report, will consist of surveying 34 experts in the building industry at the national and state or local levels in order to obtain additional cost information, building on the findings from the first phase, as well as recommendations for where to most effectively spend money on compliance and enforcement.

  10. Insect olfactory coding and memory at multiple timescales.

    Science.gov (United States)

    Gupta, Nitin; Stopfer, Mark

    2011-10-01

    Insects can learn, allowing them great flexibility for locating seasonal food sources and avoiding wily predators. Because insects are relatively simple and accessible to manipulation, they provide good experimental preparations for exploring mechanisms underlying sensory coding and memory. Here we review how the intertwining of memory with computation enables the coding, decoding, and storage of sensory experience at various stages of the insect olfactory system. Individual parts of this system are capable of multiplexing memories at different timescales, and conversely, memory on a given timescale can be distributed across different parts of the circuit. Our sampling of the olfactory system emphasizes the diversity of memories, and the importance of understanding these memories in the context of computations performed by different parts of a sensory system. Published by Elsevier Ltd.

  11. The GBS code for tokamak scrape-off layer simulations

    International Nuclear Information System (INIS)

    Halpern, F.D.; Ricci, P.; Jolliet, S.; Loizu, J.; Morales, J.; Mosetto, A.; Musil, F.; Riva, F.; Tran, T.M.; Wersal, C.

    2016-01-01

    We describe a new version of GBS, a 3D global, flux-driven plasma turbulence code to simulate the turbulent dynamics in the tokamak scrape-off layer (SOL), superseding the code presented by Ricci et al. (2012) [14]. The present work is driven by the objective of studying SOL turbulent dynamics in medium size tokamaks and beyond with a high-fidelity physics model. We emphasize an intertwining framework of improved physics models and the computational improvements that allow them. The model extensions include neutral atom physics, finite ion temperature, the addition of a closed field line region, and a non-Boussinesq treatment of the polarization drift. GBS has been completely refactored with the introduction of a 3-D Cartesian communicator and a scalable parallel multigrid solver. We report dramatically enhanced parallel scalability, with the possibility of treating electromagnetic fluctuations very efficiently. The method of manufactured solutions as a verification process has been carried out for this new code version, demonstrating the correct implementation of the physical model.
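
    The verification approach mentioned above, the method of manufactured solutions, can be illustrated independently of GBS. The sketch below (a generic example written for this summary, not GBS code) manufactures an exact solution for a one-dimensional Poisson problem, derives the matching source term, and checks that a standard second-order finite-difference solver reproduces the expected convergence order.

        import numpy as np

        def solve_poisson(n):
            """Solve -u'' = f on (0,1), u(0)=u(1)=0, with the manufactured
            solution u(x) = sin(pi x), hence f(x) = pi**2 sin(pi x)."""
            h = 1.0 / (n + 1)
            x = np.linspace(h, 1.0 - h, n)            # interior nodes
            f = np.pi**2 * np.sin(np.pi * x)          # manufactured source term
            # Standard second-order finite-difference matrix for -u''.
            A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
                 - np.diag(np.ones(n - 1), -1)) / h**2
            u = np.linalg.solve(A, f)
            err = np.max(np.abs(u - np.sin(np.pi * x)))  # compare with exact solution
            return h, err

        (h1, e1), (h2, e2) = solve_poisson(40), solve_poisson(80)
        order = np.log(e1 / e2) / np.log(h1 / h2)
        print(f"observed order of accuracy ~ {order:.2f}")   # expect ~2 for this scheme

    If the observed order matches the formal order of the scheme, the implementation of the discretized operators is considered verified, which is the role this procedure plays for the new GBS version.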

  12. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    International Nuclear Information System (INIS)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun; Sasaki, Masahide

    2004-01-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques

  13. ITER Dynamic Tritium Inventory Modeling Code

    International Nuclear Information System (INIS)

    Cristescu, Ioana-R.; Doerr, L.; Busigin, A.; Murdoch, D.

    2005-01-01

    A tool for tritium inventory evaluation within each sub-system of the fuel cycle of ITER is vital, both for the process of licensing ITER and for its operation. It is very likely that measurements of total tritium inventories may not be possible for all sub-systems; however, tritium accounting may be achieved by modeling its hold-up within each sub-system and by validating these models in real time against the monitored flows and tritium streams between the systems. To get reliable results, accurate dynamic modeling of the tritium content in each sub-system is necessary. In order to optimize the configuration and operation of the ITER fuel cycle, a dynamic fuel cycle model was developed progressively in the decade up to 2000-2001. As the design of some sub-systems of the fuel cycle (i.e. vacuum pumping, Neutral Beam Injectors (NBI)) has substantially progressed meanwhile, a new code has been developed under a different platform to incorporate these modifications. The new code takes over the models and algorithms for some subsystems, such as the Isotope Separation System (ISS); where simplified models had previously been considered, more detailed ones have been introduced, as for the Water Detritiation System (WDS). To reflect all these changes, the new code developed within the EU participating team was named TRIMO (Tritium Inventory Modeling), to emphasize the use of the code for assessing the tritium inventory within ITER.
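
    The kind of dynamic hold-up modeling described above can be pictured with a deliberately simple two-compartment sketch. It is a hypothetical illustration, not part of TRIMO; the feed and transfer rates are invented numbers, and only the tritium decay constant is a physical value.

        import numpy as np

        LAMBDA = np.log(2) / (12.32 * 365.25 * 24 * 3600)   # tritium decay constant, 1/s

        def step(inv, feed, transfer, dt):
            """Advance a two-box tritium inventory model by one explicit Euler step.
            inv      : [I1, I2] inventories in grams
            feed     : external feed rate into box 1, g/s
            transfer : fractional transfer rates (1/s), box1->box2 and box2->box1
            """
            k12, k21 = transfer
            dI1 = feed - k12 * inv[0] + k21 * inv[1] - LAMBDA * inv[0]
            dI2 = k12 * inv[0] - k21 * inv[1] - LAMBDA * inv[1]
            return inv + dt * np.array([dI1, dI2])

        inv = np.array([0.0, 0.0])                  # start with empty sub-systems
        dt, t_end = 60.0, 30 * 24 * 3600            # one-minute steps over 30 days
        for _ in range(int(t_end / dt)):
            inv = step(inv, feed=1e-5, transfer=(2e-6, 1e-6), dt=dt)
        print("inventories after 30 days [g]:", inv)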

  14. The Era of Dušan’s Code

    Directory of Open Access Journals (Sweden)

    Đorđe Bubalo

    2015-12-01

    At a point no earlier than the second half of the 17th century, a separate recension of Dušan’s Code was created in order to facilitate the adaptation and use of its legal material for the regulation of those legal relations that the Serbian ecclesiastical hierarchy or the local self-governing authorities had kept within their own jurisdictions under foreign rule. The majority of the copies of this new, younger recension was created and enacted in the circle of the Serbian ecclesiastical hierarchy and the subjects of the Habsburg monarchy after the Great Exodus. Not only did the Code provide positive legal material, but its mere existence and authority also helped the efforts of the Serbian hierarchy in the Habsburg monarchy to emphasize the tradition of Serbian statehood, as well as its tendencies toward a renewal of state independence.

  15. Mongolia; Report on the Observance of Standards and Codes-Fiscal Transparency

    OpenAIRE

    International Monetary Fund

    2001-01-01

    This report provides an assessment of fiscal transparency practices in Mongolia against the requirements of the IMF Code of Good Practices on Fiscal Transparency. This paper analyzes the government's participation in the financial and nonfinancial sectors of the economy. Executive Directors appreciated the achievements, and stressed the need for improvements in the areas of fiscal transparency. They emphasized the need for addressing weaknesses of fiscal data, maintaining a legal framework fo...

  16. The Usage of 2D Codes in Marketing Practices

    Directory of Open Access Journals (Sweden)

    Toni Podmanicki

    2011-07-01

    Full Text Available Barcodes, which are used for the labelling and identification of products, have been used as the foundation for the development of new symbols, two-dimensional barcodes (usually called 2D codes). These codes are capable of storing large amounts of data in a small area, and data stored in them can be read by means of mobile devices. They usually contain information such as web addresses, text, contacts and similar data that encourage users to interact in order to obtain the desired information, entertainment, discount, reservation, and even do their shopping. The possibility of connecting the physical and digital world by means of 2D codes has led marketing professionals to face new challenges in the development of strategies in mobile marketing. Many companies recognized the potential of the above technology very early, in its initial phase, and they use it now in their activities. This paper aims to emphasize the importance of knowing this technology and its advantages by providing examples in marketing practices.
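
    For readers who want to reproduce the basic interaction the article describes, a 2D code carrying a web address can be generated in a few lines, for example with the third-party Python package qrcode. The package choice, the URL, and the file name are assumptions of this sketch, not something discussed in the article.

        import qrcode  # third-party package: pip install qrcode[pil]

        # Encode a web address; scanning the image with a phone opens the URL.
        img = qrcode.make("https://example.com/summer-promotion")
        img.save("promotion_qr.png")
        print("saved promotion_qr.png")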

  17. Pronouncements and Denunciations: An Analysis of State Press Association Ethics Codes from the 1920s.

    Science.gov (United States)

    Cronin, Mary M.; McPherson, James B.

    1995-01-01

    Discusses how four situational concerns (the rise of press agents, the fallout from World War I propaganda, sensationalism's resurgence, and editorial independence) contributed to a noticeable decline of the public's trust in the press. Shows how, as a result, 12 state press association ethics codes from the 1920s consistently emphasized the…

  18. Resource description and access 2013 revision

    CERN Document Server

    2013-01-01

    This e-book contains the 2013 Revision of RDA: Resource Description and Access, and includes the July 2013 Update. This e-book offers links within the RDA text and the capability of running rudimentary searches of RDA, but please note that this e-book does not have the full range of content or functionality provided by the subscription product RDA Toolkit. Included: A full accumulation of RDA- the revision contains a full set of all current RDA instructions. It replaces the previous version of RDA Print as opposed to being an update packet to that version. RDA has gone through many changes sin

  19. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  20. Introduzione

    Directory of Open Access Journals (Sweden)

    Mauro Guerrini

    2018-01-01

    third was as follows: "What are the main challenges and the principal questions in the writing of an international cataloguing code, in the context of an information environment in rapid evolution?" Her response was: "Assuring cultural diversity may be the greatest challenge. Keeping our users in mind as the focus for our work, it is important to provide bibliographic and authority information that meets their needs and is presented in a form that is easily understood. That means writing in their language, in their script, and using terminology that they understand."[1] Barbara continued that there need to be options for cultural diversity and naming conventions used in different countries. Indeed, the international cataloging code RDA (Resource Description and Access) likewise can have local adjustments to meet local needs of users. As an example: since 2011, librarians in the German-speaking communities (or DACH) have been engaged in a collaborative project involving Germany, Austria and Switzerland.[2] They have agreed to cooperate fully in the drafting and evolution of international guidelines without thereby being "traitors" to their long and prestigious cataloging tradition, and have become active partners of the RSC (RDA Steering Committee), with a seat on that Committee. Librarians in other European countries have adopted RDA or are working with the RSC in this direction. For example, starting 1st January 2019, Spain will also adopt the RDA guidelines (their decision was issued on 4th November 2016). In 2017, the network URBE (Roman Ecclesiastical Libraries Union) and the Vatican Library adopted RDA. We are in Italy, and I pose a specific question: does it still make sense to have a national code in a global context (for better or for worse) in which the search of bibliographic resources can be done from anywhere in the world to any library? Or, is it more useful and preferable to participate in the drafting of international guidelines, namely RDA

  1. Dynamic divisive normalization predicts time-varying value coding in decision-related circuits.

    Science.gov (United States)

    Louie, Kenway; LoFaro, Thomas; Webb, Ryan; Glimcher, Paul W

    2014-11-26

    Normalization is a widespread neural computation, mediating divisive gain control in sensory processing and implementing a context-dependent value code in decision-related frontal and parietal cortices. Although decision-making is a dynamic process with complex temporal characteristics, most models of normalization are time-independent and little is known about the dynamic interaction of normalization and choice. Here, we show that a simple differential equation model of normalization explains the characteristic phasic-sustained pattern of cortical decision activity and predicts specific normalization dynamics: value coding during initial transients, time-varying value modulation, and delayed onset of contextual information. Empirically, we observe these predicted dynamics in saccade-related neurons in monkey lateral intraparietal cortex. Furthermore, such models naturally incorporate a time-weighted average of past activity, implementing an intrinsic reference-dependence in value coding. These results suggest that a single network mechanism can explain both transient and sustained decision activity, emphasizing the importance of a dynamic view of normalization in neural coding. Copyright © 2014 the authors.
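
    A minimal sketch of a dynamic divisive-normalization model in the spirit of the one described above (generic equations chosen for illustration, not the exact model of the paper) reproduces the characteristic phasic-sustained response: the rate R is driven by the input value V and divided by a slower pooled-activity signal G.

        import numpy as np

        def simulate(values, tau_r=0.05, tau_g=0.5, sigma=1.0, dt=0.001, t_end=1.0):
            """Integrate a toy dynamic divisive-normalization model:
               tau_r dR/dt = -R + V / (sigma + G)
               tau_g dG/dt = -G + sum(R)      (G pools recent population activity)
            `values` is the vector of input values V, one per unit."""
            n = len(values)
            R, G = np.zeros(n), 0.0
            trace = []
            for _ in range(int(t_end / dt)):
                dR = (-R + np.asarray(values) / (sigma + G)) / tau_r
                dG = (-G + R.sum()) / tau_g
                R, G = R + dt * dR, G + dt * dG
                trace.append(R.copy())
            return np.array(trace)

        # Two targets with different values: the early response tracks raw value,
        # then relaxes to a normalized (context-dependent) sustained level.
        trace = simulate([10.0, 5.0])
        print("peak rates:     ", trace.max(axis=0))
        print("sustained rates:", trace[-1])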

  2. The effects of dietary protein intake on appendicular lean mass and muscle function in elderly men

    DEFF Research Database (Denmark)

    Mitchell, Cameron J; Milan, Amber M; Mitchell, Sarah M

    2017-01-01

    Background: The Recommended Daily Allowance (RDA) for protein intake in the adult population is widely promoted as 0.8 g · kg-1 · d-1. Aging may increase protein requirements, particularly to maintain muscle mass. Objective: We investigated whether controlled protein consumption at the current RDA...... or twice the RDA (2RDA) affects skeletal muscle mass and physical function in elderly men. Design: In this parallel-group randomized trial, 29 men aged >70 y [mean ± SD body mass index (in kg/m2): 28.3 ± 4.2] were provided with a complete diet containing either 0.8 (RDA) or 1.6 (2RDA) g protein · kg-1 · d-1...... energy balance (mean ± SD RDA: 209 ± 213 kcal/d; 2RDA 145 ± 214 kcal/d; P = 0.427 for difference between the groups). In comparison with RDA, whole-body lean mass increased in 2RDA (P = 0.001; 1.49 ± 1.30 kg, P

  3. A Classroom Research Skills Development Emphasizing Data Analysis and Result of SSRU Students by RBL

    Science.gov (United States)

    Waree, Chaiwat

    2017-01-01

    The purposes of the study were to strengthen classroom research skills emphasizing data analysis and results, and to study the development of those research skills, emphasizing data analysis and results, in SSRU students through learning that uses research as a base (RBL). The target group are students in the 2nd semester…

  4. Mixture block coding with progressive transmission in packet video. Appendix 1: Item 2. M.S. Thesis

    Science.gov (United States)

    Chen, Yun-Chung

    1989-01-01

    Video transmission will become an important part of future multimedia communication because of dramatically increasing user demand for video, and rapid evolution of coding algorithm and VLSI technology. Video transmission will be part of the broadband-integrated services digital network (B-ISDN). Asynchronous transfer mode (ATM) is a viable candidate for implementation of B-ISDN due to its inherent flexibility, service independency, and high performance. According to the characteristics of ATM, the information has to be coded into discrete cells which travel independently in the packet switching network. A practical realization of an ATM video codec called Mixture Block Coding with Progressive Transmission (MBCPT) is presented. This variable bit rate coding algorithm shows how a constant quality performance can be obtained according to user demand. Interactions between codec and network are emphasized including packetization, service synchronization, flow control, and error recovery. Finally, some simulation results based on MBCPT coding with error recovery are presented.

  5. An Example Emphasizing Mass-Volume Relationships for Problem Solving in Soils

    Science.gov (United States)

    Heitman, J. L.; Vepraskas, M. J.

    2009-01-01

    Mass-volume relationships are a useful tool emphasized for problem solving in many geo-science and engineering applications. These relationships also have useful applications in soil science. Developing soils students' ability to utilize mass-volume relationships through schematic diagrams of soil phases (i.e., air, water, and solid) can help to…
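
    A small worked example of the mass-volume relationships the article advocates, using standard soil-physics definitions and invented sample numbers, is sketched below.

        # Hypothetical soil core measurements
        volume_cm3   = 100.0      # core volume
        moist_mass_g = 180.0      # mass before oven drying
        dry_mass_g   = 150.0      # oven-dry solid mass
        particle_density = 2.65   # g/cm3, typical assumed value for mineral soils
        water_density    = 1.00   # g/cm3

        bulk_density = dry_mass_g / volume_cm3                     # g/cm3
        porosity     = 1.0 - bulk_density / particle_density       # total pore fraction
        theta_g = (moist_mass_g - dry_mass_g) / dry_mass_g         # gravimetric water content
        theta_v = theta_g * bulk_density / water_density           # volumetric water content
        saturation = theta_v / porosity                            # fraction of pores filled

        print(f"bulk density         {bulk_density:.2f} g/cm3")
        print(f"porosity             {porosity:.2f}")
        print(f"water content        {theta_g:.2f} g/g, {theta_v:.2f} cm3/cm3")
        print(f"degree of saturation {saturation:.2f}")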

  6. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    Science.gov (United States)

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  7. Case-crossover analysis of heat-coded deaths and vulnerable subpopulations: Oklahoma, 1990-2011

    Science.gov (United States)

    Moore, Brianna F.; Brooke Anderson, G.; Johnson, Matthew G.; Brown, Sheryll; Bradley, Kristy K.; Magzamen, Sheryl

    2017-11-01

    The extent of the association between temperature and heat-coded deaths, for which heat is the primary cause of death, remains largely unknown. We explored the association between temperature and heat-coded deaths and potential interactions with various demographic and environmental factors. A total of 335 heat-coded deaths that occurred in Oklahoma from 1990 through 2011 were identified using heat-related International Classification of Diseases codes, cause-of-death nomenclature, and narrative descriptions. Conditional logistic regression models examined the association of temperature and heat index with heat-coded deaths. Interaction by demographic factors (age, sex, marital status, living alone, outdoor/heavy labor occupations) and environmental factors (ozone, PM10, PM2.5) was also explored. Temperatures ≥99 °F (the median value) were associated with approximately five times higher odds of a heat-coded death as compared to temperatures below 99 °F. Effect estimates were attenuated when exposure to heat was characterized by heat index. The interaction results suggest that the effect of temperature on heat-coded deaths may depend on sex and occupation. For example, the odds of a heat-coded death among outdoor/heavy labor workers exposed to temperatures ≥99 °F were greater than expected based on the sum of the individual effects (observed OR = 14.0, 95% CI 2.7, 72.0; expected OR = 4.1 [2.8 + 2.3 - 1.0]). Our results highlight the extent of the association between temperature and heat-coded deaths and emphasize the need for a comprehensive, multisource definition of heat-coded deaths. Furthermore, based on the interaction results, we recommend that states implement or expand heat safety programs to protect vulnerable subpopulations, such as outdoor workers.
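
    The bracketed calculation at the end of the abstract appears to follow the usual additive-interaction comparison, in which the expected joint odds ratio under no interaction is the sum of the single-exposure odds ratios minus one. A short sketch with the reported numbers follows; the assignment of the two single-exposure odds ratios to heat and occupation is assumed for illustration.

        # Odds ratios reported in the abstract (additive-interaction check)
        or_heat_only = 2.8         # assumed: temperature >= 99 F, not an outdoor/heavy-labor worker
        or_occupation_only = 2.3   # assumed: outdoor/heavy-labor worker, temperature < 99 F
        or_joint_observed = 14.0   # both exposures together

        expected_joint = or_heat_only + or_occupation_only - 1.0   # additive expectation
        interaction_excess = or_joint_observed - expected_joint    # RERI-style excess

        print(f"expected joint OR (no interaction): {expected_joint:.1f}")    # 4.1
        print(f"observed joint OR:                  {or_joint_observed:.1f}")
        print(f"excess attributable to interaction: {interaction_excess:.1f}")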

  8. Methodology, status and plans for development and assessment of the code ATHLET

    Energy Technology Data Exchange (ETDEWEB)

    Teschendorff, V.; Austregesilo, H.; Lerchl, G. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH Forschungsgelaende, Garching (Germany)

    1997-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.

  9. Methodology, status and plans for development and assessment of the code ATHLET

    International Nuclear Information System (INIS)

    Teschendorff, V.; Austregesilo, H.; Lerchl, G.

    1997-01-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.

  10. Calibration of the enigma code for Finnish reactor fuel with support from experimental irradiations

    Energy Technology Data Exchange (ETDEWEB)

    Kelppe, S; Ranta-Puska, K [VTT Energy, Jyvaeskylae (Finland)

    1997-08-01

    Assessment by VTT of the ENIGMA fuel performance code, the original version by Nuclear Electric plc of the UK amended by a set of WWER specific materials correlations, is described. The given examples of results include analyses for BWR 9 x 9 fuel, BWR fuel irradiated in the reinstrumented test of an international Riso project, pre-characterized commercial WWER fuel irradiated in the Loviisa reactor in Finland, and instrumented WWER test fuel irradiations in the MR reactor in Russia. The effects of power uncertainty and some model parameters are discussed. Considering that the described cases all represent prototypic applications of the code, the results are encouraging. The importance of accuracy in temperature calculations is emphasized. (author). 2 refs, 12 figs, 1 tab.

  11. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  12. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. The automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself among the upper and lower level codes of the selected one that were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a fashion similar to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data-processing program was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.
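
    The coding flow described above (pick an organ code, use its first digit to select the matching pathology table, then join the two parts with a period) can be mimicked with a small dictionary-based sketch. The table entries below are invented placeholders and not the actual ACR dictionaries; only the example code '131.3661' comes from the record.

        # Invented fragments of the ACR dictionaries, for illustration only.
        ORGAN_CODES = {"13": "abdomen (placeholder)", "131": "liver (placeholder)"}
        PATHOLOGY_FILES = {
            "1": {"3661": "placeholder pathology term"},   # table used for organ codes starting with 1
        }

        def build_acr_code(organ_code: str, pathology_code: str) -> str:
            """Join organ and pathology codes the way the record describes, e.g. '131.3661'."""
            if organ_code not in ORGAN_CODES:
                raise KeyError(f"unknown organ code {organ_code!r}")
            table = PATHOLOGY_FILES[organ_code[0]]      # table chosen by first digit of organ code
            if pathology_code not in table:
                raise KeyError(f"unknown pathology code {pathology_code!r}")
            return f"{organ_code}.{pathology_code}"

        def split_acr_code(acr_code: str) -> tuple:
            """Decode a stored ACR code back into its organ and pathology parts."""
            organ, pathology = acr_code.split(".")
            return organ, pathology

        print(build_acr_code("131", "3661"))   # -> 131.3661
        print(split_acr_code("131.3661"))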

  13. Impact of optical hard limiter on the performance of an optical overlapped-code division multiple access system

    Science.gov (United States)

    Inaty, Elie; Raad, Robert; Tablieh, Nicole

    2011-08-01

    Throughout this paper, a closed form expression of the multiple access interference (MAI) limited bit error rate (BER) is provided for the multiwavelength optical code-division multiple-access system when the system is working above the nominal transmission rate limit imposed by the passive encoding-decoding operation. This system is known in literature as the optical overlapped code division multiple access (OV-CDMA) system. A unified analytical framework is presented emphasizing the impact of optical hard limiter (OHL) on the BER performance of such a system. Results show that the performance of the OV-CDMA system may be highly improved when using OHL preprocessing at the receiver side.

  14. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  15. Use of Gray code in PBIL algorithm for application in recharge of nuclear fuels

    International Nuclear Information System (INIS)

    Nast, Fernando N.; Silva, Patrick V.; Meneses, Anderson A. M.; Schirru, Roberto

    2017-01-01

    The In-Core Fuel Management Optimization (OGCIN) problem, or design optimization of Load Patterns (PCs), are denominations for the optimization problem associated with the refueling operation in a nuclear reactor. The OGCIN is considered a problem of difficult resolution, involving aspects of combinatorial optimization and reactor physics and analysis calculations. In order to validate algorithms for the OGCIN solution, we use benchmark problems such as the Travelling Salesman Problem (TSP), because it is considered, like OGCIN, an NP-hard problem. In the present work, we implemented the Population-Based Incremental Learning (PBIL) algorithm with binary coding and Gray coding and applied them to the optimization of the symmetric Oliver30 and the asymmetric Rykel48 TSP instances, and implemented only the Gray coding in the OGCIN application to cycle 7 of the Angra-1 Nuclear Plant, where we compared its performance with binary coding. The results on average were 1311 and 1327 ppm of boron for the binary and Gray codings respectively, emphasizing that the binary coding obtained a maximum value of 1330 ppm, while the Gray code obtained a value of 1401 ppm, showing superiority, since the boron concentration is an indicator of the PC cycle extension.
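
    The practical difference between the two encodings is how neighbouring integers are represented: in a reflected Gray code, consecutive values differ in exactly one bit, which changes how small perturbations of PBIL's probability vector move candidate solutions. A minimal sketch of the standard conversions (textbook formulas, not code from the paper):

        def binary_to_gray(n: int) -> int:
            """Reflected binary Gray code of n."""
            return n ^ (n >> 1)

        def gray_to_binary(g: int) -> int:
            """Inverse conversion: undo the cumulative XOR."""
            n = 0
            while g:
                n ^= g
                g >>= 1
            return n

        # Consecutive integers differ in exactly one bit under Gray coding.
        for i in range(8):
            print(f"{i}: binary {i:03b}  gray {binary_to_gray(i):03b}")
        assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(256))

    Adjacent integer values are always reachable by a single bit flip, which is the property usually cited when Gray coding is preferred in evolutionary searches of this kind.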

  16. Differential mitochondrial DNA and gene expression in inherited retinal dysplasia in miniature Schnauzer dogs.

    Science.gov (United States)

    Appleyard, Greg D; Forsyth, George W; Kiehlbauch, Laura M; Sigfrid, Kristen N; Hanik, Heather L J; Quon, Anita; Loewen, Matthew E; Grahn, Bruce H

    2006-05-01

    To investigate the molecular basis of inherited retinal dysplasia in miniature Schnauzers. Retina and retinal pigment epithelial tissues were collected from canine subjects at the age of 3 weeks. Total RNA isolated from these tissues was reverse transcribed to make representative cDNA pools that were compared for differences in gene expression by using a subtractive hybridization technique referred to as representational difference analysis (RDA). Expression differences identified by RDA were confirmed and quantified by real-time reverse-transcription PCR. Mitochondrial morphology from leukocytes and skeletal muscle of normal and affected miniature Schnauzers was examined by transmission electron microscopy. RDA screening of retinal pigment epithelial cDNA identified differences in mRNA transcript coding for two mitochondrial (mt) proteins--cytochrome oxidase subunit 1 and NADH dehydrogenase subunit 6--in affected dogs. Contrary to expectations, these identified sequences did not contain mutations. Based on the implication of mt-DNA-encoded proteins by the RDA experiments we used real-time PCR to compare the relative amounts of mt-DNA template in white blood cells from normal and affected dogs. White blood cells of affected dogs contained less than 30% of the normal amount of two specific mtDNA sequences, compared with the content of the nuclear-encoded glyceraldehyde-3-phosphate dehydrogenase (GA-3-PDH) reference gene. Retina and RPE tissue from affected dogs had reduced mRNA transcript levels for the two mitochondrial genes detected in the RDA experiment. Transcript levels for another mtDNA-encoded gene as well as the nuclear-encoded mitochondrial Tfam transcription factor were reduced in these tissues in affected dogs. Mitochondria from affected dogs were reduced in number and size and were unusually electron dense. Reduced levels of nuclear and mitochondrial transcripts in the retina and RPE of miniature Schnauzers affected with retinal dysplasia suggest that

  17. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: 1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient behavior; 2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates: flowrate in parallel channels, coupled or not by conduction across the plates, for given conditions of pressure drop or flowrate, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors) [French] This code makes it possible to treat the following problems: 1. Processing of thermal experiments on a water loop, at high or low pressure, in steady-state or transient regime; 2. Thermal and hydraulic studies of plate-type water reactors, at high or low pressure, with boiling permitted: distribution of flow among parallel channels, coupled or not by conduction through the plates, under imposed flowrate or pressure-drop conditions, variable or not in time; the power can be coupled to the neutronics, and a schematic representation of the safety actions is provided. This code (Cactus), one-dimensional in space and multi-channel, has as its complement Flid, which treats the study of a single channel in two dimensions. (authors)

  18. Assessment of SPACE code for multiple failure accident: 1% Cold Leg Break LOCA with HPSI failure at ATLAS Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Hyuk; Lee, Seung Wook; Kim, Kyung-Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Design extension conditions (DECs) have become a key issue since the Fukushima accident. From the viewpoint of reinforcing the defense-in-depth concept, high-risk multiple failure accidents should be reconsidered. The target scenario of the ATLAS A5.1 test was the LSTF (Large Scale Test Facility) SB-CL-32 test, a 1% SBLOCA with total failure of the high pressure safety injection (HPSI) system of the emergency core cooling system (ECCS) and secondary side depressurization as the accident management (AM) action, run as a counterpart test. As the need to prepare for DEC accidents caused by multiple failures in operating NPPs is emphasized, the capability of the SPACE code, just like that of other system analysis codes, must be extended to the DEC area. The objective of this study is therefore to validate the capability of the SPACE code for a DEC scenario that represents a multiple failure accident, namely a SBLOCA with HPSI failure; the ATLAS A5.1 test scenario was chosen for this purpose. Through a sensitivity study on the discharge coefficient of the break flow, the best fit of the integrated break mass was found. Using this coefficient, the ATLAS A5.1 test was analyzed with the SPACE code. The major thermal-hydraulic parameters, such as the system pressures and temperatures, were compared with the test data and show good agreement. Through the simulation, it was concluded that the SPACE code can effectively simulate multiple failure accidents such as a SBLOCA with HPSI failure.

  19. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  20. Unintended Effects of Emphasizing Disparities in Cancer Communication to African-Americans

    OpenAIRE

    Nicholson, Robert A.; Kreuter, Matthew W.; Lapka, Christina; Wellborn, Rachel; Clark, Eddie M.; Sanders-Thompson, Vetta; Jacobsen, Heather M.; Casey, Chris

    2008-01-01

    Little is known about how minority groups react to public information that highlights racial disparities in cancer. This double-blind randomized study compared emotional and behavioral reactions to four versions of the same colon cancer (CRC) information presented in mock news articles to a community sample of African-American adults (n = 300). Participants read one of four articles that varied in their framing and interpretation of race-specific CRC mortality data, emphasizing impact (CRC is...

  1. Bone mineral density in patients with destructive arthrosis of the hip joint.

    Science.gov (United States)

    Okano, Kunihiko; Aoyagi, Kiyoshi; Enomoto, Hiroshi; Osaki, Makoto; Chiba, Ko; Yamaguchi, Kazumasa

    2014-05-01

    Recent reports have shown the existence of subchondral insufficiency fracture in rapidly destructive arthrosis of the hip joint (RDA), and the findings suggest that osteopenia is related to the pathogenesis of the rapid progression of this disease. Therefore, we measured bone mineral density (BMD) in RDA patients. We measured BMD of the lumbar spine, radius, and calcaneus using dual-energy X-ray absorptiometry in 19 patients with RDA and 75 with osteoarthritis of the hip (OA) and compared BMD at different skeletal sites between RDA and OA patients. No significant differences were observed in BMD of the lumbar spine, ultradistal radius, mid-radius, and calcaneus between the RDA and OA groups. Our data suggest that RDA is not accompanied by generalized osteoporosis. Factors other than generalized bone status, for example, BMD around the affected hip joint before destruction, need to be analyzed to elucidate the pathophysiological mechanism of RDA.

  2. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach that focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...
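
    The flavour of the problem can be seen in a standard two-receiver illustration (a textbook instance, not the construction of the paper): each receiver already holds the other's message as side information, so a single XOR broadcast satisfies both and halves the number of transmissions.

        # Two messages held by the sender, as equal-length byte strings.
        x1, x2 = b"MSG-A", b"MSG-B"

        # Side information: receiver 1 wants x1 but already has x2, and vice versa.
        broadcast = bytes(a ^ b for a, b in zip(x1, x2))   # one coded transmission

        decoded_at_r1 = bytes(c ^ s for c, s in zip(broadcast, x2))  # recovers x1
        decoded_at_r2 = bytes(c ^ s for c, s in zip(broadcast, x1))  # recovers x2

        assert decoded_at_r1 == x1 and decoded_at_r2 == x2
        print("both receivers satisfied with a single coded broadcast")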

  3. Building-up domestic enrichment capacity is emphasized

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    This is an interim report presented by the Nuclear Fuel Cycle Committee of the Japanese Atomic Energy Commission to recommend adequate policy lines in each field of the fuel cycle. As for the procurement of natural uranium, advice is given to both state authorities and private companies (including electric utilities) on the basis of the ''develop-and-import'' policy. As for the procurement of enriched uranium, the urgency of the development of enrichment techniques and plants is emphasized together with the important role of the Power Reactor and Nuclear Fuel Development Corporation (PNC). As for reprocessing, it is confirmed that the construction and operation of the second reprocessing plant should be undertaken by private interests. The state authorities are advised to undertake revisions of the relevant laws, regulations, and standards. As for plutonium recycle, the demonstration of the use of plutonium in Fugen type heavy water reactors, as well as light water reactors, is encouraged. As for radioactive waste disposal, advice associated with experimental ocean dumping, solidification, storage, and geological disposal is given. Finally, as for spent fuel transportation, problems associated with the physical protection and the safety of spent fuel transportation are treated. (Aoki, K.)

  4. IAEA provisional code of practice on management of radioactive waste from nuclear power plants

    International Nuclear Information System (INIS)

    1982-10-01

    This Code of Practice defines the minimum requirements for operations and design of structures, systems and components important for management of wastes from thermal nuclear power plants. It emphasizes what safety requirements shall be met rather than specifies how these requirements can be met; the latter aspect is covered in Safety Guides. The Code defines the need for a Government to assume responsibility for regulating waste management practices in conjunction with the regulation of a nuclear power plant. The Code does not prejudge the organization of the regulatory authority, which may differ from one Member State to another, and may involve more than one body. Similarly, the Code does not deal specifically with the functions of a regulatory authority responsible for such matters, although it may be of value to Member States in providing a basis for consideration of such functions. The Code deals with the entire management system for all wastes from nuclear power plants embodying thermal reactors including PWR, BWR, HWR and HTGR technologies. Topics included are: design, normal and abnormal operation, and regulation of management systems for gaseous, liquid and solid wastes, including decommissioning wastes. The Code includes measures to be taken with regard to the wastes arising from spent fuel management at nuclear power plants. However, the options for further management of spent fuel are only outlined since it is the subject of decisions by individual Member States. The Code does not require that an option(s) be decided upon prior to construction or operation of a nuclear power plant

  5. Ion channeling study of defects in compound crystals using Monte Carlo simulations

    Science.gov (United States)

    Turos, A.; Jozwik, P.; Nowicki, L.; Sathish, N.

    2014-08-01

    Ion channeling is a well-established technique for the determination of structural properties of crystalline materials. Defect depth profiles have usually been determined based on the two-beam model developed by Bøgh (1968) [1]. As long as the main research interest was focused on single-element crystals, it was considered sufficiently accurate. A new challenge emerged with the growing technological importance of compound single crystals and epitaxial heterostructures. The overlap of partial spectra due to different sublattices and the formation of complicated defect structures make the two-beam method hardly applicable. The solution is provided by Monte Carlo computer simulations. Our paper reviews the principal aspects of this approach and the recent developments in the McChasy simulation code. The latter made it possible to distinguish between randomly displaced atoms (RDA) and extended defects (dislocations, loops, etc.). Hence, complex defect structures can be characterized by the relative content of these two components. The next refinement of the code consists of a detailed parameterization of dislocations and dislocation loops. Defect profiles for a variety of compound crystals (GaN, ZnO, SrTiO3) have been measured and evaluated using the McChasy code. Damage accumulation curves for RDA and extended defects revealed a non-monotonic defect buildup with some characteristic steps. The transition to each stage is governed by a different driving force. As shown by the complementary high-resolution XRD measurements, lattice strain plays the crucial role here and can be correlated with the concentration of extended defects.

  6. Safe Operation of Nuclear Power Plants. Code of Practice and Technical Appendices

    International Nuclear Information System (INIS)

    1969-01-01

    This book is in two parts. The first is a Code of Practice for the Safe Operation of Nuclear Power Plants and the second part is a compilation of technical appendices. Its object is to give information and illustrative examples that would be helpful in implementing the Code of Practice. This second part, although published under the same cover, is not part of the Code. Safe operation of a nuclear power plant postulates suitable siting and proper design, construction and management of the plant. Under the present Code of Practice for the Safe Operation of Nuclear Power Plants, those intending to operate the plant are recommended to prepare documentation which would deal with its operation and include safety analyses. The documentation in question would be reviewed by a regulatory body independent of the operating organization; operation would be authorized on the understanding that it would comply with limits and conditions designed to ensure safety. The Code may be subject to revision in the light of experience. The Appendices provide additional information together with some examples relating to certain topics dealt with in the Code; it must be emphasized that they are included as examples for information only and are not part of any recommendation. Purpose and scope: The recommendations in the Code are designed to protect the general public and the operating personnel from radiation hazards, and the Code forms part of the Agency's Safety Standards. The Code, which should be used in conjunction with the Agency's other Safety Standards, provides guidance and information to persons and authorities responsible for the operation of stationary nuclear power plants whose main function is the generation of thermal, mechanical or electrical power; it is not intended to apply to reactors used solely for experimental or research purposes. It sets forth minimum requirements which, it is believed, in the light of experience, must be met in order to achieve safe operation of a

  7. Effects of messages emphasizing environmental determinants of obesity on intentions to engage in diet and exercise behaviors.

    Science.gov (United States)

    Niederdeppe, Jeff; Roh, Sungjong; Shapiro, Michael A; Kim, Hye Kyung

    2013-12-12

    Reducing rates of obesity will require interventions that influence both individual decisions and environmental factors through changes in public policy. Previous work indicates that messages emphasizing environmental determinants increases support for public policies, but some suspect this strategy may undermine motivation to engage in diet and exercise. Study 1 involved 485 adults recruited from a shopping mall in New York. Study 2 involved 718 adult members of a Web-based national panel of US adults. Respondents in both studies were randomly assigned to read a story that emphasized environmental determinants of health or a control condition. The stories varied in the extent to which they described the story character as taking personal responsibility for weight management. Logistic regression and ordered logit models were used to test for differences in intentions to engage in diet and exercise behaviors based on which story the participant read. Analyses were also performed separately by participants' weight status. In both studies, messages that acknowledged personal responsibility while emphasizing environmental causes of obesity increased intentions to engage in healthy behavior for at least 1 weight status group. Emphasizing factors outside of personal control appears to enhance rather than undermine motivations to engage in healthy diet and exercise behavior.

  8. RADTRAN 5 - A computer code for transportation risk analysis

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1993-01-01

    The RADTRAN 5 computer code has been developed to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI standard FORTRAN 77; the code contains significant advances in the methodology first pioneered with the LINK option of RADTRAN 4. A major application of the LINK methodology is route-specific analysis. Another application is comparisons of attributes along the same route segments. Nonradiological risk factors have been incorporated to allow users to estimate nonradiological fatalities and injuries that might occur during the transportation event(s) being analyzed. These fatalities include prompt accidental fatalities from mechanical causes. Values of these risk factors for the United States have been made available in the code as optional defaults. Several new health effects models have been published in the wake of the Hiroshima-Nagasaki dosimetry reassessment, and this has emphasized the need for flexibility in the RADTRAN approach to health-effects calculations. Therefore, the basic set of health-effects conversion equations in RADTRAN have been made user-definable. All parameter values can be changed by the user, but a complete set of default values are available for both the new International Commission on Radiation Protection model (ICRP Publication 60) and the recent model of the U.S. National Research Council's Committee on the Biological Effects of Radiation (BEIR V). The meteorological input data tables have been modified to permit optional entry of maximum downwind distances for each dose isopleth. The expected dose to an individual in each isodose area is also calculated and printed automatically. Examples are given that illustrate the power and flexibility of the RADTRAN 5 computer code. (J.P.N.)

  9. Analyzing the BWR rod drop accident in high-burnup cores

    International Nuclear Information System (INIS)

    Diamond, D.J.; Neymotin, L.; Kohut, P.

    1995-01-01

    This study was undertaken for the US Nuclear Regulatory Commission to determine the fuel enthalpy during a rod drop accident (RDA) for cores with high burnup fuel. The calculations were done with the RAMONA-4B code, which models the core with 3-dimensional neutron kinetics and multiple parallel coolant channels. The calculations were done with a model for a BWR/4 with fuel bundles having burnups up to 30 GWd/t and also with a model with bundle burnups to 60 GWd/t. This paper also discusses potential sources of uncertainty in calculations with high burnup fuel. One source is the "rim" effect, which is the extra-large peaking of the power distribution at the surface of the pellet. This increases the uncertainty in reactor physics and heat conduction models that assume that the energy deposition has a less peaked spatial distribution. Two other sources of uncertainty are the result of the delayed neutron fraction decreasing with burnup and the positive moderator temperature feedback increasing with burnup. Since these effects tend to increase the severity of the event, an RDA calculation for high burnup fuel will underpredict the fuel enthalpy if the effects are not properly taken into account. Other sources of uncertainty that are important come from the initial conditions chosen for the RDA. This includes the initial control rod pattern as well as the initial thermal-hydraulic conditions.

  10. [The emphases and basic procedures of genetic counseling in psychotherapeutic model].

    Science.gov (United States)

    Zhang, Yuan-Zhi; Zhong, Nanbert

    2006-11-01

    The emphases and basic procedures of genetic counseling in the psychotherapeutic model differ from those in older models. In the psychotherapeutic model, genetic counseling focuses not only on counselees' genetic disorders and birth defects, but also on their psychological problems. "Client-centered therapy", as termed by Carl Rogers, plays an important role in the genetic counseling process. The basic procedures of the psychotherapeutic model of genetic counseling include 7 steps: initial contact, introduction, agendas, inquiry of family history, presenting information, closing the session and follow-up.

  11. A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

    Energy Technology Data Exchange (ETDEWEB)

    C. O. Slater; J. M. Barnes; J. O. Johnson; J.D. Drischler

    1998-10-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
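
    The "folding" of the coupling-surface fluence with the adjoint dose importance is, conceptually, a sum over discretized energy and angle bins. The snippet below is only a schematic of that coupling step under that assumption, with invented array shapes and values; it is not the MASH coupling code.

        import numpy as np

        # Schematic folding of a coupling-surface fluence with an adjoint dose importance
        # over discrete energy/angle bins; shapes, values and weights are illustrative only.
        rng = np.random.default_rng(0)
        n_energy, n_angle = 20, 8
        fluence = rng.random((n_energy, n_angle))       # from the forward discrete-ordinates step
        importance = rng.random((n_energy, n_angle))    # from the adjoint Monte Carlo step
        bin_weight = 1.0                                # surface-area/solid-angle weighting

        dose_response = np.sum(fluence * importance) * bin_weight
        print(dose_response)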

  12. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

    The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  13. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  14. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
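
    The graph construction referred to above is concrete enough to sketch: each trinucleotide b1b2b3 contributes the arcs b1 → b2b3 and b1b2 → b3, circularity corresponds to the absence of directed cycles, and self-complementarity means the code equals its set of reverse complements. The code below is a minimal illustration of those two checks on a toy example, not the authors' implementation.

        # Minimal sketch: graph of a trinucleotide code (arcs b1 -> b2b3 and b1b2 -> b3),
        # cycle detection, and a self-complementarity check. Toy example only.
        COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

        def reverse_complement(word):
            return "".join(COMP[b] for b in reversed(word))

        def is_self_complementary(code):
            return set(code) == {reverse_complement(w) for w in code}

        def graph_arcs(code):
            arcs = set()
            for w in code:
                arcs.add((w[0], w[1:]))   # b1 -> b2b3
                arcs.add((w[:2], w[2]))   # b1b2 -> b3
            return arcs

        def has_cycle(arcs):
            graph = {}
            for u, v in arcs:
                graph.setdefault(u, []).append(v)
            state = {}  # 0 = unvisited, 1 = on the DFS stack, 2 = finished

            def dfs(node):
                state[node] = 1
                for nxt in graph.get(node, []):
                    if state.get(nxt, 0) == 1 or (state.get(nxt, 0) == 0 and dfs(nxt)):
                        return True
                state[node] = 2
                return False

            return any(state.get(n, 0) == 0 and dfs(n) for n in graph)

        X = {"AAC", "GTT"}  # toy code, not the code X identified in genes
        print(is_self_complementary(X), not has_cycle(graph_arcs(X)))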

  15. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed with simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans with high data rates.

  16. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list decoding algorithm for matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units.

  17. Nupack, the new ASME code for radioactive material transportation packaging containments

    International Nuclear Information System (INIS)

    Turula, P.

    1998-01-01

    The American Society of Mechanical Engineers (ASME) has added a new division to the nuclear construction section of its Boiler and Pressure Vessel Code (B and PVC). This Division, commonly referred to as Nupack, has been written to provide a consistent set of technical requirements for containment vessels of transportation packagings for high-level radioactive materials. This paper provides an introduction to Nupack, discusses some of its technical provisions, and describes how it can be used for the design and construction of packaging components. Nupack's general provisions and design requirements are emphasized, while treatment of materials, fabrication and inspection is left for another paper

  18. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.

  19. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.

  20. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
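
    As a back-of-the-envelope illustration of the systematic LDGM idea described above (not the authors' encoder), each codeword consists of the information bits followed by parity bits obtained from a sparse generator sub-matrix over GF(2):

        import numpy as np

        # Toy systematic LDGM encoder: codeword = [u | u P mod 2] with a sparse P.
        # Dimensions and sparsity below are illustrative only.
        rng = np.random.default_rng(1)
        k, m, row_weight = 8, 4, 2                   # info bits, parity bits, ones per row of P

        P = np.zeros((k, m), dtype=np.uint8)         # sparse (low-density) generator sub-matrix
        for row in P:
            row[rng.choice(m, size=row_weight, replace=False)] = 1

        u = rng.integers(0, 2, size=k, dtype=np.uint8)   # information bits
        codeword = np.concatenate([u, (u @ P) % 2])      # systematic part, then parity part
        print(codeword)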

  1. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  2. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress made by the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  3. The associative forms in Romania following the new Civil Code, republished in 2011

    Directory of Open Access Journals (Sweden)

    Angela MIFF

    2013-12-01

    Full Text Available Over time, the association has evolved as a form of socio-economic organisation for carrying out non-professional or, as the case may be, professional activities. The legislative sources have emphasized, over time, the variety of ways in which association manifests itself among different subjects of law – natural and/or legal persons. The new Civil Code (2009, republished in 2011), in force since 1 October 2011 and founded on the monist approach to regulation as the common-law norm for all the domains to which the letter and the spirit of its provisions refer, regulates the contract of association in chapter VII of the 5th Book; apart from the general norms applicable to all such contracts of association, the present code replaces the former civil society without legal personality with the present simple society and, also as a novelty, transposes the regulation of the silent partnership from the former framework of the Commercial Code (1887), abrogated almost in its entirety, into section 3 of the same chapter VII of the 5th Book of the code. The elements that are similar to the former regulation outline the continuity in the conception of these juridical institutions, in a modern approach that transposes aspects clarified by jurisprudence and legal doctrine.

  4. Obama Emphasizes Science and Innovation in State of the Union Address

    Science.gov (United States)

    Tretkoff, Ernie

    2011-02-01

    U.S. president Barack Obama emphasized innovation and competitiveness in his State of the Union address on 25 January. He also raised science and technology early in the hour-long speech, noting that nations like China and India are focusing on math and science education and investing in research and technology. To be competitive with those countries, “we need to out-innovate, out-educate, and out-build the rest of the world,” Obama said. “The first step in winning the future is encouraging American innovation.”

  5. DOI: 10.18697/ajfand.80.16730 12854 RISK FACTORS ...

    African Journals Online (AJOL)

    This study forms a baseline for research to develop a strategy for preventing .... energy (50.3% of RDA), vitamin A (71.7% of RDA) and iron (52.6% of RDA) were high. ..... Standard Monitoring and Relief in Transition (SMART), 2011. 12.

  6. "There are too many, but never enough": qualitative case study investigating routine coding of clinical information in depression.

    Science.gov (United States)

    Cresswell, Kathrin; Morrison, Zoe; Kalra, Dipak; Sheikh, Aziz

    2012-01-01

    We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined.

  7. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we firstly study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given according to Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterpart and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  8. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism, which can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  9. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  10. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  11. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be obtained by using the EDW code compared to existing codes such as Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code achieves much better performance than the Hadamard and MFH codes.

  12. Thyc, a 3D thermal-hydraulic code for rod bundles. Recent developments and validation tests

    International Nuclear Information System (INIS)

    Caremoli, C.; Rascle, P.; Aubry, S.; Olive, J.

    1993-09-01

    PWR or LMFBR cores or fuel assemblies, PWR steam generators, condensers, tubular heat exchangers, are basic components of a nuclear power plant involving two-phase flows in tube or rod bundles. A deep knowledge of the detailed flow patterns on the shell side is necessary to evaluate DNB margins in reactor cores, singularity effects (grids, wire spacers, support plates, baffles), corrosion on steam generator tube sheet, bypass effects and vibration risks. For that purpose, Electricite de France has developed, since 1986, a general purpose code named THYC (Thermal HYdraulic Code) designed to study three-dimensional single and two phase flows in rod or tube bundles (pressurized water reactor cores, steam generators, condensers, heat exchangers). It considers the three-dimensional domain to contain two kinds of components: fluid and solids. The THYC model is obtained by space-time averaging of the instantaneous equations (mass, momentum and energy) of each phase over control volumes including fluid and solids. This paper briefly presents the physical model and the numerical method used in THYC. Then, validation tests (comparison with experiments) and applications (coupling with three-dimensional neutronics code and DNB predictions) are presented. They emphasize the last developments and new capabilities of the code. (authors). 10 figs., 3 tabs., 21 refs

  13. Measuring intergroup ideologies: positive and negative aspects of emphasizing versus looking beyond group differences.

    Science.gov (United States)

    Hahn, Adam; Banchefsky, Sarah; Park, Bernadette; Judd, Charles M

    2015-12-01

    Research on interethnic relations has focused on two ideologies, asking whether it is best to de-emphasize social-category differences (colorblind) or emphasize and celebrate differences (multicultural). We argue each of these can manifest with negative outgroup evaluations: Assimilationism demands that subordinate groups adopt dominant group norms to minimize group distinctions; segregationism holds that groups should occupy separate spheres. Parallel versions can be identified for intergender relations. Scales to measure all four ideologies are developed both for ethnicity (Studies 1 and 2) and gender (Studies 3 and 4). Results demonstrate that the ideologies can be reliably measured, that the hypothesized four-factor models are superior to alternative models with fewer factors, and that the ideologies relate as predicted to the importance ascribed to group distinctions, subordinate group evaluations, and solution preferences for intergroup conflict scenarios. We argue that this fourfold model can help clarify theory and measurement, allowing a more nuanced assessment of ideological attitudes. © 2015 by the Society for Personality and Social Psychology, Inc.

  14. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2015-09-01

    Full Text Available In recent years, schools (as well as universities) have added cyber security to their computer science curricula. This topic is still new for most of the current teachers, who would normally have a standard computer science background. Therefore the teachers are trained and then teach their students what they have just learned. In order to explore differences in both populations' learning, we compared measures of software quality and security between high-school teachers and students. We collected 109 source files, written in Python by 18 teachers and 31 students, and engineered 32 features, based on common standards for software quality (PEP 8) and security (derived from CERT Secure Coding Standards). We use a multi-view, data-driven approach, by (a) using hierarchical clustering to bottom-up partition the population into groups based on their code-related features and (b) building a decision tree model that predicts whether a student or a teacher wrote a given code (resulting in a LOOCV kappa of 0.751). Overall, our findings suggest that the teachers' codes have a better quality than the students' – with a sub-group of the teachers, mostly males, demonstrating better coding than their peers and the students – and that the students' codes are slightly better secured than the teachers' codes (although both populations show very low security levels). The findings imply that teachers might benefit from their prior knowledge and experience, but also emphasize the lack of continuous involvement of some of the teachers with code-writing. Therefore, findings shed light on computer science teachers as lifelong learners. Findings also highlight the difference between quality and security in today's programming paradigms. Implications for these findings are discussed.
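
    The two analysis steps described above (hierarchical clustering of the engineered features, then a decision tree scored with leave-one-out cross-validation and Cohen's kappa) can be sketched with standard tooling. The snippet below uses random placeholder data and off-the-shelf SciPy/scikit-learn calls for illustration; it is not the authors' dataset, feature set or model.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import cohen_kappa_score

        # Placeholder data: 49 source files x 32 code-related features (values are random).
        rng = np.random.default_rng(0)
        X = rng.random((49, 32))
        y = np.array([1] * 18 + [0] * 31)   # 1 = teacher, 0 = student (illustrative labels)

        # (a) bottom-up (hierarchical) clustering of the population
        clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

        # (b) decision tree evaluated with leave-one-out cross-validation, scored by kappa
        predictions = cross_val_predict(DecisionTreeClassifier(max_depth=3, random_state=0),
                                        X, y, cv=LeaveOneOut())
        print(clusters[:10], cohen_kappa_score(y, predictions))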

  15. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
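
    The round trip described above (write an input file from a list of inputs, run the external application, read its output files back) is a generic coupling pattern. The sketch below mimics that flow in Python purely for illustration; the actual interface is a compiled DLL called by GoldSim, and the command, file names and formats shown here are hypothetical.

        import subprocess
        from pathlib import Path

        # Illustrative stand-in for the input-file / run / output-file round trip.
        # The external command and file formats are hypothetical placeholders.
        def run_external(inputs, workdir="run", exe="external_app"):
            work = Path(workdir)
            work.mkdir(exist_ok=True)
            (work / "inputs.txt").write_text(
                "\n".join(f"{name} = {value}" for name, value in inputs.items()))
            subprocess.run([exe, "inputs.txt"], cwd=work, check=True)      # run the external code
            outputs = {}
            for line in (work / "outputs.txt").read_text().splitlines():   # read results back
                name, value = line.split("=")
                outputs[name.strip()] = float(value)
            return outputs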

  16. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  17. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; TrUbnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code, used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of spectrometer differential nonlinearity to +0.7% over 98% of the measured range. The converter exploits the regular alternation of ones and zeroes in each digit of the Grey code as the pulse count of the continuous code changes continuously. The converter is built from series-155 logic elements; the pulse repetition frequency of the continuous code at the converter input is 25 MHz.
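
    For context, the standard software relations between a binary word and its Grey (Gray) code are shown below; the hardware converter described above operates on a pulse-count (continuous) code rather than on a parallel binary word, so this is only the underlying arithmetic, not the described circuit.

        # Standard binary <-> Grey (Gray) code relations, shown only as background arithmetic.
        def binary_to_gray(n):
            return n ^ (n >> 1)

        def gray_to_binary(g):
            n = 0
            while g:          # successive XOR of right shifts undoes the Gray mapping
                n ^= g
                g >>= 1
            return n

        assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(4096))  # 12-bit range
        print([format(binary_to_gray(i), "04b") for i in range(8)])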

  18. User's manual to the ICRP Code: a series of computer programs to perform dosimetric calculations for the ICRP Committee 2 report

    International Nuclear Information System (INIS)

    Watson, S.B.; Ford, M.R.

    1980-02-01

    A computer code has been developed that implements the recommendations of ICRP Committee 2 for computing limits for occupational exposure to radionuclides. The purpose of this report is to describe the various modules of the computer code and to present a description of the methods and criteria used to compute the tables published in the Committee 2 report. The computer code contains three modules: (1) one computes specific effective energy; (2) one calculates cumulated activity; and (3) one computes dose and the series of ICRP tables. The description of the first two modules emphasizes the new ICRP Committee 2 recommendations in computing specific effective energy and cumulated activity. For the third module, the complex criteria are discussed for calculating the tables of committed dose equivalent, weighted committed dose equivalents, annual limit on intake, and derived air concentration.
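
    The three modules combine in the familiar ICRP Committee 2 (Publication 30) bookkeeping, in which the committed dose equivalent to a target organ is essentially a sum over source organs of cumulated activity times specific effective energy. The snippet below is only a schematic of that combination, with an assumed unit-conversion constant and invented numbers; it is not the code described in this manual.

        # Schematic ICRP 30-style bookkeeping (assumed form, illustrative numbers):
        #   H_50,T (Sv) ~= 1.6e-10 * sum_S  U_S * SEE(T <- S)
        # with U_S the cumulated activity (nuclear transformations) in source organ S
        # and SEE in MeV per gram per transformation.
        CONVERSION = 1.6e-10  # assumed (MeV/g per transformation) -> Sv conversion factor

        def committed_dose(cumulated_activity, see):
            """cumulated_activity: {source: U_S}; see: {source: SEE(T<-S)} for one target."""
            return CONVERSION * sum(u * see[s] for s, u in cumulated_activity.items())

        print(committed_dose({"lung": 4.2e12, "liver": 1.1e11},
                             {"lung": 3.0e-4, "liver": 2.0e-6}))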

  19. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  20. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, how to determine the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  1. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  2. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  3. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis for the operating PWRs as well as the PWRs under construction in Korea. TASS code will replace various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. TASS code has been programmed using FORTRAN77, which makes it easy to install and port for different computer environments. The TASS code can be utilized for the steady state simulation as well as the non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). The malfunctions of the control systems, components, operator actions and the transients caused by the malfunctions can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  4. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  6. Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics

    Science.gov (United States)

    Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50 000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) a n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior while the coding regions show logarithmic behavior over a wide interval, while (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of the zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
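
    Both measures used above are straightforward to reproduce on any symbolic sequence: rank the n-tuple frequencies for a Zipf-style plot and compute the Shannon n-gram entropy. The sketch below applies them to a random toy sequence and is purely illustrative; it is not the authors' analysis pipeline.

        import math
        import random
        from collections import Counter

        # Toy Zipf-style ranking and n-gram entropy for a symbolic sequence (illustration only).
        random.seed(0)
        seq = "".join(random.choice("ACGT") for _ in range(10000))
        n = 3

        counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
        total = sum(counts.values())

        ranked = counts.most_common()      # (n-tuple, frequency) pairs in descending rank order
        entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
        print(ranked[:5], round(entropy, 3))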

  7. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  8. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.

  9. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  10. “There Are Too Many, but Never Enough": Qualitative Case Study Investigating Routine Coding of Clinical Information in Depression

    Science.gov (United States)

    Cresswell, Kathrin; Morrison, Zoe; Sheikh, Aziz; Kalra, Dipak

    2012-01-01

    Background We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. Materials and Methods Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. Results Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. Conclusions There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined. PMID:22937106

  11. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skills

  12. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

    Full Text Available The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms are distributed to university students personally and via the internet. This paper includes significant research questions such as "Are accounting academicians informed and knowledgeable on the new Turkish commercial code and Turkish accounting standards?", "Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?", "How do modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?", "Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?" and "If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?" The research contributes to the literature in several ways. Firstly, the new Turkish commercial code and Turkish accounting standards are current, significant topics for the accounting profession. Furthermore, accounting education provides a basis for implementations in the public and private sector. Besides, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency. That is definitely a critical concept also in terms of mergers, acquisitions and investments. Stakeholders of today's business world, such as investors, shareholders, entrepreneurs, auditors and government, are in need of more standardized global accounting principles. Thus, revision and redesign of accounting education play an important role. The emphasized points also clearly prove the necessity and functionality of this research.

  13. Nupack, the new Asme code for radioactive material transportation packaging containments

    International Nuclear Information System (INIS)

    Turula, P.

    1998-01-01

    The American Society of Mechanical Engineers (ASME) has added a new division to the nuclear construction section of its Boiler and Pressure Vessel Code (B and PVC). This Division, commonly referred to as 'Nupack', has been written to provide a consistent set of technical requirements for containment vessels of transportation packagings for high-level radioactive materials. This paper provides an introduction to Nupack, discusses some of its technical provisions, and describes how it can be used in the design and construction of packaging components. Nupack's general provisions and design requirements are emphasized, while treatment of materials, fabrication and inspection is left for another paper. Participation in the Nupack development work described in this paper was supported by the U.S. Department of Energy. (authors)

  14. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  15. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
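
    For reference, in static Shannon coding each symbol of probability p receives a codeword of length ⌈log2(1/p)⌉ taken from the binary expansion of the cumulative probability of the more probable symbols. The sketch below shows only that static construction as background for the dynamic variant discussed above; it is not the paper's algorithm.

        import math

        # Static Shannon code: sort symbols by decreasing probability and give each one the
        # first ceil(log2(1/p)) bits of the binary expansion of the cumulative probability
        # of the symbols that precede it.
        def shannon_code(probs):
            code, cumulative = {}, 0.0
            for symbol, p in sorted(probs.items(), key=lambda kv: -kv[1]):
                length = math.ceil(-math.log2(p))
                bits, frac = [], cumulative
                for _ in range(length):
                    frac *= 2
                    bits.append("1" if frac >= 1 else "0")
                    frac -= int(frac)
                code[symbol] = "".join(bits)
                cumulative += p
            return code

        print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))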

  16. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points with multiplicity γ(w), where w is the weight of ...

  17. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. A unified auditing system code was also realized by implementing CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual was published which describes the new models and correlations. The code coupling methods were verified through the exercise of plant applications. Education and training seminars and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  18. Acknowledging individual responsibility while emphasizing social determinants in narratives to promote obesity-reducing public policy: a randomized experiment.

    Directory of Open Access Journals (Sweden)

    Jeff Niederdeppe

    Full Text Available This study tests whether policy narratives designed to increase support for obesity-reducing public policies should explicitly acknowledge individual responsibility while emphasizing social, physical, and economic (social) determinants of obesity. We use a web-based, randomized experiment with a nationally representative sample of American adults (n = 718) to test hypotheses derived from theory and research on narrative persuasion. Respondents exposed to narratives that acknowledged individual responsibility while emphasizing obesity's social determinants were less likely to engage in counterargument and felt more empathy for the story's main character than those exposed to a message that did not acknowledge individual responsibility. Counterarguing and affective empathy fully mediated the relationship between message condition and support for policies to reduce rates of obesity. Failure to acknowledge individual responsibility in narratives emphasizing social determinants of obesity may undermine the persuasiveness of policy narratives. Omitting information about individual responsibility, a strongly-held American value, invites the public to engage in counterargument about the narratives and reduces feelings of empathy for a character that experiences the challenges and benefits of social determinants of obesity.

  19. Acknowledging individual responsibility while emphasizing social determinants in narratives to promote obesity-reducing public policy: a randomized experiment.

    Science.gov (United States)

    Niederdeppe, Jeff; Roh, Sungjong; Shapiro, Michael A

    2015-01-01

    This study tests whether policy narratives designed to increase support for obesity-reducing public policies should explicitly acknowledge individual responsibility while emphasizing social, physical, and economic (social) determinants of obesity. We use a web-based, randomized experiment with a nationally representative sample of American adults (n = 718) to test hypotheses derived from theory and research on narrative persuasion. Respondents exposed to narratives that acknowledged individual responsibility while emphasizing obesity's social determinants were less likely to engage in counterargument and felt more empathy for the story's main character than those exposed to a message that did not acknowledge individual responsibility. Counterarguing and affective empathy fully mediated the relationship between message condition and support for policies to reduce rates of obesity. Failure to acknowledge individual responsibility in narratives emphasizing social determinants of obesity may undermine the persuasiveness of policy narratives. Omitting information about individual responsibility, a strongly-held American value, invites the public to engage in counterargument about the narratives and reduces feelings of empathy for a character that experiences the challenges and benefits of social determinants of obesity.

  20. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.

  1. User's manual to the ICRP Code: a series of computer programs to perform dosimetric calculations for the ICRP Committee 2 report

    Energy Technology Data Exchange (ETDEWEB)

    Watson, S.B.; Ford, M.R.

    1980-02-01

    A computer code has been developed that implements the recommendations of ICRP Committee 2 for computing limits for occupational exposure to radionuclides. The purpose of this report is to describe the various modules of the computer code and to present a description of the methods and criteria used to compute the tables published in the Committee 2 report. The computer code contains three modules: (1) one computes specific effective energy; (2) one calculates cumulated activity; and (3) one computes dose and the series of ICRP tables. The description of the first two modules emphasizes the new ICRP Committee 2 recommendations for computing specific effective energy and cumulated activity. For the third module, the complex criteria are discussed for calculating the tables of committed dose equivalent, weighted committed dose equivalents, annual limit on intake, and derived air concentration.

  2. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  3. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.

  4. New quantum codes derived from a family of antiprimitive BCH codes

    Science.gov (United States)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q²-ary BCH codes with length n = q^(2m)+1 (also called antiprimitive BCH codes in the literature), where q≥4 is a power of 2 and m≥2. By a detailed analysis of some useful properties of q²-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q²-ary primitive BCH codes. Consequently, via the Hermitian Construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.

  5. Surface acoustic wave coding for orthogonal frequency coded devices

    Science.gov (United States)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
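
    As an illustration of that matrix-based assignment, here is a hedged sketch in which every device receives the same stepped chip frequencies in a different time order (a simple cyclic shift). The frequencies and the shift rule are assumptions for illustration; the patented construction may differ.

      def build_ofc_matrix(num_devices, base_freqs):
          """Toy OFC assignment: row i is a cyclic shift of the stepped chip
          frequencies, so each device uses the same chips in a different order.
          Requires num_devices <= len(base_freqs) for the rows to be distinct."""
          n = len(base_freqs)
          return [[base_freqs[(i + j) % n] for j in range(n)] for i in range(num_devices)]

      # Assumed chip frequencies in MHz (illustrative only).
      chips = [250, 255, 260, 265, 270]
      matrix = build_ofc_matrix(num_devices=4, base_freqs=chips)
      for tag_id, code in enumerate(matrix):
          print(f"tag {tag_id}: chip order {code}")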

  6. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available Abstract This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  7. KETERSEDIAAN HAYATI ZAT BESI, KANDUNGAN ZAT PEMICU DAN PENGHAMBAT PENYERAPAN ZAT BESI DALAM MAKANAN IBU HAMIL

    Directory of Open Access Journals (Sweden)

    M. Saidin

    2012-11-01

    Full Text Available The iron content, iron bioavailability (in vitro method), enhancers and inhibitors of iron absorption were investigated in three different staple foods of diets in the district of Boyolali, Central Java. The results revealed that the average iron content of the diets based on rice, corn and cassava were 18.8 mg, 17.8 mg and 19.9 mg, respectively, or equal to 34.3%, 32.0% and 35.0% of the Recommended Dietary Allowances (RDA) for Indonesia. The average vitamin C content of the diets based on rice, corn and cassava were 21.9 mg (31.3% RDA), 21.1 mg (30.1% RDA) and 17.3 mg (24.7% RDA), respectively. The average protein content of the diets based on rice, corn and cassava were 47.1 g (78.5% RDA), 50.0 g (83.3% RDA) and 31.1 g (51.8% RDA), respectively. The average content of tannic acid and phytic acid as inhibitors of iron absorption in the diets based on rice, corn and cassava were (1154 mg and 261.5 mg), (980 mg and 342.7 mg) and (838 mg and 341.5 mg), respectively. An addition of 100 mg of vitamin C or papaya fruit (250 mg) into the diets increased iron bioavailability up to 54.2%. Keywords: iron bioavailability, enhancer agent, inhibitor agent, in vitro.

  8. BETWEEN DARKNESS AND SILENCE: THE RELATIONSHIP BETWEEN TICS AND DEAF-BLIND USING THE MORSE CODE TOOL

    Directory of Open Access Journals (Sweden)

    Calixto Júnior de Souza

    2017-10-01

    Full Text Available Based on the assumption that Information and Communication Technologies (ICTs) have today assumed a primary role in access to knowledge, it is necessary to reflect on how these technologies can be used to enhance the process of educational inclusion. The objective of this study is therefore to analyze the advances and/or setbacks of ICTs for students with deafblindness, based on the use of the Morse code tool. It is important to emphasize that students with deafblindness are not considered as having multiple disabilities, but rather as subjects who have a limitation in the combination of the visual and auditory senses. In this study, therefore, the use of Morse code enables the teacher to strengthen teaching and learning processes in order to involve, mediate and include students with deafblindness according to their educational needs and potentialities. The limitations of these two senses can thus be compensated for by the Morse code tool in order to foster communication between the student with deafblindness and other people, in a constant process of knowledge exchange.
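
    As a concrete illustration of the kind of tool the record discusses, here is a minimal Morse encoder sketch (letters only; the mapping is the standard international Morse table, and rendering the output as vibration or light pulses is left to the assistive device):

      MORSE = {
          "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".", "F": "..-.",
          "G": "--.", "H": "....", "I": "..", "J": ".---", "K": "-.-", "L": ".-..",
          "M": "--", "N": "-.", "O": "---", "P": ".--.", "Q": "--.-", "R": ".-.",
          "S": "...", "T": "-", "U": "..-", "V": "...-", "W": ".--", "X": "-..-",
          "Y": "-.--", "Z": "--..",
      }

      def to_morse(text):
          """Encode a word letter by letter; characters not in the table are skipped."""
          return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

      print(to_morse("inclusao"))   # ".. -. -.-. .-.. ..- ... .- ---"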

  9. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  10. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
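
    The per-subband adaptation step can be sketched independently of the turbo code: each subband's estimated SNR is mapped to the highest-order constellation whose switching threshold it meets. The thresholds and SNR values below are assumptions for illustration only, not those used in the paper.

      # Assumed switching thresholds in dB (illustrative; real thresholds depend on
      # the target BER and the turbo code rate).
      THRESHOLDS = [(24.0, "64QAM"), (18.0, "16QAM"), (12.0, "8AMPM"),
                    (6.0, "QPSK"), (0.0, "BPSK")]

      def pick_modulation(snr_db):
          """Return the highest-order scheme whose threshold the subband SNR meets."""
          for threshold, scheme in THRESHOLDS:
              if snr_db >= threshold:
                  return scheme
          return None  # subband too poor: leave it unused

      subband_snrs = [3.2, 14.7, 26.1, -2.0, 19.5]   # assumed channel estimates
      plan = [pick_modulation(s) for s in subband_snrs]
      print(plan)   # ['BPSK', '8AMPM', '64QAM', None, '16QAM']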

  11. Quantum Codes From Cyclic Codes Over The Ring R 2

    International Nuclear Information System (INIS)

    Altinel, Alev; Güzeltepe, Murat

    2016-01-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ² = 0, υ² = 0, w² = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 to contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2. (paper)

  12. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    Full Text Available A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it patches subsequences with “0s” leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10−9 was achieved.
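
    The extended grouped new modified prime code itself is not spelled out in the abstract, but the classical prime code it extends is easy to sketch: over GF(p) for a prime p, codeword i places a single pulse at position (i*j mod p) of block j, giving p codewords of length p*p with low cross-correlation. A minimal sketch of the classical construction only (not the proposed code):

      def prime_code(p):
          """Classical prime code over GF(p): p codewords, each of length p*p,
          with one pulse per block at position (i*j) mod p."""
          codes = []
          for i in range(p):
              chips = [0] * (p * p)
              for j in range(p):                  # block index
                  chips[j * p + (i * j) % p] = 1  # single pulse inside block j
              codes.append(chips)
          return codes

      for i, c in enumerate(prime_code(5)):
          print(f"codeword {i}: {''.join(map(str, c))}")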

  13. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    Science.gov (United States)

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  14. La norma RDA: Recursos, Descripción y Acceso y la adaptación al cambio en los sistemas bibliográficos en España.

    Directory of Open Access Journals (Sweden)

    Maria Osuna Alarcón

    2015-06-01

    Full Text Available One year after the implementation of the RDA standard in the USA, we consider it necessary to revisit the topic in order to understand where we now stand. This standard represents a major change in the cataloguing of documents and in the retrieval of library information via the web. Through contact with the professionals responsible for its implementation, we find that the success of its application in Spain depends on how the change is managed. Coordination among the different groups of Spanish libraries is therefore essential. Coordinating its implementation requires, first of all, the creation of a consortium to manage the investment, which would be an essential element for obtaining immediate results. Success will be more or less visible, costly and rapid depending on how the implementation is coordinated. The key lies in how the innovation is managed by the Integrated Library Management System. Adapting current systems may generate a significant cost for library budgets. Many libraries will prefer to change systems; others will have to absorb the new costs of upgrades, and so on. The reality is that a strong and united position on the part of potential users will make it possible to save on the change of technological model.

  15. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  16. Ethics and skeptics: what lies behind ethical codes in occupational health.

    Science.gov (United States)

    Guidotti, Tee L

    2005-02-01

    Ethical codes, or systems, are conditioned and their enforcement is permitted by social processes and attitudes. In occupational health, our efforts to adhere to our own ethical frameworks often are undermined by forces and interests outside the field. Failure to acknowledge the profoundly social nature of ethical codes impedes our ability to anticipate consequences, to legitimate decisions based on utility and benefit, and to find social structures that support, rather than invalidate, our view of ethical behavior. We examine three sets of social philosophies. Jane Jacobs, the visionary urban planner, has written Systems of Survival: A Dialogue on the Moral Foundations of Commerce and Politics, which is a restatement in modern terms of a critical passage in Plato's most important dialogue, the Republic. She (and Plato) postulate two major ethical systems, renamed here the "guardian system," which is characterized by loyalty, cohesiveness, and confidentiality, and the "marketplace system," which is characterized by trade, decentralization, and shared information. Occupational health, in this formulation, often runs afoul of the guardian mentality and also may be subject to inappropriate negotiation and compromise in the marketplace system. George Lakoff, a semiotician, has written Moral Politics: What Conservatives Know That Liberals Don't, which argues that there are two fundamental social paradigms based on concepts of the family. One, which he calls the Strict Father, emphasizes discipline, the positive aspects of taking risks, and the need for individuals to be self-sufficient. The other, which he calls the Nurturing Parent, emphasizes empowerment, the positive aspects of security, and the need for community and relationships. Occupational health practice violates aspects of both and therefore is supported by neither. Classical Chinese thought involved many schools of thought, including Confucianism and Legalism. It has been suggested that Confucianism provides

  17. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
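
    For readers who want to try this out, generating a QR code takes only a couple of lines. The sketch below assumes the third-party Python qrcode package (installed with Pillow support); it is not part of the article.

      import qrcode  # assumed dependency: pip install qrcode[pil]

      # A URL easily fits within a QR code's capacity (up to 7,089 numeric characters).
      img = qrcode.make("https://www.example.org/class-resources")
      img.save("class_resources_qr.png")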

  18. Some Families of Asymmetric Quantum MDS Codes Constructed from Constacyclic Codes

    Science.gov (United States)

    Huang, Yuanyuan; Chen, Jianzhang; Feng, Chunhui; Chen, Riqing

    2018-02-01

    Quantum maximal-distance-separable (MDS) codes that satisfy quantum Singleton bound with different lengths have been constructed by some researchers. In this paper, seven families of asymmetric quantum MDS codes are constructed by using constacyclic codes. We weaken the case of Hermitian-dual containing codes that can be applied to construct asymmetric quantum MDS codes with parameters [[n,k,dz/dx

  19. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  20. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  1. Fuel relocation modeling in the SAS4A accident analysis code system

    International Nuclear Information System (INIS)

    Tentner, A.M.; Miles, K.J.; Kalimullah; Hill, D.J.

    1986-01-01

    The SAS4A code system has been designed for the analysis of the initial phase of Hypothetical Core Disruptive Accidents (HCDAs) up to gross melting or failure of the subassembly walls. During such postulated accident scenarios as the Loss-of-Flow (LOF) and Transient-Overpower (TOP) events, the relocation of the fuel plays a key role in determining the sequence of events and the amount of energy produced before neutronic shutdown. This paper discusses the general strategy used in modeling the various phenomena which lead to fuel relocation and presents the key fuel relocation models used in SAS4A. The implications of these models for the whole-core accident analysis as well as recent results of fuel relocation are emphasized. 12 refs

  2. Identification and characterization of a phytoestrogen-specific gene from the MCF-7 human breast cancer cell

    International Nuclear Information System (INIS)

    Ramanathan, Lakshmi; Gray, Wesley G.

    2003-01-01

    Phytoestrogens are a group of compounds present in human diet that display estrogenic-like properties. Several studies have demonstrated that populations who consume large quantities of phytoestrogens have a reduced risk of estrogen-dependent cancers. Although it has been shown that certain phytoestrogens modulate estrogen action, their biological role in cancer reduction remains unclear. Through the use of differential display reverse transcriptase-polymerase chain reaction and representational difference analysis of cDNA, we have identified several phytoestrogen-responsive genes from the human breast cancer cell MCF-7. Two of these genes, PE-13.1 and pRDA-D, have been characterized in greater detail in this study. These genes were not previously known to be regulated by phytoestrogen or estradiol. PE-13.1 is a novel gene that specifies the coding of a 1.10-kb mRNA transcript. Northern blot analysis confirmed that the PE-13.1 transcript is up-regulated by phytoestrogens (Genistein, sevenfold; Zearalenone, twofold) and is nonresponsive to estradiol. Conversely, the pRDA-D transcript was down-regulated by both phytoestrogens and estradiol. The antiestrogen ICI-182,780 inhibits the expression of PE-13.1 and reverses the inhibition of pRDA-D expression induced by phytoestrogens and estradiol. Analysis of the tissue distribution of PE-13.1 transcript by RNA blot reveals that this transcript is expressed in both normal and tumor tissues. This report demonstrates for the first time the presence of two phytoestrogen-responsive genes that may be used as molecular markers in understanding the role dietary estrogen plays in cancer prevention

  3. Bovine viral diarrhea (BVD: A review emphasizing on Iran perspective

    Directory of Open Access Journals (Sweden)

    Mohammad Khezri

    2015-09-01

    Full Text Available Bovine viral diarrhea (BVD) is one of the most important diseases of cattle, responsible for major economic losses in the dairy industries of Iran. So far, no nationwide program has been undertaken in Iran to control and eradicate the disease. Moreover, until now, no vaccination program has been implemented against BVD in Iran, although the disease is prevalent in the country. For effective control of BVD, it is necessary to cull affected animals and to prevent new introductions of BVD into the farm. Focusing on biosecurity in systematic control programs of BVD can also reduce the risks of introduction and spread of other epizootic and zoonotic diseases, thereby improving both cattle health and welfare in general. In this review paper, an overview of BVD with an emphasis on the Iranian perspective is presented, focusing on clinical manifestations of BVD, routes of transmission of BVD virus (BVDV), its diagnostic methods and possible prevention strategies. [J Adv Vet Anim Res 2015; 2(3): 240-251]

  4. Local gene regulation details a recognition code within the LacI transcriptional factor family.

    Directory of Open Access Journals (Sweden)

    Francisco M Camas

    2010-11-01

    Full Text Available The specific binding of regulatory proteins to DNA sequences exhibits no clear patterns of association between amino acids (AAs) and nucleotides (NTs). This complexity of protein-DNA interactions raises the question of whether a simple set of wide-coverage recognition rules can ever be identified. Here, we analyzed this issue using the extensive LacI family of transcriptional factors (TFs). We searched for recognition patterns by introducing a new approach to phylogenetic footprinting, based on the pervasive presence of local regulation in prokaryotic transcriptional networks. We identified a set of specificity correlations--determined by two AAs of the TFs and two NTs in the binding sites--that is conserved throughout a dominant subgroup within the family regardless of the evolutionary distance, and that act as a relatively consistent recognition code. The proposed rules are confirmed with data of previous experimental studies and by events of convergent evolution in the phylogenetic tree. The presence of a code emphasizes the stable structural context of the LacI family, while defining a precise blueprint to reprogram TF specificity with many practical applications.
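
    The recognition rules in the record pair particular amino acids in the TF with particular bases in the binding site. One generic way to score such a pairing from aligned (AA, NT) observations is mutual information; the sketch below uses made-up data and is only an illustration of covariation scoring, not the phylogenetic footprinting method of the paper.

      import math
      from collections import Counter

      def mutual_information(pairs):
          """Mutual information (bits) between the AA and the NT observed at one
          (TF position, site position) combination across many regulons."""
          n = len(pairs)
          joint = Counter(pairs)
          aa = Counter(a for a, _ in pairs)
          nt = Counter(b for _, b in pairs)
          return sum((c / n) * math.log2((c / n) / ((aa[a] / n) * (nt[b] / n)))
                     for (a, b), c in joint.items())

      # Assumed toy data: amino acid at one TF position vs. base at one site position.
      observations = [("K", "A"), ("K", "A"), ("R", "G"), ("R", "G"),
                      ("K", "A"), ("R", "G"), ("K", "T")]
      print(f"MI = {mutual_information(observations):.2f} bits")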

  5. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  6. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.

  7. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated using the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily modified to any other lattice code. The description of the code assumes a basic knowledge of reactor lattices, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  8. CodeArmor : Virtualizing the Code Space to Counter Disclosure Attacks

    NARCIS (Netherlands)

    Chen, Xi; Bos, Herbert; Giuffrida, Cristiano

    2017-01-01

    Code diversification is an effective strategy to prevent modern code-reuse exploits. Unfortunately, diversification techniques are inherently vulnerable to information disclosure. Recent diversification-aware ROP exploits have demonstrated that code disclosure attacks are a realistic threat, with an

  9. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  10. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2

  11. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    Science.gov (United States)

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnosis and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address reasons behind coding errors of shoulder diagnosis and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that an entirely correct diagnosis was assigned in only 54 patients (54%), and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2) and the correct procedure code (odds ratio 310.0) than the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.
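
    The comparison above rests on 2x2 odds ratios. As a reminder of the arithmetic, with purely hypothetical counts rather than the study's data:

      def odds_ratio(a, b, c, d):
          """Odds ratio of a 2x2 table:
                        correct  incorrect
          proforma         a         b
          coding dept      c         d
          """
          return (a * d) / (b * c)

      # Hypothetical counts for illustration only.
      print(odds_ratio(a=80, b=20, c=30, d=70))   # 9.33: ~9x the odds of a correct code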

  12. Estimated Nutritive Value of Low-Price Model Lunch Sets Provided to Garment Workers in Cambodia

    Directory of Open Access Journals (Sweden)

    Jan Makurat

    2017-07-01

    Full Text Available Background: The establishment of staff canteens is expected to improve the nutritional situation of Cambodian garment workers. The objective of this study is to assess the nutritive value of low-price model lunch sets provided at a garment factory in Phnom Penh, Cambodia. Methods: Exemplary lunch sets were served to female workers through a temporary canteen at a garment factory in Phnom Penh. Dish samples were collected repeatedly to examine mean serving sizes of individual ingredients. Food composition tables and NutriSurvey software were used to assess mean amounts and contributions to recommended dietary allowances (RDAs) or adequate intake of energy, macronutrients, dietary fiber, vitamin C (VitC), iron, vitamin A (VitA), folate and vitamin B12 (VitB12). Results: On average, lunch sets provided roughly one third of RDA or adequate intake of energy, carbohydrates, fat and dietary fiber. Contribution to RDA of protein was high (46% RDA). The sets contained a high mean share of VitC (159% RDA), VitA (66% RDA), and folate (44% RDA), but were low in VitB12 (29% RDA) and iron (20% RDA). Conclusions: Overall, lunches satisfied recommendations of caloric content and macronutrient composition. Sets on average contained a beneficial amount of VitC, VitA and folate. Adjustments are needed for a higher iron content. Alternative iron-rich foods are expected to be better suited, compared to increasing portions of costly meat/fish components. Lunch provision at Cambodian garment factories holds the potential to improve food security of workers, approximately at costs of <1 USD/person/day at large scale. Data on quantitative total dietary intake as well as physical activity among workers are needed to further optimize the concept of staff canteens.
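
    The nutritive assessment behind these figures reduces to simple arithmetic once serving sizes and a food composition table are fixed: percent of RDA = amount provided / RDA x 100. A minimal sketch with assumed reference values and an assumed lunch composition (illustrative only; the study used NutriSurvey and its own reference tables):

      # Assumed adult female reference values, used only for illustration.
      RDA = {"protein_g": 55, "vitamin_c_mg": 75, "iron_mg": 18, "folate_ug": 400}

      def percent_rda(nutrient, amount):
          return 100.0 * amount / RDA[nutrient]

      # Assumed nutrient content of one lunch set (illustrative values).
      lunch = {"protein_g": 25.3, "vitamin_c_mg": 119.0, "iron_mg": 3.6, "folate_ug": 176.0}
      for nutrient, amount in lunch.items():
          print(f"{nutrient}: {percent_rda(nutrient, amount):.0f}% of RDA")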

  13. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored previously.

  14. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  15. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the specific forms of code switching and code mixing found in teaching and learning activities in the classroom, and to determine the factors influencing those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. The analysis shows that code mixing and code switching in learning activities at Al Mawaddah Boarding School occur between Javanese, Arabic, English and Indonesian, in the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah Boarding School, with respect to the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and effective teaching and learning strategies in boarding schools.

  16. Chromatin Regulation and the Histone Code in HIV Latency.

    Science.gov (United States)

    Turner, Anne-Marie W; Margolis, David M

    2017-06-01

    The formation of a latent reservoir of Human Immunodeficiency Virus (HIV) infection hidden from immune clearance remains a significant obstacle to approaches to eradicate HIV infection. Towards an understanding of the mechanisms of HIV persistence, there is a growing body of work implicating epigenetic regulation of chromatin in establishment and maintenance of this latent reservoir. Here we discuss recent advances in the field of chromatin regulation, specifically in our understanding of the histone code, and how these discoveries relate to our current knowledge of the chromatin mechanisms linked to HIV transcriptional repression and the reversal of latency. We also examine mechanisms unexplored in the context of HIV latency and briefly discuss current therapies aimed at the induction of proviral expression within latently infected cells. We aim to emphasize that a greater understanding of the epigenetic mechanisms which govern HIV latency could lead to new therapeutic targets for latency reversal and clearance cure strategies.

  17. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  18. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, the decoder implementation is one of the big practical problems and low complexity decoding has been studied. This paper addresses a low complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding, and reduce it by optimizing CRC positions in combination with a modified decoding operation. Resultingly, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.
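
    The CRC codes in such schemes act as a check on each surviving decoding path: a candidate is kept only if its information bits reproduce the attached CRC. A minimal sketch of that check with a generic bitwise CRC (the 4-bit polynomial below is an arbitrary illustration, not the one standardized for 5G):

      def crc_bits(info, poly=(1, 0, 0, 1, 1)):
          """Remainder of info * x^r divided by the generator polynomial (mod 2).
          poly lists the generator's coefficients from highest to lowest degree."""
          r = len(poly) - 1
          buf = list(info) + [0] * r
          for i in range(len(info)):
              if buf[i]:
                  for j, p in enumerate(poly):
                      buf[i + j] ^= p
          return buf[-r:]

      info = [1, 0, 1, 1, 0, 0, 1, 0]        # assumed payload bits
      codeword = info + crc_bits(info)       # what the encoder would transmit
      # A list decoder keeps only candidate paths whose recomputed CRC matches:
      print(crc_bits(codeword[:-4]) == codeword[-4:])   # True for the error-free path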

  19. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  20. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  1. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  2. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  3. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  4. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  5. Digital filtering and reconstruction of coded aperture images

    International Nuclear Information System (INIS)

    Tobin, K.W. Jr.

    1987-01-01

    The real-time neutron radiography facility at the University of Virginia has been used for both transmission radiography and computed tomography. Recently, a coded aperture system has been developed to permit the extraction of three dimensional information from a low intensity field of radiation scattered by an extended object. Short wave-length radiations (e.g. neutrons) are not easily imaged because of the difficulties in achieving diffraction and refraction with a conventional lens imaging system. By using a coded aperture approach, an imaging system has been developed that records and reconstructs an object from an intensity distribution. This system has a signal-to-noise ratio that is proportional to the total open area of the aperture, making it ideal for imaging with a limited-intensity radiation field. The main goal of this research was to develop and implement the digital methods and theory necessary for the reconstruction process. Several real-time video systems, attached to an Intellect-100 image processor, a DEC PDP-11 micro-computer, and a Convex-1 parallel processing mainframe were employed. This system, coupled with theoretical extensions and improvements, allowed for retrieval of information previously unobtainable by earlier optical methods. The effects of thermal noise, shot noise, and aperture-related artifacts were examined so that new digital filtering techniques could be constructed and implemented. Results of image data filtering prior to and following the reconstruction process are reported. Improvements related to the different signal processing methods are emphasized. The application and advantages of this imaging technique to the field of non-destructive testing are also discussed

  6. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  7. Simulation of nonlinear propagation of biomedical ultrasound using pzflex and the Khokhlov-Zabolotskaya-Kuznetsov Texas code.

    Science.gov (United States)

    Qiao, Shan; Jackson, Edward; Coussios, Constantin C; Cleveland, Robin O

    2016-09-01

    Nonlinear acoustics plays an important role in both diagnostic and therapeutic applications of biomedical ultrasound and a number of research and commercial software packages are available. In this manuscript, predictions of two solvers available in a commercial software package, pzflex, one using the finite-element-method (FEM) and the other a pseudo-spectral method, spectralflex, are compared with measurements and the Khokhlov-Zabolotskaya-Kuznetsov (KZK) Texas code (a finite-difference time-domain algorithm). The pzflex methods solve the continuity equation, momentum equation and equation of state where they account for nonlinearity to second order whereas the KZK code solves a nonlinear wave equation with a paraxial approximation for diffraction. Measurements of the field from a single element 3.3 MHz focused transducer were compared with the simulations and there was good agreement for the fundamental frequency and the harmonics; however the FEM pzflex solver incurred a high computational cost to achieve equivalent accuracy. In addition, pzflex results exhibited non-physical oscillations in the spatial distribution of harmonics when the amplitudes were relatively low. It was found that spectralflex was able to accurately capture the nonlinear fields at reasonable computational cost. These results emphasize the need to benchmark nonlinear simulations before using codes as predictive tools.

  8. [Assessment of efficiency of dietotherapy with addition of a vitamin-mineral complex in patients with diabetes mellitus type 2].

    Science.gov (United States)

    Lapik, I A; Sokol'nikov, A A; Sharafetdinov, Kh Kh; Sentsova, T B; Plotnikova, O A

    2014-01-01

    The influence of diet inclusion of a vitamin and mineral complex (VMC), potassium and magnesium in the form of asparaginate on micronutrient status, body composition and biochemical parameters in patients with diabetes mellitus type 2 (DM2) has been investigated. 120 female patients with DM2 and obesity of I-III degree (mean age - 58 +/- 6 years) have been included in the study. The patients were divided into two groups: main group (n = 60) and control group (n = 60). For 3 weeks patients of both groups received a low-calorie diet (1600 kcal/day). Patients of the main group received VMC, providing an additional intake of vitamins C and E (100-120% RDA), beta-carotene (40% RDA), nicotinamide (38% RDA), pantothenic acid and biotin (60% RDA), vitamins B12, B2 and folic acid (75-83% RDA), vitamins B1 and B6 (160-300% RDA), zinc (100% RDA) and chromium (400% RDA), and also received magnesium (17.7% RDA) and potassium (9.4% RDA) in the form of asparaginate. Body composition, biochemical parameters and micronutrient status (blood serum level of vitamins C, D, B6, B12, folate, potassium, calcium, magnesium, zinc, phosphorus) were evaluated in all patients before and after the 3-week course of diet therapy. After the low-calorie diet therapy average body weight reduction was 4.2 +/- 0.2 kg in the main group, and 4.4 +/- 0.1 kg in the control group, without statistically significant differences between groups. Statistically significant decrease of total cholesterol, triglycerides, and glucose concentration in blood serum was registered in both groups. It should be noted that in the control group glycemia decreased by 1.2 +/- 0.1 mmol/l, while the main group showed a decrease of 1.8 +/- 0.1 mmol/l. Initial assessment of vitamin and mineral status revealed that most patients were optimally supplied with vitamins and minerals. After the dietotherapy significant increase of vitamin C, 25-hydroxyvitamin D, vitamin B6, folate, vitamin B12, potassium, magnesium, calcium, zinc and

  9. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X......, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low......-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information....

  10. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  11. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way of constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  12. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  13. Creating and connecting recommended practices for reproducible research through collaborative culture and consensus in the Research Data Alliance

    Science.gov (United States)

    Parsons, M. A.; Yarmey, L.; Dillo, I.

    2017-12-01

    Data are the foundation of a robust, efficient, and reproducible scientific enterprise. The Research Data Alliance (RDA) is a community-driven, action-oriented, virtual organization committed to enabling open sharing and reuse of data by building social and technical bridges. The international RDA community includes almost 6000 members bringing diverse perspectives, domain knowledge, and expertise to a common table for identification of common challenges and holistic solutions. RDA members work together to identify common interests and form exploratory Interest Groups and outcome-oriented Working Groups. Participants exchange knowledge, share discoveries, discuss barriers and potential solutions, articulate policies, and align standards to enhance and facilitate global data sharing within and across domains and communities. With activities defined and led by members, RDA groups have organically been addressing issues across the full research cycle with community-ratified Recommendations and other outputs that begin to create the components of a global, data-sharing infrastructure. This paper examines how multiple RDA Recommendations can be implemented together to improve data and information discoverability, accessibility, and interconnection by both people and machines. For instance, the Persistent Identifier Types can support moving data across platforms through the Data Description Registry Interoperability framework following the RDA/WDS Publishing Data Workflows model. The Scholix initiative, which connects scholarly literature and data across numerous stakeholders, can draw on the Practical Policy best practices for machine-actionable data policies. Where appropriate, we use a case study approach built around several flagship data sets from the Deep Carbon Observatory to examine how multiple RDA Recommendations can be implemented in actual practice.

  14. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  15. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Austregesilo, H.; Velkov, K. [GRS, Garching (Germany)] [and others

    1997-07-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes.
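
    To make the coupling idea above concrete, the following toy sketch alternates a lumped "neutronics" step and a lumped "thermal-hydraulics" step and exchanges power and fuel temperature each time step. It is not the ATHLET interface or any of the coupled 3D codes; every coefficient is a made-up illustrative number.

      # Toy explicit (operator-splitting) coupling: point power with Doppler
      # feedback exchanged with a single lumped fuel temperature each step.
      def coupled_transient(n_steps=1000, dt=1e-3):
          power, temp = 1.0, 300.0                  # normalized power, fuel temperature [K]
          rho_ext, alpha_doppler = 0.001, -2e-5     # inserted reactivity, feedback coefficient [1/K]
          Lambda, heat_cap, cooling = 1e-4, 50.0, 0.02
          for _ in range(n_steps):
              rho = rho_ext + alpha_doppler * (temp - 300.0)   # neutronics step sees the temperature
              power += dt * rho / Lambda * power               # crude prompt point kinetics
              temp += dt * (power * heat_cap - cooling * (temp - 300.0))  # thermal step sees the power
          return power, temp

      print(coupled_transient())   # final (normalized power, fuel temperature)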

  16. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    International Nuclear Information System (INIS)

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-01-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes

  17. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
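
    As a rough illustration of the kind of annotation involved, the snippet below writes a loop invariant as a runtime-checked assertion. This is only a stand-in for the formal, machine-checked annotations described above; it is not AUTOBAYES output.

      # A loop invariant expressed as a runtime assertion (illustrative only).
      def sum_of_squares(xs):
          total, i = 0.0, 0
          while i < len(xs):
              # invariant: total equals the sum of squares of xs[0..i-1], and 0 <= i <= len(xs)
              assert abs(total - sum(x * x for x in xs[:i])) < 1e-9 and 0 <= i <= len(xs)
              total += xs[i] * xs[i]
              i += 1
          return total

      print(sum_of_squares([1.0, 2.0, 3.0]))   # 14.0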

  18. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  19. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  20. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Full Text Available Successive cancellation list (SCL decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC decoding, provided that proper cyclic redundancy-check (CRC codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with CRC code as their outer code, the use of longer CRC code leads to reduction of information rate, whereas the use of shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.
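
    To make the CRC-aided candidate-selection step concrete, the hedged sketch below computes a CRC by polynomial division over GF(2) and keeps the first list-decoding candidate whose appended CRC checks. The 8-bit generator polynomial and the candidate format (information bits followed by CRC bits) are assumptions for illustration, not taken from the paper.

      # Generic CRC over GF(2) and CRC-based selection among decoding candidates.
      def crc_remainder(bits, poly):
          """bits, poly: lists of 0/1; remainder of bits * x^(len(poly)-1) divided by poly."""
          work = list(bits) + [0] * (len(poly) - 1)
          for i in range(len(bits)):
              if work[i]:
                  for j, p in enumerate(poly):
                      work[i + j] ^= p
          return work[-(len(poly) - 1):]

      def crc_select(candidates, poly):
          """Return the first candidate (info bits + CRC bits) whose CRC is consistent, else None."""
          r = len(poly) - 1
          for cand in candidates:
              info, crc = cand[:-r], cand[-r:]
              if crc_remainder(info, poly) == crc:
                  return cand
          return None

      poly = [1, 0, 0, 0, 0, 0, 1, 1, 1]             # x^8 + x^2 + x + 1 (illustrative CRC-8)
      info = [1, 0, 1, 1, 0, 0, 1, 0]
      good = info + crc_remainder(info, poly)
      bad = list(good); bad[0] ^= 1                  # a corrupted list-decoding candidate
      print(crc_select([bad, good], poly) == good)   # True: only the valid candidate checks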

  1. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...

  2. Strict optical orthogonal codes for purely asynchronous code-division multiple-access applications

    Science.gov (United States)

    Zhang, Jian-Guo

    1996-12-01

    Strict optical orthogonal codes are presented for purely asynchronous optical code-division multiple-access (CDMA) applications. The proposed code can strictly guarantee the peaks of its cross-correlation functions and the sidelobes of any of its autocorrelation functions to have a value of 1 in purely asynchronous data communications. The basic theory of the proposed codes is given. An experiment on optical CDMA systems is also demonstrated to verify the characteristics of the proposed code.
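
    The "strict" correlation constraints can be checked directly on 0/1 sequences: every periodic autocorrelation sidelobe and every periodic cross-correlation value should be at most 1. The snippet below does this for a textbook pair of length-13, weight-3 codewords; the example codewords are not taken from the paper.

      # Check the strict (at most 1) auto- and cross-correlation constraints.
      import numpy as np

      def periodic_correlation(a, b):
          a, b = np.asarray(a), np.asarray(b)
          return np.array([np.sum(a * np.roll(b, k)) for k in range(len(a))])

      def is_strict_ooc(codewords):
          for i, a in enumerate(codewords):
              if np.max(periodic_correlation(a, a)[1:]) > 1:      # autocorrelation sidelobes
                  return False
              for b in codewords[i + 1:]:
                  if np.max(periodic_correlation(a, b)) > 1:      # cross-correlation
                      return False
          return True

      c1 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]   # marks at positions {0, 1, 4}
      c2 = [1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # marks at positions {0, 2, 7}
      print(is_strict_ooc([c1, c2]))                 # True for this textbook pair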

  3. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  4. Quantum Codes From Negacyclic Codes over Group Ring (Fq + υFq)G

    International Nuclear Information System (INIS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-01-01

    In this paper, we determine self-dual and self-orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq)G. By taking a suitable Gray image of these codes we obtain many quantum error-correcting codes with good parameters over Fq. (paper)

  5. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. Convolutional, turbo, and low density parity-check (LDPC) coding and polar codes in a unified framework. Advanced research-related developments such as spatial coupling. A focus on algorithmic and implementation aspects of error control coding.

  6. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
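
    A minimal sketch of the syndrome-source-coding idea follows, using the Hamming(7,4) parity-check matrix as an assumed example code: the 7-bit source block is treated as an error pattern, its 3-bit syndrome is the compressed data, and the decoder returns the minimum-weight pattern with that syndrome (lossless here only when the block contains at most one 1, i.e. for a very sparse source).

      # Syndrome source coding with the Hamming(7,4) parity-check matrix (illustrative).
      import numpy as np

      H = np.array([[1, 0, 1, 0, 1, 0, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])

      def compress(x):                      # 7 source bits -> 3 syndrome bits
          return (H @ x) % 2

      def decompress(s):                    # minimum-weight pattern with syndrome s
          best = None
          for v in range(2**7):
              x = np.array([(v >> i) & 1 for i in range(7)])
              if np.array_equal((H @ x) % 2, s) and (best is None or x.sum() < best.sum()):
                  best = x
          return best

      x = np.array([0, 0, 0, 0, 1, 0, 0])   # sparse source block
      print(decompress(compress(x)))        # recovers x exactly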

  7. The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access

    Science.gov (United States)

    Schuster, D.; Worley, S. J.

    2013-12-01

    The NCAR Research Data Archive (RDA http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery-level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPENDAP and other web-based protocols primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is

  8. ComboCoding: Combined intra-/inter-flow network coding for TCP over disruptive MANETs

    Directory of Open Access Journals (Sweden)

    Chien-Chia Chen

    2011-07-01

    Full Text Available TCP over wireless networks is challenging due to random losses and ACK interference. Although network coding schemes have been proposed to improve TCP robustness against extreme random losses, a critical problem of DATA–ACK interference still remains. To address this issue, we use inter-flow coding between DATA and ACK to reduce the number of transmissions among nodes. In addition, we also utilize a “pipeline” random linear coding scheme with adaptive redundancy to overcome high packet loss over unreliable links. The resulting coding scheme, ComboCoding, combines intra-flow and inter-flow coding to provide robust TCP transmission in disruptive wireless networks. The main contributions of our scheme are twofold: the efficient combination of random linear coding and XOR coding on bi-directional streams (DATA and ACK), and the novel redundancy control scheme that adapts to time-varying and space-varying link loss. The adaptive ComboCoding was tested on a variable hop string topology with unstable links and on a multipath MANET with dynamic topology. Simulation results show that TCP with ComboCoding delivers higher throughput than with other coding options in high loss and mobile scenarios, while introducing minimal overhead in normal operation.
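
    The inter-flow (XOR) component can be illustrated with a toy relay example: XOR a DATA packet travelling in one direction with an ACK packet travelling in the other, broadcast the combination once, and let each endpoint recover the packet it is missing by XOR-ing with the packet it already sent. This is a generic network-coding illustration, not the ComboCoding implementation; packet contents are arbitrary.

      # Toy inter-flow XOR coding of a DATA packet and an ACK packet.
      def xor_packets(a: bytes, b: bytes) -> bytes:
          n = max(len(a), len(b))
          a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")   # zero-pad to equal length
          return bytes(x ^ y for x, y in zip(a, b))

      data, ack = b"payload-from-A", b"ack-from-B"
      coded = xor_packets(data, ack)      # single broadcast from the relay
      print(xor_packets(coded, ack))      # node B recovers the DATA packet
      print(xor_packets(coded, data))     # node A recovers the ACK packet (zero padding visible)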

  9. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  10. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  11. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
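
    For readers unfamiliar with the underlying operation, the snippet below is a generic sparse-coding step: plain ISTA solving the lasso, minimizing (1/2)*||x - D s||^2 + lam*||s||_1 for a fixed codebook D. It is not the semi-supervised algorithm of the paper, and the dictionary, test signal and regularization weight are made-up illustrative choices.

      # Generic sparse coding of one sample via ISTA (illustrative parameters).
      import numpy as np

      def sparse_code(x, D, lam=0.1, n_iter=200):
          L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
          s = np.zeros(D.shape[1])
          for _ in range(n_iter):
              grad = D.T @ (D @ s - x)
              z = s - grad / L
              s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
          return s

      rng = np.random.default_rng(0)
      D = rng.standard_normal((20, 50))
      D /= np.linalg.norm(D, axis=0)               # unit-norm codewords
      x = 2.0 * D[:, 3] - 1.5 * D[:, 17]           # signal built from two codewords
      print(np.nonzero(np.abs(sparse_code(x, D)) > 1e-3)[0])   # active codewords; typically dominated by 3 and 17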

  12. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
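
    The expansion of a QC-LDPC exponent matrix into a binary parity-check matrix, which the girth analysis above works on, can be sketched as follows. The small exponent matrix and lifting size are illustrative assumptions, not one of the jointly designed codes from the paper.

      # Expand an exponent matrix into a QC-LDPC parity-check matrix: each entry is the
      # cyclic shift of a Z x Z identity block, with -1 denoting an all-zero block.
      import numpy as np

      def expand_qc_ldpc(exponents, Z):
          I = np.eye(Z, dtype=int)
          blocks = [[np.zeros((Z, Z), dtype=int) if e < 0 else np.roll(I, e, axis=1)
                     for e in row] for row in exponents]
          return np.block(blocks)

      E = [[0, 1, -1, 2],
           [2, -1, 0, 1]]            # 2 x 4 exponent matrix (illustrative)
      H = expand_qc_ldpc(E, Z=4)     # 8 x 16 binary parity-check matrix
      print(H.shape, int(H.sum()))   # (8, 16) and 24 ones (6 non-zero blocks x 4)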

  13. Culture and National Well-Being: Should Societies Emphasize Freedom or Constraint?

    Science.gov (United States)

    Harrington, Jesse R; Boski, Pawel; Gelfand, Michele J

    2015-01-01

    Throughout history and within numerous disciplines, there exists a perennial debate about how societies should best be organized. Should they emphasize individual freedom and autonomy or security and constraint? Contrary to proponents who tout the benefits of one over the other, we demonstrate across 32 nations that both freedom and constraint exhibit a curvilinear relationship with many indicators of societal well-being. Relative to moderate nations, very permissive and very constrained nations exhibit worse psychosocial outcomes (lower happiness, greater dysthymia, higher suicide rates), worse health outcomes (lower life expectancy, greater mortality rates from cardiovascular disease and diabetes) and poorer economic and political outcomes (lower gross domestic product per capita, greater risk for political instability). This supports the notion that a balance between freedom and constraint results in the best national outcomes. Accordingly, it is time to shift the debate away from either constraint or freedom and focus on both in moderation.

  14. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jin; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  15. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2004-01-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures

  16. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  17. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  18. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  19. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  20. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  1. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  2. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  3. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques, and that is often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the
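
    As a concrete example of the waveform-coding family mentioned above, the snippet below applies textbook mu-law companding (the nonlinearity used in G.711-style PCM telephony) to quantize samples to 8 bits and expand them back. The sampling rate and test tone are arbitrary illustrative choices.

      # Mu-law companding: 8-bit logarithmic quantization of speech-like samples.
      import numpy as np

      MU = 255.0

      def mulaw_encode(x):            # x in [-1, 1] -> 8-bit code 0..255
          y = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
          return np.round((y + 1) / 2 * 255).astype(np.uint8)

      def mulaw_decode(code):         # 8-bit code -> reconstructed sample in [-1, 1]
          y = code.astype(float) / 255 * 2 - 1
          return np.sign(y) * ((1 + MU) ** np.abs(y) - 1) / MU

      x = np.sin(2 * np.pi * 440 * np.arange(0, 0.01, 1 / 8000))   # 10 ms of a tone at 8 kHz
      x_hat = mulaw_decode(mulaw_encode(x))
      print(np.max(np.abs(x - x_hat)))    # maximum reconstruction error (on the order of 1e-2)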

  4. Tri-code inductance control rod position indicator with several multi-coding-bars

    International Nuclear Information System (INIS)

    Shi Jibin; Jiang Yueyuan; Wang Wenran

    2004-01-01

    A control rod position indicator named the tri-code inductance control rod position indicator with multi-coding-bars, which possesses a simple structure, reliable operation and high precision, is developed. The detector of the indicator is composed of K coils, a compensatory coil and K coding bars. Each coding bar consists of several sections of strong magnetic cores, several sections of weak magnetic cores and several sections of non-magnetic portions. As the control rod is withdrawn, the coding bars move in the centers of the coils respectively, while a constant alternating current passes through the coils and makes them create inductive alternating voltage signals. The outputs of the coils are picked up and processed, and the tri-codes indicating the rod position can be obtained. Moreover, the coding principle of the detector and its related structure are introduced. The analysis shows that the indicator has advantages over the coils-coding rod position indicator, so it can meet the demands of rod position indication in the nuclear heating reactor (NHR). (authors)

  5. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  6. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data are retrieved, the translation into plain language of stored coded information is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval, economy of storage memory requirements, and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring-binder system which can be updated by an organised (updating) service. (author)

  7. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  8. The Ontario Energy Board's draft standard supply service code: effects on air quality

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, J.; Bjorkquist, S. [Ontario Clean Air Alliance, Toronto, ON (Canada)

    1999-06-29

    The Ontario Clean Air Alliance (OCAA), a coalition of 67 organizations, takes issue with the Ontario Energy Board's draft document 'Standard Supply Service Code', particularly sections 2.2.2. and 2.5.2 which they claim are not in the public interest unless the Ontario government implements the OCAA's recommended emission caps. The alliance is of the view that without strict new environmental regulations the proposed Code would encourage the use of coal for electricity generation. Public health, the environment, consumer interests, job creation and promotion of a competitive electricity market would all be jeopardized by this development, the alliance states. The argument is supported by extensive reference to the Final Report of the Ontario Market Design Committee (MDC) which also emphasized the importance of combining the introduction of competition with appropriate environmental regulations, singling out the emission cap and trade program, and recommending that it be launched concurrently with the electricity market opening for competition. The view of the MDC was that public support for restructuring would not be forthcoming in the absence of regulatory measures to control power plant emissions. 25 refs.

  9. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  10. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  11. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  12. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function...... for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding...... strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function....
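
    For orientation, the snippet below implements the classical Blahut-Arimoto iteration for the ordinary rate-distortion function, a much simpler setting than the action-dependent problem of the paper (no actions or side information). The source distribution, distortion measure and slope parameter are illustrative.

      # Classical Blahut-Arimoto iteration for one point on the rate-distortion curve.
      import numpy as np

      def blahut_arimoto_rd(p, d, beta, n_iter=500):
          q_hat = np.full(d.shape[1], 1.0 / d.shape[1])      # output marginal q(x_hat)
          for _ in range(n_iter):
              w = q_hat * np.exp(-beta * d)                  # unnormalized q(x_hat | x)
              q_cond = w / w.sum(axis=1, keepdims=True)
              q_hat = p @ q_cond                             # update the output marginal
          D = np.sum(p[:, None] * q_cond * d)
          R = np.sum(p[:, None] * q_cond * np.log2(q_cond / q_hat[None, :]))
          return R, D

      p = np.array([0.5, 0.5])                  # binary symmetric source
      d = 1.0 - np.eye(2)                       # Hamming distortion
      print(blahut_arimoto_rd(p, d, beta=3.0))  # one (rate, distortion) point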

  13. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  14. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association...... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy, at Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project from November 2016 to May 2017...

  15. A study on the application of CRUDTRAN code in primary systems of domestic pressurized heavy-water reactors for prediction of radiation source term

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jong Soon; Cho, Hoon Jo; Jung, Min Young; Lee, Sang Heon [Dept. of Nuclear Engineering, Chosun University, Gwangju (Korea, Republic of)

    2017-04-15

    The importance of developing a source-term assessment technology has been emphasized owing to the decommissioning of Kori nuclear power plant (NPP) Unit 1 and the increasing number of deteriorated NPPs. We analyzed the behavioral mechanism of corrosion products in the primary system of a pressurized heavy-water reactor-type NPP. In addition, to check the possibility of applying the CRUDTRAN code to a Canadian Deuterium Uranium Reactor (CANDU)-type NPP, its applicability was assessed using collected domestic on-site data. With the assessment results, it was possible to predict trends according to operating cycles. Values estimated using the code were similar to the measured values. The results of this study are expected to be used to manage the radiation exposures of operators in high-radiation areas and to predict decommissioning processes in the primary system.

  16. Fitness Promotion for Adolescent Girls: The Impact and Effectiveness of Promotional Material which Emphasizes the Slim Ideal.

    Science.gov (United States)

    Shaw, Susan M.; Kemeny, Lidia

    1989-01-01

    Looked at techniques for promoting fitness participation among adolescent girls, in particular those which emphasize the slim ideal. Relative effectiveness of posters using different models (slim, average, overweight) and different messages (slimness, activity, health) was tested using 627 female high school students. Found slim model to be most…

  17. Culture and National Well-Being: Should Societies Emphasize Freedom or Constraint?

    Directory of Open Access Journals (Sweden)

    Jesse R Harrington

    Full Text Available Throughout history and within numerous disciplines, there exists a perennial debate about how societies should best be organized. Should they emphasize individual freedom and autonomy or security and constraint? Contrary to proponents who tout the benefits of one over the other, we demonstrate across 32 nations that both freedom and constraint exhibit a curvilinear relationship with many indicators of societal well-being. Relative to moderate nations, very permissive and very constrained nations exhibit worse psychosocial outcomes (lower happiness, greater dysthymia, higher suicide rates), worse health outcomes (lower life expectancy, greater mortality rates from cardiovascular disease and diabetes) and poorer economic and political outcomes (lower gross domestic product per capita, greater risk for political instability). This supports the notion that a balance between freedom and constraint results in the best national outcomes. Accordingly, it is time to shift the debate away from either constraint or freedom and focus on both in moderation.

  18. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)

  19. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been much-used research tools for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general PN scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k-eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  20. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  1. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  2. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  3. Evaluation of some essential and trace elements in diets from 3 nurseries from Juiz de Fora, M.G., Brazil, by neutron activation analysis

    International Nuclear Information System (INIS)

    Favaro, D.I.T.; Maihara, V.A.; Zangrande, K.C.; Rodrigues, M.I.; Vasconcellos, M.B.A.; Chicourel, E.L.; Barra, L.G.; Cozzolino, S.M.F.

    2001-01-01

    A study was made of the diets offered to a group of pre-school children whose mean age was 67 months and who remained the whole day in three day-care centers in Juiz de Fora, M.G., Brazil. For sampling, the duplicate portion technique was used, and the diets were collected and analyzed separately each day in the 3 nurseries. Instrumental neutron activation analysis was applied to the determination of 16 elements. The daily dietary intake values were compared to the RDA (children 4-6 years old). Based on this reference, Ca, Fe, Se and Zn were found to be deficient, Mg and Mn were comparable to the RDA, and the Cl and Na concentrations were higher than their RDA. For the other elements measured, there are no RDAs for children. (author)

  4. Codes maintained by the LAACG [Los Alamos Accelerator Code Group] at the NMFECC

    International Nuclear Information System (INIS)

    Wallace, R.; Barts, T.

    1990-01-01

    The Los Alamos Accelerator Code Group (LAACG) maintains two groups of design codes at the National Magnetic Fusion Energy Computing Center (NMFECC). These codes, principally electromagnetic field solvers, are used for the analysis and design of electromagnetic components for accelerators, e.g., magnets, rf structures, pickups, etc. In this paper, the status and future of the installed codes will be discussed with emphasis on an experimental version of one set of codes, POISSON/SUPERFISH

  5. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
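
    A toy sketch (not the unit-memory convolutional/Reed-Solomon pair studied above) can make the concatenation pipeline concrete. Here a Hamming(7,4) code stands in for the outer code and a rate-1/3 repetition code for the inner code, so the outer decoder cleans up whatever residual errors the inner majority-vote decoder leaves behind:

```python
import random

# Toy concatenated-coding pipeline: Hamming(7,4) outer code wrapped around a
# rate-1/3 repetition inner code (stand-ins for the RS / convolutional pair).

def hamming_encode(d):
    # Codeword layout: p1 p2 d1 p3 d2 d3 d4
    d1, d2, d3, d4 = d
    return [d1 ^ d2 ^ d4, d1 ^ d3 ^ d4, d1, d2 ^ d3 ^ d4, d2, d3, d4]

def hamming_decode(r):
    p1, p2, d1, p3, d2, d3, d4 = r
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of a single flipped bit, 0 = clean
    if syndrome:
        r = r[:]
        r[syndrome - 1] ^= 1             # correct one residual error
    return [r[2], r[4], r[5], r[6]]

def repeat_encode(bits, n=3):
    return [b for b in bits for _ in range(n)]

def repeat_decode(bits, n=3):
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

def channel(bits, p):
    return [b ^ (random.random() < p) for b in bits]

random.seed(0)
data = [1, 0, 1, 1]
tx = repeat_encode(hamming_encode(data))   # outer encode, then inner encode
rx = channel(tx, p=0.08)                   # noisy channel
inner_out = repeat_decode(rx)              # inner decoder may leave residual errors...
decoded = hamming_decode(inner_out)        # ...which the outer decoder mops up
print("sent", data, "decoded", decoded)
```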

  6. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fueling process, while TPSMBI (transport of supersonic molecule beam injection) is a recently developed 1D fluid code for SMBI. In order to find a method to increase SMBI fueling efficiency in H-mode plasma, especially for ITER, it is important to first verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully carried out for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of the BOUT++ code and the TPSMBI code agree very well with each other. Different upwind schemes have been compared for handling the sharp-gradient front region during the inward propagation of SMBI, for the sake of code stability. The influence of the WENO3 (third-order weighted essentially non-oscillatory) and third-order upwind schemes on the benchmark results is also discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and third-order upwind schemes on the benchmark results is also discussed.

  7. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  8. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  9. Innovations and Changes of Procedure Civil Code, and Maintenance of the Subjectivism Term "Resources Insufficient" for Granting of Justice of Gratuity

    Directory of Open Access Journals (Sweden)

    Juliane Dziubate Krefta

    2016-10-01

    Full Text Available This work focuses on the analysis of the Gratuity of Justice as an important mechanism of access to justice. It aims to address what is new about the Gratuity of Justice in the current Civil Procedure Code, recently amended by Law nº. 13,105/2015, highlighting important changes that remedied the obsolete Law nº. 1,060/50. It also aims to reflect on the requirement for granting the benefit of free justice, namely the lack of resources, and its high degree of subjectivity, emphasizing some positive and negative issues related to it.

  10. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions), and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from

  11. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  12. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas Current Procedural Terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  13. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  14. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit Mac, Linux and Windows.

  15. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Among the earliest discovered codes that approach the Shannon limit of the channel were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
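
    As a minimal illustration of the role the parity check matrix plays (a made-up toy matrix, far smaller and denser than a real LDPC code), the sketch below runs the syndrome test H·c = 0 (mod 2) and one simple bit-flipping correction:

```python
import numpy as np

# Toy parity-check matrix H; each row is one parity check.
# A word c is a codeword iff H @ c = 0 (mod 2).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def syndrome(c):
    return H @ c % 2

def bit_flip_decode(r, max_iter=10):
    """Simple bit-flipping decoder: flip the bit involved in the most failed checks."""
    r = r.copy()
    for _ in range(max_iter):
        s = syndrome(r)
        if not s.any():
            break                       # all parity checks satisfied
        failures = H.T @ s              # failed-check count per bit
        r[np.argmax(failures)] ^= 1
    return r

codeword = np.array([1, 0, 1, 1, 1, 0])  # satisfies all three checks
received = codeword.copy()
received[2] ^= 1                         # single bit error on the channel
print(syndrome(received), bit_flip_decode(received))
```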

  16. Different nonideality relationships, different databases and their effects on modeling precipitation from concentrated solutions using numerical speciation codes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.F.; Ebinger, M.H.

    1996-08-01

    Four simple precipitation problems are solved to examine the use of numerical equilibrium codes. The study emphasizes concentrated solutions, assumes both ideal and nonideal solutions, and employs different databases and different activity-coefficient relationships. The study uses the EQ3/6 numerical speciation codes. The results show satisfactory material balances and agreement between solubility products calculated from free-energy relationships and those calculated from concentrations and activity coefficients. Precipitates show slightly higher solubilities when the solutions are regarded as nonideal than when considered ideal, agreeing with theory. When a substance may precipitate from a solution dilute in the precipitating substance, a code may or may not predict precipitation, depending on the database or activity-coefficient relationship used. In a problem involving a two-component precipitation, there are only small differences in the precipitate mass and composition between the ideal and nonideal solution calculations. Analysis of this result indicates that this may be a frequent occurrence. An analytical approach is derived for judging whether this phenomenon will occur in any real or postulated precipitation situation. The discussion looks at applications of this approach. In the solutes remaining after the precipitations, there seems to be little consistency in the calculated concentrations and activity coefficients. They do not appear to depend in any coherent manner on the database or activity-coefficient relationship used. These results reinforce warnings in the literature about perfunctory or mechanical use of numerical speciation codes.
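
    To make the ideal-versus-nonideal comparison concrete, a hedged back-of-the-envelope sketch (not an EQ3/6 calculation; the gypsum Ksp and Davies-equation coefficients are assumed textbook values) shows how activity coefficients lower the ion activity product and hence raise the predicted solubility:

```python
from math import log10, sqrt

# Illustration only: ion activity product of gypsum (CaSO4.2H2O) with and
# without Davies-equation activity coefficients.  All constants are assumed
# textbook values, not outputs of a speciation code.
LOG_KSP = -4.58          # gypsum at 25 C (assumed)
A = 0.509                # Davies constant at 25 C (assumed)

def davies_log_gamma(z, ionic_strength):
    s = sqrt(ionic_strength)
    return -A * z**2 * (s / (1 + s) - 0.3 * ionic_strength)

m = 0.015                                    # molality of dissolved Ca2+ = SO4^2-
I = 0.5 * (m * 2**2 + m * 2**2)              # ionic strength of the 2:2 electrolyte
log_gamma = davies_log_gamma(2, I)

log_iap_ideal = 2 * log10(m)                 # activities taken equal to molalities
log_iap_real = 2 * (log10(m) + log_gamma)    # activities = gamma * molality

print(f"ideal SI  = {log_iap_ideal - LOG_KSP:+.2f}")
print(f"Davies SI = {log_iap_real - LOG_KSP:+.2f}")   # lower: more salt dissolves before saturation
```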

  17. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while remaining oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains
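
    A minimal sketch of the general idea, assuming a simple XOR-parity group code rather than whatever coding scheme the paper actually uses, shows how a shim below the transport layer can repair a single packet loss per group before TCP ever notices:

```python
from functools import reduce
from typing import List, Optional

# Assumed toy design: every k data packets are followed by one XOR "repair"
# packet, so any single loss within the group can be rebuilt below the
# transport layer without triggering TCP's loss recovery.

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode_group(packets: List[bytes]) -> List[bytes]:
    repair = reduce(xor, packets)
    return packets + [repair]

def decode_group(received: List[Optional[bytes]]) -> List[bytes]:
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) > 1:
        raise ValueError("more than one loss per group: cannot repair")
    if missing:
        received[missing[0]] = reduce(xor, (p for p in received if p is not None))
    return received[:-1]                 # drop the repair packet

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
coded = encode_group(group)
coded[1] = None                          # packet 1 lost on the wireless hop
print(decode_group(coded))               # original four packets recovered
```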

  18. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants were built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME code which reflect Subsection NF are identified and compared with the later Code editions and addenda, especially up to and including the 1974 ASME code used as the basis for the plant qualification. The concern of the regulatory agencies is that adopting later code allowables and provisions could reduce the safety margins of the construction code. Areas of concern are highlighted and the specific changes of later codes are discerned, adoption of which would not sacrifice the intended safety margins of the codes to which plants are licensed.

  19. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  20. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  1. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  2. Features of RAPTA-SFD code modelling of chemical interactions of basic materials of the WWER active zone in accident conditions with severe fuel damage

    International Nuclear Information System (INIS)

    Bibilashvili, Yu.K.; Sokolov, N.B.; Salatov, A.V.; Nechaeva, O.A.; Andreyeva-Andrievskaya, L.N.; Vlasov, F.Yu.

    1996-01-01

    A brief description is given of the RAPTA-SFD code, intended for computer simulations of WWER-type fuel elements (simulator or absorber elements) in accident conditions with severe fuel damage. Models of the chemical interactions of the basic materials of the active zone are presented, and special features of their application in the CORA-W2 experiment, carried out within the framework of International Standard Problem ISP-36, are emphasized. The results obtained confirm the expediency of applying phenomenological models. (author). 6 refs, 7 figs, 1 tab

  3. Rehabilitation After Hamstring-Strain Injury Emphasizing Eccentric Strengthening at Long Muscle Lengths: Results of Long-Term Follow-Up.

    Science.gov (United States)

    Tyler, Timothy F; Schmitt, Brandon M; Nicholas, Stephen J; McHugh, Malachy P

    2017-04-01

    Hamstring-strain injuries have a high recurrence rate. To determine if a protocol emphasizing eccentric strength training with the hamstrings in a lengthened position resulted in a low recurrence rate. Longitudinal cohort study. Sports-medicine physical therapy clinic. Fifty athletes with hamstring-strain injury (age 36 ± 16 y; 30 men, 20 women; 3 G1, 43 G2, 4 G3; 25 recurrent injuries) followed a 3-phase rehabilitation protocol emphasizing eccentric strengthening with the hamstrings in a lengthened position. Injury recurrence; isometric hamstring strength at 80°, 60°, 40°, and 20° knee flexion in sitting with the thigh flexed to 40° above the horizontal and the seat back at 90° to the horizontal (strength tested before return to sport). Four of the 50 athletes sustained reinjuries between 3 and 12 mo after return to sport (8% recurrence rate). The other 42 athletes had not sustained a reinjury at an average of 24 ± 12 mo after return to sport. Eight noncompliant athletes did not complete the rehabilitation and returned to sport before initiating eccentric strengthening in the lengthened state. All 4 reinjuries occurred in these noncompliant athletes. At time of return to sport, compliant athletes had full restoration of strength while noncompliant athletes had significant hamstring weakness, which was progressively worse at longer muscle lengths (compliance × side × angle P = .006; involved vs noninvolved at 20°, compliant 7% stronger, noncompliant 43% weaker). Compliance with rehabilitation emphasizing eccentric strengthening with the hamstrings in a lengthened position resulted in no reinjuries.

  4. Coding and decoding for code division multiple user communication systems

    Science.gov (United States)

    Healy, T. J.

    1985-01-01

    A new algorithm is introduced which decodes code division multiple user communication signals. The algorithm makes use of the distinctive form or pattern of each signal to separate it from the composite signal created by the multiple users. Although the algorithm is presented in terms of frequency-hopped signals, the actual transmitter modulator can use any of the existing digital modulation techniques. The algorithm is applicable to error-free codes or to codes where controlled interference is permitted. It can be used when block synchronization is assumed, and in some cases when it is not. The paper also discusses briefly some of the codes which can be used in connection with the algorithm, and relates the algorithm to past studies which use other approaches to the same problem.
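
    For a concrete, hedged picture of separating users by their distinctive signal patterns, the sketch below uses a textbook direct-sequence correlation receiver with orthogonal spreading codes; the paper's own algorithm operates on frequency-hopped signals, so this is only an analogy:

```python
import numpy as np

# Textbook direct-sequence CDMA correlation receiver, shown only to make
# "separating users by their distinctive signal pattern" concrete; it is not
# the frequency-hopped algorithm described in the abstract.

codes = np.array([[1,  1,  1,  1],        # user 0 spreading pattern
                  [1, -1,  1, -1],        # user 1 (orthogonal to user 0)
                  [1,  1, -1, -1]])       # user 2

bits = np.array([[+1, -1, +1],            # user 0's data (+1/-1 per bit)
                 [-1, -1, +1],
                 [+1, +1, -1]])

# Each user spreads its bits with its code; the channel sums all users.
composite = sum(np.kron(bits[u], codes[u]) for u in range(3))

# Receiver: correlate the composite signal with each user's code per bit period.
chips = composite.reshape(-1, codes.shape[1])   # one row per bit period
recovered = np.sign(chips @ codes.T).T          # correlate, then slice per user
print(np.array_equal(recovered, bits))          # True
```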

  5. The identification of candidate rice genes that confer resistance to the brown planthopper (Nilaparvata lugens) through representational difference analysis.

    Science.gov (United States)

    Park, Dong-Soo; Lee, Sang-Kyu; Lee, Jong-Hee; Song, Min-Young; Song, Song-Yi; Kwak, Do-Yeon; Yeo, Un-Sang; Jeon, Nam-Soo; Park, Soo-Kwon; Yi, Gihwan; Song, You-Chun; Nam, Min-Hee; Ku, Yeon-Chung; Jeon, Jong-Seong

    2007-08-01

    The development of rice varieties (Oryza sativa L.) that are resistant to the brown planthopper (BPH; Nilaparvata lugens Stål) is an important objective in current breeding programs. In this study, we generated 132 BC(5)F(5) near-isogenic rice lines (NILs) by five backcrosses of Samgangbyeo, a BPH resistant indica variety carrying the Bph1 locus, with Nagdongbyeo, a BPH susceptible japonica variety. To identify genes that confer BPH resistance, we employed representational difference analysis (RDA) to detect transcripts that were exclusively expressed in one of our BPH resistant NIL, SNBC61, during insect feeding. The chromosomal mapping of the RDA clones that we subsequently isolated revealed that they are located in close proximity either to known quantitative trait loci or to an introgressed SSR marker from the BPH resistant donor parent Samgangbyeo. Genomic DNA gel-blot analysis further revealed that loci of all RDA clones in SNBC61 correspond to the alleles of Samgangbyeo. Most of the RDA clones were found to be exclusively expressed in SNBC61 and could be assigned to functional groups involved in plant defense. These RDA clones therefore represent candidate defense genes for BPH resistance.

  6. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Full Text Available Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes are determined.

  7. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Full Text Available Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated has not been well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.

  8. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four

  9. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
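
    The parameters of a small linear code can be checked by brute force; the sketch below uses the well-known $[4,2,3]_3$ tetracode rather than one of the paper's new codes (whose generator matrices are not reproduced here), enumerating all codewords and reporting the minimum Hamming distance:

```python
import itertools
import numpy as np

# Brute-force check of [n, k, d]_q parameters for a small ternary code:
# the tetracode, with generator matrix G over GF(3).
G = np.array([[1, 0, 1, 1],
              [0, 1, 1, 2]])
q, k, n = 3, *G.shape

def min_distance(G, q):
    """Minimum Hamming weight over all nonzero codewords m @ G (mod q)."""
    best = G.shape[1]
    for m in itertools.product(range(q), repeat=G.shape[0]):
        if any(m):
            word = np.mod(np.array(m) @ G, q)
            best = min(best, int(np.count_nonzero(word)))
    return best

print(f"[{n},{k},{min_distance(G, q)}]_{q} code")   # -> [4,2,3]_3 code
```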

  10. CITOPP, CITMOD, CITWI, Processing codes for CITATION Code

    International Nuclear Information System (INIS)

    Albarhoum, M.

    2008-01-01

    Description of program or function: CITOPP processes the output file of the CITATION 3-D diffusion code. The program can plot axial, radial and circumferential flux distributions (in cylindrical geometry) in addition to the multiplication factor convergence. The flux distributions can be drawn for each group specified by the program and visualized on the screen. CITMOD processes both the output and the input files of the CITATION 3-D diffusion code. CITMOD can visualize both axial, and radial-angular models of the reactor described by CITATION input/output files. CITWI processes the input file (CIT.INP) of CITATION 3-D diffusion code. CIT.INP is processed to deduce the dimensions of the cell whose cross sections can be representative of the homonym reactor component in section 008 of CIT.INP

  11. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  12. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  13. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most widely recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  14. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of QR Code and the benefits of using QR Code for companies. The first objective of this research work is to study the general information of QR Code in order to guide people to understand the QR Code in detail. The second objective of this research work is to explore and analyze the essential and feasible technologies of QR Code for the sake of clarifying the technologies of QR Code. Additionally, this research work through QR Code best practices t...

  15. Coding training for medical students: How good is diagnoses coding with ICD-10 by novices?

    Directory of Open Access Journals (Sweden)

    Stausberg, Jürgen

    2005-04-01

    Full Text Available Teaching of knowledge and competence in documentation and coding is an essential part of medical education. Therefore, coding training has been placed within the course of epidemiology, medical biometry, and medical informatics. From this, we can draw conclusions about the quality of coding by novices. One hundred and eighteen students coded diagnoses from 15 nephrological cases in homework. In addition to interrater reliability, validity was calculated by comparison with a reference coding. On the level of terminal codes, 59.3% of the students' results were correct. The completeness was calculated as 58.0%. The results on the chapter level increased up to 91.5% and 87.7% respectively. For the calculation of reliability a new, simple measure was developed that leads to values of 0.46 on the level of terminal codes and 0.87 on the chapter level for interrater reliability. The figures of concordance with the reference coding are quite similar. In contrast, routine data show considerably lower results with 0.34 and 0.63 respectively. Interrater reliability and validity of coding by novices is as good as coding by experts. The missing advantage of experts could be explained by the workload of documentation and a negative attitude to coding on the one hand. On the other hand, coding in a DRG-system is handicapped by a large number of detailed coding rules, which do not end in uniform results but rather lead to wrong and random codes. Anyway, students left the course well prepared for coding.

  16. Turbo coding, turbo equalisation and space-time coding for transmission over fading channels

    CERN Document Server

    Hanzo, L; Yeap, B

    2002-01-01

    Against the backdrop of the emerging 3G wireless personal communications standards and broadband access network standard proposals, this volume covers a range of coding and transmission aspects for transmission over fading wireless channels. It presents the most important classic channel coding issues and also the exciting advances of the last decade, such as turbo coding, turbo equalisation and space-time coding. It endeavours to be the first book with explicit emphasis on channel coding for transmission over wireless channels. Divided into 4 parts: Part 1 - explains the necessary background for novices. It aims to be both an easy reading text book and a deep research monograph. Part 2 - provides detailed coverage of turbo conventional and turbo block coding considering the known decoding algorithms and their performance over Gaussian as well as narrowband and wideband fading channels. Part 3 - comprehensively discusses both space-time block and space-time trellis coding for the first time in literature. Par...

  17. Decoding Xing-Ling codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2002-01-01

    This paper describes an efficient decoding method for a recent construction of good linear codes as well as an extension to the construction. Furthermore, asymptotic properties and list decoding of the codes are discussed.

  18. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  19. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  20. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  1. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  2. MARS code manual volume I: code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Yoon, Churl

    2010-02-01

    Korea Advanced Energy Research Institute (KAERI) conceived and started the development of MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF codes. The method of integration of the two codes is based on the dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for the light water was unified by replacing the EOS of COBRA-TF by that of the RELAP5. This theory manual provides a complete list of overall information of code structure and major function of MARS including code architecture, hydrodynamic model, heat structure, trip / control system and point reactor kinetics model. Therefore, this report would be very useful for the code users. The overall structure of the manual is modeled on the structure of the RELAP5 and as such the layout of the manual is very similar to that of the RELAP. This similitude to RELAP5 input is intentional as this input scheme will allow minimum modification between the inputs of RELAP5 and MARS3.1. MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  3. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  4. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  5. The Effects of Different Teaching, Research, and Service Emphases on Individual and Organizational Outcomes in Higher Education Institutions

    Science.gov (United States)

    Terpstra, David E.; Honoree, Andre L.

    2009-01-01

    The authors investigated the relative emphasis that educators give to teaching, research, and service in the business discipline and 4 other academic disciplines. The authors also investigated the effects of different faculty activity emphases on faculty teaching effectiveness, research performance, service levels, job and pay satisfaction,…

  6. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base, consisting of thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously while others solve the equations separately from each other. The coupled codes require a large computer capacity and have thus as yet had limited use. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)

  7. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  8. Binding by asynchrony: the neuronal phase code

    Directory of Open Access Journals (Sweden)

    Zoltan Nadasdy

    2010-09-01

    Full Text Available Neurons display continuous subthreshold oscillations and discrete action potentials. When action potentials are phase-locked to the subthreshold oscillation, we hypothesize they represent two types of information: the presence/absence of a sensory feature and the phase of subthreshold oscillation. If subthreshold oscillation phases are neuron-specific, then the sources of action potentials can be recovered based on the action potential times. If the spatial information about the stimulus is converted to action potential phases, then action potentials from multiple neurons can be combined into a single axon and the spatial configuration reconstructed elsewhere. For the reconstruction to be successful, we introduce two assumptions: that a subthreshold oscillation field has a constant phase gradient and that coincidences between action potentials and intracellular subthreshold oscillations are neuron-specific as defined by the "interference principle." Under these assumptions, a phase coding model enables information transfer between structures and reproduces experimental phenomena such as phase precession, grid cell architecture, and phase modulation of cortical spikes. This article reviews a recently proposed neuronal algorithm for information encoding and decoding from the phase of action potentials (Nadasdy 2009). The focus is given to the principles common across different systems instead of emphasizing system specific differences.
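
    A toy simulation of this premise, with assumed parameters (an 8 Hz oscillation, two neurons half a cycle apart) rather than the model of the cited article, shows how spike sources can be recovered from spike times alone when oscillation phases are neuron-specific:

```python
import numpy as np

# Assumed toy setup: two neurons share an 8 Hz subthreshold oscillation but
# have neuron-specific phase offsets, and each fires phase-locked to the peak
# of its own oscillation.  The source of every spike in the merged train is
# then recoverable from the spike time alone.

rng = np.random.default_rng(1)
f = 8.0                                      # oscillation frequency (Hz)
offsets = np.array([0.0, np.pi])             # neuron-specific oscillation phases

def spike_times(offset, n=20, jitter=0.002):
    """Spikes locked to the peak (phase 0) of cos(2*pi*f*t + offset)."""
    cycles = np.arange(1, n + 1)
    return (cycles - offset / (2 * np.pi)) / f + rng.normal(0, jitter, n)

trains = [spike_times(o) for o in offsets]
merged = np.sort(np.concatenate(trains))

# Decode: assign each spike to the neuron whose oscillation is closest to its
# peak (phase 0) at the spike time.
spike_phase = (2 * np.pi * f * merged) % (2 * np.pi)
dist = np.abs(np.angle(np.exp(1j * (spike_phase[:, None] + offsets[None, :]))))
source = dist.argmin(axis=1)                 # neuron assigned to each spike
print(np.bincount(source))                   # 20 spikes recovered per neuron
```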

  9. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  11. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  12. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
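
    The core combine step of such a generalized least-squares evaluation can be sketched in a few lines; the formulas below are the standard GLS update with prior and measurement covariances, not FERRET's actual implementation:

```python
import numpy as np

# Standard generalized least-squares adjustment: combine a prior calculation
# (x0, covariance P) with measurements (y, covariance V) related by y ~ A x.

def gls_update(x0, P, y, V, A):
    S = A @ P @ A.T + V                  # covariance of the predicted measurements
    K = P @ A.T @ np.linalg.inv(S)       # gain
    x = x0 + K @ (y - A @ x0)            # adjusted parameters
    Pnew = P - K @ A @ P                 # adjusted (reduced) covariance
    return x, Pnew

# Two correlated parameters with ~10% prior uncertainties, plus two measurements.
x0 = np.array([1.0, 2.0])
P = np.array([[0.01, 0.004],
              [0.004, 0.04]])
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])               # second measurement sees the sum
y = np.array([1.05, 3.20])
V = np.diag([0.02**2, 0.05**2])

x, Pnew = gls_update(x0, P, y, V, A)
print(x, np.sqrt(np.diag(Pnew)))         # adjusted values and reduced uncertainties
```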

  13. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens possibility to convey data in a unique way yet insufficient prevention and protection might lead into QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which followed by a comprehensive research on both QR code itself and related issues. From the research a solution taking advantages of cloud and cryptography together with an implementation come af...

  14. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  15. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
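
    For a flavour of how FORM results map to failure probabilities, a minimal sketch (illustrative numbers, not the JCSS CodeCal procedure) computes the reliability index and annual failure probability for a linear limit state with normal resistance and load:

```python
from math import sqrt
from statistics import NormalDist

# Illustration only: for a linear limit state g = R - S with independent
# normal resistance R and load S, the FORM reliability index has a closed form.

def reliability_index(mu_R, sig_R, mu_S, sig_S):
    return (mu_R - mu_S) / sqrt(sig_R**2 + sig_S**2)

beta = reliability_index(mu_R=420.0, sig_R=35.0, mu_S=250.0, sig_S=30.0)
pf = NormalDist().cdf(-beta)              # failure probability P_f = Phi(-beta)
print(f"beta = {beta:.2f}, P_f = {pf:.2e}")
```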

  16. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)

  17. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    Science.gov (United States)

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  18. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2F, SMART and SQUALE and is used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction by China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computation code of the SCIENCE code package, including the description of the general structure of the code package, the coupling relationship of the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  19. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  20. CODES AND PRACTICES OF IMPLEMENTATION OF CORPORATE GOVERNANCE IN ROMANIA AND RESULTS REPORTING

    Directory of Open Access Journals (Sweden)

    GROSU MARIA

    2011-12-01

    Full Text Available Corporate governance refers to the manner in which companies are directed and controlled. Business management has always been guided by certain principles, but the current meaning of corporate governance also concerns the contribution that companies must make to the overall development of modern society. Romania was quite late in adopting a code of good practice in corporate governance, driven, in particular, by the privatization process, but also by the transfer of control and surveillance from political organizations to the Board of Directors (BD). Adoption of codes of corporate governance is necessary to harmonize internal business requirements with those of a functioning market economy. In addition, for the CEE countries, the European Commission adopted an action plan announcing measures to modernize company law and enhance corporate governance. Romania is taking steps in this direction by amending the Company Law and other regulations, although practice does not necessarily keep pace with the requirements. This study aims, on the one hand, at an analysis of the evolution of corporate governance codes adopted in Romania and, on the other, at empirical research on the implementation of corporate governance principles in a representative sample of companies listed on the Bucharest Stock Exchange (BSE). We consider the research methodology relevant, because the issuer of the codes of CG in Romania is the BSE, which requests their voluntary implementation by listed companies. Implementation results are summarized and interpreted on the basis of the public reports of the companies studied. Most studies undertaken in this direction have been made on multinational companies, which respect the corporate governance codes of their countries of origin. In addition, many studies emphasize the fair treatment of stakeholders rather than the models of governance adopted (monist/dualist), with implications for optimizing economic as well as social objectives. The research undertaken attempts to highlight on the one

  1. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  2. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  3. Validation of the ASSERT subchannel code: Prediction of critical heat flux in standard and nonstandard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1995-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of Canada deuterium uranium (CANDU) pressurized heavy water reactor fuel channels and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental database. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. The numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology are discussed. The evolutionary validation plan is also discussed and early validation exercises are summarized. More recent validation exercises in standard and nonstandard geometries are emphasized

  4. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  5. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  6. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  7. Beam-dynamics codes used at DARHT

    Energy Technology Data Exchange (ETDEWEB)

    Ekdahl, Jr., Carl August [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production – the Tricomp Trak orbit tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration – the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; for coasting-beam transport to target – the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  8. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the
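
    For reference, the generalized Singleton bound mentioned here is usually stated as below for an (n, k, δ) convolutional code of rate k/n and degree δ; this is the standard formulation (MDS convolutional codes attain it with equality), not a quotation from the paper.

```latex
% Generalized Singleton bound for an (n, k, \delta) convolutional code;
% MDS convolutional codes meet it with equality.
\[
  d_{\mathrm{free}} \;\le\; (n-k)\left(\left\lfloor \tfrac{\delta}{k} \right\rfloor + 1\right) + \delta + 1 .
\]
```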

  9. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    Energy Technology Data Exchange (ETDEWEB)

    Camous, F.; Jacq, F.; Chatelard, P. [IPSN/DRS/SEMAR CE-Cadarache, St Paul Lez Durance (France)] [and others

    1997-07-01

    In order to describe with the same code the whole sequence of severe LWR accidents, up to the vessel failure, the Institute of Protection and Nuclear Safety has performed a coupling of the severe accident code ICARE2 to the thermalhydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all the phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  10. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables

  11. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
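
    As a small companion to the Noiseless Coding Theorem material described above, the sketch below computes the entropy of a toy source and the expected length of a binary Huffman code, which must lie between H(X) and H(X) + 1; the probabilities are made up for the example.

```python
# Illustration of Shannon's noiseless coding theorem: the entropy H(X)
# lower-bounds the expected length of any uniquely decodable code, and an
# optimal (Huffman) code achieves an expected length below H(X) + 1.
import heapq, math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Return codeword lengths of a binary Huffman code for probabilities p."""
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    lengths = [0] * len(p)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)   # merge the two least likely groups
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]       # toy source
L = huffman_lengths(probs)
avg = sum(pi * li for pi, li in zip(probs, L))
print(f"H = {entropy(probs):.3f} bits, Huffman average length = {avg:.3f}")
```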

  12. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement in the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code. In parallel, it offers easy generation of the parity check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear [8, 2] code was used as the basis for the modification. The proposed algorithm was implemented and its parameters were evaluated in practice on test images. Keywords: steganography, random linear codes, RLC, LSB
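
    The following is a hedged sketch of the syndrome-coding idea behind such an LSB modification: an (n−k)-bit message is embedded into n cover LSBs by flipping as few bits as possible so that the parity-check syndrome equals the message. The systematic random parity-check matrix and the brute-force search are illustrative choices, not the paper's implementation.

```python
# Sketch of syndrome (matrix-embedding) coding with a random linear [8, 2]
# code: embed 6 message bits into 8 cover LSBs with minimal flips.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 2
# Systematic form H = [I | A] keeps H full rank so every message is reachable
# (an assumption made for this illustration).
H = np.hstack([np.eye(n - k, dtype=int), rng.integers(0, 2, size=(n - k, k))])

def syndrome(H, x):
    return H @ x % 2

def embed(H, cover_lsbs, message):
    """Return modified LSBs whose syndrome equals the message, flipping as
    few positions as possible (exhaustive search; fine for small n)."""
    target = (message - syndrome(H, cover_lsbs)) % 2
    for w in range(n + 1):
        for pos in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(pos)] = 1
            if np.array_equal(syndrome(H, e), target):
                return (cover_lsbs + e) % 2
    raise ValueError("message not reachable with this H")

x = rng.integers(0, 2, size=n)           # LSBs of 8 cover pixels
m = rng.integers(0, 2, size=n - k)       # 6 message bits
y = embed(H, x, m)
assert np.array_equal(syndrome(H, y), m)
print("flipped bits:", int(np.sum((x + y) % 2)))
```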

  13. Status of reactor core design code system in COSINE code package

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.; Yu, H.; Liu, Z., E-mail: yuhui@snptc.com.cn [State Nuclear Power Software Development Center, SNPTC, National Energy Key Laboratory of Nuclear Power Software (NEKLS), Beijiing (China)

    2014-07-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  14. Status of reactor core design code system in COSINE code package

    International Nuclear Information System (INIS)

    Chen, Y.; Yu, H.; Liu, Z.

    2014-01-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  15. Further Generalisations of Twisted Gabidulin Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde, Johan Sebastian Heesemann; Sheekey, John

    2017-01-01

    We present a new family of maximum rank distance (MRD) codes. The new class contains codes that are neither equivalent to a generalised Gabidulin nor to a twisted Gabidulin code, the only two known general constructions of linear MRD codes.
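
    For context, the MRD property refers to attaining the rank-metric analogue of the Singleton bound with equality; a standard statement of that bound (not taken from this abstract) is:

```latex
% Singleton-like bound for rank-metric codes (Delsarte); a code is MRD
% when it attains this bound with equality.
\[
  |\mathcal{C}| \;\le\; q^{\max(m,n)\,\bigl(\min(m,n) - d + 1\bigr)},
  \qquad \mathcal{C} \subseteq \mathbb{F}_q^{\,m \times n},\;
  d = d_{\mathrm{rank}}(\mathcal{C}).
\]
```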

  16. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, which is a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work is required to validate the applicability of the thermal hydraulic models within the code. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, the calculated results used to validate the gap conductance model are demonstrated by comparison with the results of the MARS code for the test case

  17. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e. range of moderation, variation of pellet size and the fuel double heterogeneity effect. The APOLLO/P IC method developed to treat this latter effect permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown, based on three-group microscopic reaction rates. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism to account for 238 U resonance mutual self-shielding in the pellet-fissile liquor interaction. The benchmark exercise has resolved a potentially dangerous inadequacy in dissolver calculations. (author)

  18. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  19. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  20. Application of Quantum Gauss-Jordan Elimination Code to Quantum Secret Sharing Code

    Science.gov (United States)

    Diep, Do Ngoc; Giang, Do Hoang; Phu, Phan Huy

    2018-03-01

    The QSS codes associated with a MSP code are based on finding an invertible matrix V, solving the system vATMB (s a)=s. We propose a quantum Gauss-Jordan Elimination Procedure to produce such a pivotal matrix V by using the Grover search code. The complexity of solving is of square-root order of the cardinal number of the unauthorized set √ {2^{|B|}}.

  1. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  2. High protein diets do not attenuate decrements in testosterone and IGF-I during energy deficit.

    Science.gov (United States)

    Henning, Paul C; Margolis, Lee M; McClung, James P; Young, Andrew J; Pasiakos, Stefan M

    2014-05-01

    Energy deficit (ED) diminishes fat-free mass (FFM) with concomitant reductions in anabolic hormone secretion. A modest increase in protein to recommended dietary allowance (RDA) levels during ED minimally attenuates decrements in insulin-like growth factor-I (IGF-I). The impact of dietary protein above the RDA on circulating anabolic hormones and their relationships with FFM in response to ED are not well described. Thirty-three adults were assigned diets providing protein at 0.8 (RDA), 1.6 (2×-RDA), and 2.4 (3×-RDA) g/kg/d for 31 days. Testosterone, sex-hormone binding globulin (SHBG) and IGF-I system components were assessed after a 10-day period of weight maintenance (WM) and after a 21-day period of ED (40%) achieved by an increase in energy expenditure and decreased energy intake. Associations between the change in FFM and anabolic hormone levels were determined. As compared to WM and regardless of dietary protein intake, total and free testosterone, total IGF-I, and acid-labile subunit decreased (P < 0.05). Dietary protein intake did not affect the hormones or IGF-I system components measured. Changes in FFM in response to ED were negatively associated with acid-labile subunit (ALS) (r = -0.62, P < 0.05), but not with the other hormone concentrations. Published by Elsevier Inc.

  3. The Coding Causes of Death in HIV (CoDe) Project: initial results and evaluation of methodology

    DEFF Research Database (Denmark)

    Kowalska, Justyna D; Friis-Møller, Nina; Kirk, Ole

    2011-01-01

    The Coding Causes of Death in HIV (CoDe) Project aims to deliver a standardized method for coding the underlying cause of death in HIV-positive persons, suitable for clinical trials and epidemiologic studies.

  4. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
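
    A minimal sketch of the two codings discussed above, using a made-up genotype matrix; it shows the usual 0/1/2 coding and the centered coding in which each marker column is shifted to mean zero.

```python
# Minimal illustration of 0/1/2 allele coding versus centered allele coding.
import numpy as np

# rows = animals, columns = markers; entries count copies of the second allele
Z = np.array([[0, 1, 2],
              [1, 1, 0],
              [2, 0, 1],
              [1, 2, 1]], dtype=float)

Z_centered = Z - Z.mean(axis=0)          # centered allele coding

# Genomic relationship matrices built from the two codings differ ...
G_raw      = Z @ Z.T
G_centered = Z_centered @ Z_centered.T

# ... but contrasts between individuals' genomic breeding values computed from
# estimated marker effects are unaffected by the choice of coding.
print(G_raw)
print(G_centered)
```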

  5. Melanotic neuroectodermal tumor of infancy (progonoma): a case report emphasizing the computed tomography findings and literature review

    International Nuclear Information System (INIS)

    Araujo Junior, Cyrillo Rodrigues de; Carvalho, Tarcisio Nunes; Fraguas Filho, Sergio Roberto; Costa, Marlos Augusto Bitencourt; Borba, Ana Olivia Cardoso; Figueiredo, Sizenildo da Silva; Machado, Marcio Martins; Teixeira, Kim-Ir-Sen Santos

    2004-01-01

    The melanotic neuroectodermal tumor of infancy, also known as progonoma, is a rare benign disease of neural crest origin that occurs within the first year of life and affects mainly the maxilla. The authors report a case of a 10-month-old child presenting with this uncommon tumor in the maxilla, emphasizing the diagnostic findings on computed tomography, and present a literature review. (author)

  6. Towers of generalized divisible quantum codes

    Science.gov (United States)

    Haah, Jeongwan

    2018-04-01

    A divisible binary classical code is one in which every code word has weight divisible by a fixed integer. If the divisor is 2^ν for a positive integer ν, then one can construct a Calderbank-Shor-Steane (CSS) code, where the X-stabilizer space is the divisible classical code, that admits a transversal gate in the ν-th level of the Clifford hierarchy. We consider a generalization of the divisibility by allowing a coefficient vector of odd integers with which every code word has zero dot product modulo the divisor. In this generalized sense, we construct a CSS code with divisor 2^(ν+1) and code distance d from any CSS code of code distance d and divisor 2^ν where the transversal X is a nontrivial logical operator. The encoding rate of the new code is approximately d times smaller than that of the old code. In particular, for large d and ν ≥ 2, our construction yields a CSS code of parameters [[O(d^(ν-1)), Ω(d), d]] admitting a transversal gate at the ν-th level of the Clifford hierarchy. For our construction we introduce a conversion from magic state distillation protocols based on Clifford measurements to those based on codes with transversal T gates. Our tower contains, as a subclass, generalized triply even CSS codes that have appeared in so-called gauge fixing or code switching methods.
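
    Our paraphrase of the generalized divisibility condition described above, written out for concreteness (the notation is ours, not the paper's):

```latex
% Generalized divisibility: with a coefficient vector c of odd integers,
% every codeword w of the classical code C satisfies
\[
  \sum_{i} c_i\, w_i \;\equiv\; 0 \pmod{2^{\nu}},
  \qquad c_i \ \text{odd}, \quad w \in C .
\]
```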

  7. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. We also use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.
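
    A quick numerical illustration of the simplex property referenced above, assuming the standard construction in which the generator matrix of the length 2^k − 1 binary simplex code has all nonzero k-bit vectors as columns; every nonzero codeword then has weight 2^(k−1).

```python
# Check that the binary simplex code of length n = 2^k - 1 is constant weight.
import itertools
import numpy as np

k = 3
cols = [[(j >> b) & 1 for b in range(k)] for j in range(1, 2 ** k)]
G = np.array(cols).T                      # k x (2^k - 1) generator matrix

weights = set()
for msg in itertools.product([0, 1], repeat=k):
    if any(msg):
        codeword = np.array(msg) @ G % 2
        weights.add(int(codeword.sum()))

print("n =", G.shape[1], "nonzero codeword weights:", weights)   # {4} for k = 3
```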

  8. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the "loop"-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation.

  9. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    International Nuclear Information System (INIS)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the "loop"-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation

  10. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.
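
    To make the description more concrete, here is a schematic of the kind of unified objective the abstract describes; the symbols, neighborhood structure and trade-off weights are our own notation and only a sketch, not the authors' exact formulation.

```latex
% Schematic joint objective (our notation): dictionary D, sparse codes s_i,
% ranking scores f_i, neighborhoods N_i, local linear maps (w_i, b_i),
% trade-off weights lambda, mu, gamma; L_query encodes the user's query information.
\[
  \min_{D,\,\{s_i\},\,\{f_i\},\,\{w_i, b_i\}}
  \sum_i \Bigl( \|x_i - D s_i\|_2^2 + \lambda \|s_i\|_1 \Bigr)
  \;+\; \mu \sum_i \sum_{j \in N_i} \bigl( f_j - w_i^{\top} s_j - b_i \bigr)^2
  \;+\; \gamma\, \mathcal{L}_{\mathrm{query}}\bigl(\{f_i\}\bigr)
\]
```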

  11. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  12. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture, just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows to deviate the direction of an X-ray beam, which can considerably increase the implementation costs. Hence, this paper describes a low cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded apertures approximation decreases at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is up to 2.5 dB of PSNR less with respect to the phase coded aperture reconstructions.
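
    For reference, the quality metric used above (PSNR) can be computed as in the small helper below; the image array, peak value and noise level are illustrative assumptions.

```python
# PSNR helper: 10*log10(peak^2 / MSE), assuming arrays with a known peak value.
import numpy as np

def psnr(reference: np.ndarray, reconstruction: np.ndarray, peak: float = 1.0) -> float:
    """Peak Signal-to-Noise Ratio in dB."""
    mse = np.mean((reference.astype(float) - reconstruction.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Example with synthetic data: note that a 2.5 dB drop in PSNR, as reported
# for the block-unblock approximation, corresponds to roughly 1.8x higher MSE.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(psnr(img, img + rng.normal(0, 0.01, img.shape)))
```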

  13. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  14. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each state of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  15. Under-coding of secondary conditions in coded hospital health data: Impact of co-existing conditions, death status and number of codes in a record.

    Science.gov (United States)

    Peng, Mingkai; Southern, Danielle A; Williamson, Tyler; Quan, Hude

    2017-12-01

    This study examined the coding validity of hypertension, diabetes, obesity and depression in relation to the presence of their co-existing conditions, death status and the number of diagnosis codes in the hospital discharge abstract database. We randomly selected 4007 discharge abstract database records from four teaching hospitals in Alberta, Canada and reviewed their charts to extract 31 conditions listed in the Charlson and Elixhauser comorbidity indices. Conditions associated with the four study conditions were identified through multivariable logistic regression. Coding validity (i.e. sensitivity, positive predictive value) of the four conditions was related to the presence of their associated conditions. Sensitivity increased with an increasing number of diagnosis codes. The impact of death status on coding validity was minimal. Coding validity of conditions is closely related to their clinical importance and the complexity of patients' case mix. We recommend mandatory coding of certain secondary diagnoses to meet the needs of health research based on administrative health data.
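
    A minimal sketch of the two validity measures used in the study, computed against chart review as the reference standard; the counts in the example are made up.

```python
# Sensitivity and positive predictive value from chart review (reference
# standard) versus coded hospital data; the example counts are hypothetical.
def sensitivity_ppv(tp: int, fp: int, fn: int):
    """Sensitivity = TP/(TP+FN); positive predictive value = TP/(TP+FP)."""
    return tp / (tp + fn), tp / (tp + fp)

# e.g. a condition present in 1200 charts, coded for 900 of them (TP),
# coded without chart evidence for 50 (FP), and missed for 300 (FN)
sens, ppv = sensitivity_ppv(tp=900, fp=50, fn=300)
print(f"sensitivity = {sens:.2f}, PPV = {ppv:.2f}")
```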

  16. Governance codes: facts or fictions? a study of governance codes in colombia1,2

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects on accounting performance and financing decisions of Colombian firms after issuing a corporate governance code. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms' return on assets after the code introduction improves in excess of 1%; the effect is amplified by the code quality. Additionally, the firms' leverage increased, in excess of 5%, when the code quality was factored into the analysis. These results suggest that the controlling parties' commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties, through the code introduction, is indeed an effective measure and that the financial markets agree, increasing the supply of funds to the firms.

  17. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, needs standards. These standards are widely used and the methods for applying them are well established. Consequently, radiographic testing is only practical when it is based on regulations that are mentioned and documented. These regulations or guidelines are documented in codes, standards and specifications. In Malaysia, level-one and basic radiographers can carry out radiography work based on instructions given by level-two or level-three radiographers. These instructions are produced based on guidelines mentioned in the documents, and the level-two radiographer must follow the specifications mentioned in the standard when writing the instructions. This scenario makes it clear that radiography is a type of work in which everything must follow the rules. As for codes, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB) known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography work must follow the regulated rules and standards.

  18. Designing Skin Cancer Prevention Messages: Should We Emphasize Gains or Losses? Message Framing, Risk Type, and Prior Experience.

    Science.gov (United States)

    Lee, Moon J; Kang, Hannah

    2018-05-01

    To test whether message framing (ie, gain vs. loss) and risk type (ie, health vs appearance risk) in skin cancer prevention messages interact with one's prior experience. Two experiments with a 2 (message framing: gain vs loss) × 2 (risk type: health vs appearance risk) factorial design were conducted. The participants were given a URL to the experiment website via e-mail. On the first page of the website, the participants were told that they would be asked to evaluate a skin cancer print public service announcement (PSA): Online experiments. A total of 397 individuals participated (236 for experiment 1 and 161 for experiment 2). Apparatus: Four versions of the skin cancer print PSAs were developed. Four PSAs were identical except for the 2 manipulated components: message framing and risk type. Measures were adopted from Cho and Boster (message framing), Jones and Leary and Kiene et al. (risk type), De Vries, Mesters, van't Riet, Willems, and Reubsaet and Knight, Kirincich, Farmer, and Hood (prior experience), and Hammond, Fong, Zanna, Thrasher, and Borland and Hoffner and Ye (behavioral intent). General linear models were used to test hypotheses. Three-way interactions among message framing, risk type, and prior experience were found: When the intent of the message was to encourage sunscreen use, the effects of message framing and risk type were shown to be the exact opposite directions from when the intent was to discourage indoor/outdoor tanning. To discourage tanning among those with prior experience, messages emphasizing losses in terms of one's health will work better. For those with no prior experience, messages emphasizing potential appearance losses will work better for discouraging tanning while messages emphasizing gains like improving appearance will do a better job in encouraging sunscreen use.

  19. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  20. A Curriculum Development for the Enhancement of Learning Management Performances Emphasizing Higher Order Thinking Skills for Lower Secondary Science Teachers

    Directory of Open Access Journals (Sweden)

    Saksit Seeluangpetch

    2016-12-01

    Full Text Available This study aimed at (1) investigating the problems and needs for the enhancement of learning management performances emphasizing the higher order thinking skills for lower secondary Science teachers, (2) developing an effective curriculum to enhance the learning management performances which emphasized the higher order thinking skills for lower secondary Science teachers, and (3) studying the effects of using the curriculum developed for the enhancement of learning management performances emphasizing the higher order thinking skills for lower secondary Science teachers. The research was conducted in 4 phases. Phase 1 of the research was the study of fundamental information regarding problems and needs for the enhancement of learning management performances emphasizing the higher order thinking skills for lower secondary Science teachers. It was carried out by studying the related literature and exploring the needs. The instrument used in the Phase 1 study was the needs assessment. The statistics used for data analysis were the mean, percentage (%), and standard deviation (S.D.). The result of the study revealed that the Science teachers' prior knowledge was at a low level and the need to enhance their performances was at a high level. The development of the curriculum was carried out in Phase 2 of the study. The curriculum was constructed and developed in order to enhance the learning management performances which emphasized the higher order thinking skills. The instrument used was the assessment of the appropriateness of the curriculum framework. The mean, percentage (%), and standard deviation (S.D.) were used to analyze the data. The result of the assessment showed that the overall appropriateness of the curriculum was at a high level. The main components of the curriculum comprised the curriculum's problem and necessity, rationale, objective, structure, training activity, training media, training duration, and evaluation and assessment. The curriculum trial was

  1. Convolutional coding techniques for data protection

    Science.gov (United States)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
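
    As a concrete reminder of the convolutional coding fundamentals covered in the report, the sketch below implements a textbook rate-1/2 encoder with the (7, 5) octal generator pair; this particular code is an assumption for illustration, not necessarily the one analyzed in the report.

```python
# Minimal rate-1/2 convolutional encoder with constraint length 3 and the
# textbook (7, 5) octal generator polynomials.
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Feed bits through a length-k shift register; emit two parity bits per
    input bit, one per generator polynomial."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)   # shift in the new bit
        out.append(bin(state & g1).count("1") % 2)    # parity over g1 taps
        out.append(bin(state & g2).count("1") % 2)    # parity over g2 taps
    return out

print(conv_encode([1, 0, 1, 1]))   # 8 coded bits for 4 input bits
```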

  2. 21 CFR 106.90 - Coding.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs — ... of Infant Formulas, § 106.90 Coding: The manufacturer shall code all infant formulas in conformity with the coding requirements that are applicable to thermally processed low-acid foods packaged in...

  3. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the

  4. Coding In-depth Semistructured Interviews

    DEFF Research Database (Denmark)

    Campbell, John L.; Quincy, Charles; Osserman, Jordan

    2013-01-01

    Many social science studies are based on coded in-depth semistructured interview transcripts. But researchers rarely report or discuss coding reliability in this work. Nor is there much literature on the subject for this type of data. This article presents a procedure for developing coding schemes...... useful for situations where a single knowledgeable coder will code all the transcripts once the coding scheme has been established. This approach can also be used with other types of qualitative data and in other circumstances....

  5. Polynomial weights and code constructions

    DEFF Research Database (Denmark)

    Massey, J; Costello, D; Justesen, Jørn

    1973-01-01

    polynomial included. This fundamental property is then used as the key to a variety of code constructions including 1) a simplified derivation of the binary Reed-Muller codes and, for any prime p greater than 2, a new extensive class of p-ary "Reed-Muller codes," 2) a new class of "repeated-root" cyclic codes...... of long constraint length binary convolutional codes derived from 2^r-ary Reed-Solomon codes, and 6) a new class of q-ary "repeated-root" constacyclic codes with an algebraic decoding algorithm.......For any nonzero element c of a general finite field GF(q), it is shown that the polynomials (x - c)^i, i = 0, 1, 2, ..., have the "weight-retaining" property that any linear combination of these polynomials with coefficients in GF(q) has Hamming weight at least as great as that of the minimum degree
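
    The weight-retaining property quoted above is easy to spot-check numerically; the sketch below verifies it by brute force over GF(2) (taking c = 1) for the first five powers of (x + 1).

```python
# Brute-force check of the "weight-retaining" property over GF(2) (c = 1):
# any GF(2)-linear combination of (x+1)^0, ..., (x+1)^4 has Hamming weight at
# least that of the lowest-degree power included.  Polynomials are bitmasks.
from itertools import combinations

def clmul(a: int, b: int) -> int:
    """Carry-less (GF(2)[x]) multiplication of bitmask polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

powers = [1]                                 # (x+1)^0 = 1
for _ in range(4):
    powers.append(clmul(powers[-1], 0b11))   # multiply by (x + 1)

ok = True
for r in range(1, len(powers) + 1):
    for subset in combinations(range(len(powers)), r):
        combo = 0
        for i in subset:
            combo ^= powers[i]               # GF(2) linear combination
        ok &= bin(combo).count("1") >= bin(powers[min(subset)]).count("1")
print("weight-retaining property holds for all checked combinations:", ok)
```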

  6. Translator from the symbol coding language for the BUTs-20 processor of the in-core reactor control system

    International Nuclear Information System (INIS)

    Vorob'ev, D.M.; Golovanov, M.N.; Levin, G.L.; Parfenova, T.K.; Filatov, V.P.

    1978-01-01

    A symbolic-language code translator is described; it has been developed to automate the preparation of programs for in-core control systems. The translator is written in the ASSEMBLER language, which is included in the software of the M-6000 computer. Two scannings of the source program are required to produce the operating program in the internal language of the BUTs-20 processor. The flowsheet and listing of the interrogation program of an analog-to-digital converter are presented. It is emphasized that the proposed translator allows a time reduction for constructing programs for the in-core control systems by a factor of 10-15 and an improvement of their quality

  7. QUIC: a chemical kinetics code for use with the chemical equilibrium code QUIL

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-10-01

    A chemical rate kinetics code QUIC is described, along with a support code RATE. QUIC is designed to allow chemical kinetics calculations on a wide variety of chemical environments while operating in the overlay environment of the chemical equilibrium code QUIL. QUIC depends upon a rate-data library called LIBR. This library is maintained by RATE. RATE enters into the library all reactions in a standardized format. The code QUIC, operating in conjunction with QUIL, is interactive and written to be used from a remote terminal, with paging control provided. Plotted output is also available

  8. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  9. Direct-semidirect (DSD) codes

    International Nuclear Information System (INIS)

    Cvelbar, F.

    1999-01-01

    Recent codes for direct-semidirect (DSD) model calculations in the form of answers to a detailed questionnaire are reviewed. These codes include those embodying the classical DSD approach covering only the transitions to the bound states (RAF, HIKARI, and those of the Bologna group), as well as the code CUPIDO++ that also treats transitions to unbound states. (author)

  10. Polar Coding for the Large Hadron Collider: Challenges in Code Concatenation

    CERN Document Server

    AUTHOR|(CDS)2238544; Podzorny, Tomasz; Uythoven, Jan

    2018-01-01

    In this work, we present a concatenated repetition-polar coding scheme that is aimed at applications requiring highly unbalanced unequal bit-error protection, such as the Beam Interlock System of the Large Hadron Collider at CERN. Even though this concatenation scheme is simple, it reveals significant challenges that may be encountered when designing a concatenated scheme that uses a polar code as an inner code, such as error correlation and unusual decision log-likelihood ratio distributions. We explain and analyze these challenges and we propose two ways to overcome them.

  11. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

    Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. The analytic and empirical results indicate that in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible turbo codes (RCPT) did not outperform the convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a low number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, the strength of convolutional codes does not scale with the blocklength for a fixed number of states in the trellis.

  12. Molecular tagging of the Bph1 locus for resistance to brown planthopper (Nilaparvata lugens Stål) through representational difference analysis.

    Science.gov (United States)

    Park, Dong-Soo; Song, Min-Young; Park, Soo-Kwon; Lee, Sang-Kyu; Lee, Jong-Hee; Song, Song-Yi; Eun, Moo Young; Hahn, Tae-Ryong; Sohn, Jae-Keun; Yi, Gihwan; Nam, Min-Hee; Jeon, Jong-Seong

    2008-08-01

    During brown planthopper (BPH) feeding on rice plants, we employed a modified representational difference analysis (RDA) method to detect rare transcripts among those differentially expressed in SNBC61, a BPH resistant near-isogenic line (NIL) carrying the Bph1 resistance gene. This identified 3 RDA clones: OsBphi237, OsBphi252 and OsBphi262. DNA gel-blot analysis revealed that the loci of the RDA clones in SNBC61 corresponded to the alleles of the BPH resistant donor Samgangbyeo. Expression analysis indicated that the RDA genes were up-regulated in SNBC61 during BPH feeding. Interestingly, analysis of 64 SNBC NILs, derived from backcrosses of Samgangbyeo with a BPH susceptible Nagdongbyeo, using a cleaved amplified polymorphic sequence (CAPS) marker indicated that OsBphi252, which encodes a putative lipoxygenase (LOX), co-segregates with BPH resistance. Our results suggest that OsBphi252 is tightly linked to Bph1, and may be useful in marker-assisted selection (MAS) for resistance to BPH.

  13. Construction and decoding of matrix-product codes from nested codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Lally, Kristine; Ruano, Diego

    2009-01-01

    We consider matrix-product codes [C1 ... Cs] · A, where C1, ..., Cs  are nested linear codes and matrix A has full rank. We compute their minimum distance and provide a decoding algorithm when A is a non-singular by columns matrix. The decoding algorithm decodes up to half of the minimum distance....
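
    As a quick orientation to the construction named in the abstract, the standard textbook definition of a matrix-product code, together with the familiar (u, u+v) special case, can be written as follows; this is the common formulation, not text quoted from the paper itself.

```latex
% Standard matrix-product code construction (textbook form, not quoted from the paper).
% C_1, ..., C_s are nested linear codes of length m over F_q; A is a full-rank s x l matrix.
\[
  [C_1 \cdots C_s]\cdot A \;=\; \big\{\, (c_1, \dots, c_s)\,A \;:\; c_i \in C_i \,\big\},
\]
% where (c_1, ..., c_s) is the m x s matrix whose i-th column is the codeword c_i,
% and the resulting m x l matrix is read out as a vector of length ml.
% The Plotkin (u, u+v) construction is the special case
\[
  A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix},
  \qquad (c_1, c_2)\,A = (c_1,\; c_1 + c_2).
\]
```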

  14. Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)

    Science.gov (United States)

    Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.

    2013-10-01

    Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.

  15. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how...... of reliability based code calibration of LRFD based design codes....

  16. The reason for having a code of pharmaceutical ethics: Spanish Pharmacists Code of Ethics

    Directory of Open Access Journals (Sweden)

    Ana Mulet Alberola

    2017-05-01

    Full Text Available The pharmacist profession needs its own code of conduct set out in writing to serve as a stimulus to pharmacists in their day-to-day work in the different areas of pharmacy, in conjunction always with each individual pharmacist's personal commitment to their patients, to other healthcare professionals and to society. An overview is provided of the different codes of ethics for pharmacists on the national and international scale, the most up-to-date code for 2015 being presented as a set of principles which must guide pharmaceutical conduct from the standpoint of deliberative judgment. The difference between codes of ethics and codes of practice is discussed. In the era of massive-scale collaboration, this code is a project holding bright prospects for the future. Each individual pharmaceutical attitude in practicing the profession must be identified with the pursuit of excellence in one's own personal practice for the purpose of achieving the ethical and professional values above and beyond complying with regulations and codes of practice.

  17. Usage of burnt fuel isotopic compositions from engineering codes in Monte-Carlo code calculations

    International Nuclear Information System (INIS)

    Aleshin, Sergey S.; Gorodkov, Sergey S.; Shcherenko, Anna I.

    2015-01-01

    Burn-up calculation of VVER cores by a Monte-Carlo code is a complex process that requires large computational costs. This fact complicates the use of Monte-Carlo codes for design and operating calculations. It is proposed to use previously prepared isotopic compositions for Monte-Carlo code (MCU) calculations of different states of a VVER core with burnt fuel. The isotopic compositions are calculated by an approximation method based on the use of a spectral functionality and reference isotopic compositions, which are calculated by engineering codes (TVS-M, PERMAK-A). The multiplication factors and power distributions of a fuel assembly (FA) and a VVER of infinite height are calculated in this work by the Monte-Carlo code MCU using the previously prepared isotopic compositions. The MCU calculation data were compared with the data obtained by the engineering codes.

  18. An analytical demonstration of coupling schemes between magnetohydrodynamic codes and eddy current codes

    International Nuclear Information System (INIS)

    Liu Yueqiang; Albanese, R.; Rubinacci, G.; Portone, A.; Villone, F.

    2008-01-01

    In order to model a magnetohydrodynamic (MHD) instability that strongly couples to external conducting structures (walls and/or coils) in a fusion device, it is often necessary to combine a MHD code solving for the plasma response, with an eddy current code computing the fields and currents of conductors. We present a rigorous proof of the coupling schemes between these two types of codes. One of the coupling schemes has been introduced and implemented in the CARMA code [R. Albanese, Y. Q. Liu, A. Portone, G. Rubinacci, and F. Villone, IEEE Trans. Magn. 44, 1654 (2008); A. Portone, F. Villone, Y. Q. Liu, R. Albanese, and G. Rubinacci, Plasma Phys. Controlled Fusion 50, 085004 (2008)] that couples the MHD code MARS-F[Y. Q. Liu, A. Bondeson, C. M. Fransson, B. Lennartson, and C. Breitholtz, Phys. Plasmas 7, 3681 (2000)] and the eddy current code CARIDDI[R. Albanese and G. Rubinacci, Adv. Imaging Electron Phys. 102, 1 (1998)]. While the coupling schemes are described for a general toroidal geometry, we give the analytical proof for a cylindrical plasma.

  19. Preliminary investigation study of code of developed country for developing Korean fuel cycle code

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Ko, Won Il; Lee, Ho Hee; Cho, Dong Keun; Park, Chang Je

    2012-01-01

    In order to develop a Korean fuel cycle code, analyses have been performed with the fuel cycle codes used in advanced countries. Also, recommendations were proposed for future development. The fuel cycle codes are as follows: VISTA, which has been developed by the IAEA; DANESS, developed by ANL and LISTO; and VISION, developed by INL for the Advanced Fuel Cycle Initiative (AFCI) system analysis. Recommended items were proposed for the software, program scheme, material flow model, isotope decay model, environmental impact analysis model, and economics analysis model. The items described will be used for the development of the Korean nuclear fuel cycle code in the future

  20. Bridging the knowledge gap between Big Data producers and consumers

    Science.gov (United States)

    Peng, G. S.; Worley, S. J.

    2015-12-01

    Most weather data is produced, disseminated and consumed by expert users in large national operational centers or laboratories. Data 'ages' off their systems in days or weeks. While archives exist, would-be users often lack the credentials necessary to obtain an account to access or search its contents. Moreover, operational centers and many national archives lack the mandate and the resources to serve non-expert users. The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA), rda.ucar.edu, was created over 40 years ago to collect data for NCAR's internal Big Science projects such as the NCEP/NCAR Reanalysis Project. Over time, the data holdings have grown to 1.8+ Petabytes spanning 600+ datasets. The user base has also grown; in 2014, we served 1.1 Petabytes of data to over 11,000 unique users. The RDA works with national centers, such as NCEP, ECMWF and JMA to make their data available to worldwide audiences and mutually support data access at the production source. We have become not just an open-access data center, but also a data education center. Each dataset archived at the RDA is assigned to a data specialist (DS) who curates the data. If a user has a question not answered in the dataset information web pages prepared by the DS, they can call or email a skilled DS for further clarification. The RDA's diverse staff—with academic training in meteorology, oceanography, engineering (electrical, civil, ocean and database), mathematics, physics, chemistry and information science—means we likely have someone who "speaks your language." Erroneous data assumptions are the Achilles heel of Big Data. It doesn't matter how much data you crunch if the data is not what you think it is. Data discovery is another difficult Big Data problem; one can only solve problems with data if one can find the right data. Metadata, both machine and human-generated, underpin the RDA data search tools. The RDA has stepped in to fill the gap between data

  1. The Coding Process and Its Challenges

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2010-02-01

    Full Text Available Coding is the core process in classic grounded theory methodology. It is through coding that the conceptual abstraction of data and its reintegration as theory takes place. There are two types of coding in a classic grounded theory study: substantive coding, which includes both open and selective coding procedures, and theoretical coding. In substantive coding, the researcher works with the data directly, fracturing and analysing it, initially through open coding for the emergence of a core category and related concepts and then subsequently through theoretical sampling and selective coding of data to theoretically saturate the core and related concepts. Theoretical saturation is achieved through constant comparison of incidents (indicators in the data to elicit the properties and dimensions of each category (code. This constant comparing of incidents continues until the process yields the interchangeability of indicators, meaning that no new properties or dimensions are emerging from continued coding and comparison. At this point, the concepts have achieved theoretical saturation and the theorist shifts attention to exploring the emergent fit of potential theoretical codes that enable the conceptual integration of the core and related concepts to produce hypotheses that account for relationships between the concepts thereby explaining the latent pattern of social behaviour that forms the basis of the emergent theory. The coding of data in grounded theory occurs in conjunction with analysis through a process of conceptual memoing, capturing the theorist’s ideation of the emerging theory. Memoing occurs initially at the substantive coding level and proceeds to higher levels of conceptual abstraction as coding proceeds to theoretical saturation and the theorist begins to explore conceptual reintegration through theoretical coding.

  2. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  3. The Code of Ethics and Editorial Code of Practice of the Royal Astronomical Society

    Science.gov (United States)

    Murdin, Paul

    2013-01-01

    Whilst the Royal Astronomical Society has got by for more than 100 years without a written code of ethics, modern standards of governance suggested that such a code could be useful in the resolution of disputes. In 2005, the RAS adopted the Universal Code of Ethics for Science that had been formulated by the Royal Society of London. At the same time and for similar reasons the RAS adopted an Editorial Code of Practice.

  4. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  5. Generalized optical code construction for enhanced and Modified Double Weight like codes without mapping for SAC-OCDMA systems

    Science.gov (United States)

    Kumawat, Soma; Ravi Kumar, M.

    2016-07-01

    The Double Weight (DW) code family is one of the coding schemes proposed for Spectral Amplitude Coding-Optical Code Division Multiple Access (SAC-OCDMA) systems. The Modified Double Weight (MDW) code for even weights and the Enhanced Double Weight (EDW) code for odd weights are two algorithms extending the use of the DW code for SAC-OCDMA systems. The above-mentioned codes use a mapping technique to provide codes for a higher number of users. A new generalized algorithm to construct EDW- and MDW-like codes without mapping for any weight greater than 2 is proposed. A single code construction algorithm gives the same length increment, Bit Error Rate (BER) calculation and other properties for all weights greater than 2. The algorithm first constructs a generalized basic matrix, which is repeated in a different way (distinct from mapping) to produce the codes for all users. The generalized code is analysed for BER using balanced detection and direct detection techniques.

  6. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  7. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    International Nuclear Information System (INIS)

    Mori, Takamasa; Nakagawa, Masayuki; Kaneko, Kunio.

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author)

  8. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design as well as ease of modifying the number of users are equally exceptional qualities presented by the code, in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  9. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Takamasa; Nakagawa, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author).

  10. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  11. Radioactive action code

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    A new coding system, 'Hazrad', for marking buildings and transportation containers to alert emergency services personnel to the presence of radioactive materials has been developed in the United Kingdom. The hazards of materials in the building or transport container, together with the recommended emergency action, are represented by a number of codes which are marked on the building or container and interpreted from a chart carried as a pocket-size guide. Buildings would be marked with the familiar yellow 'radioactive' trefoil, the written information 'Radioactive materials' and a list of isotopes. Under this the 'Hazrad' code would be written - three symbols: the first denoting the relative radioactive risk (low, medium or high), the second the biological risk (also low, medium or high), and the third the type of radiation emitted (alpha, beta or gamma). The response cards indicate appropriate measures to take, e.g. for a high biological risk, Bio3, the wearing of a gas-tight protection suit is advised. The code and its uses are explained. (U.K.)
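
    The three-symbol structure described above lends itself to a simple data representation. The sketch below is a hypothetical illustration of such a hazard code in Python; the actual Hazrad symbols, labels and response chart are not reproduced here.

```python
# Hypothetical representation of a three-part hazard code of the kind described
# (radioactive risk, biological risk, radiation type). Labels and advice strings
# are invented for illustration and are NOT the official Hazrad chart.
from dataclasses import dataclass
from enum import Enum

class Level(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

class Radiation(Enum):
    ALPHA = "alpha"
    BETA = "beta"
    GAMMA = "gamma"

@dataclass
class HazardCode:
    radioactive_risk: Level
    biological_risk: Level
    radiation: Radiation

    def advice(self) -> str:
        # Placeholder lookup standing in for the pocket-size response chart.
        if self.biological_risk is Level.HIGH:
            return "Bio3: gas-tight protective suit advised"
        return "Consult response chart for this code"

code = HazardCode(Level.MEDIUM, Level.HIGH, Radiation.GAMMA)
print(code.advice())
```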

  12. Software information sorting code 'PLUTO-R'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Naraoka, Kenitsu; Adachi, Masao; Takeda, Tatsuoki

    1984-10-01

    A software information sorting code, PLUTO-R, is developed as one of the supporting codes of the TRITON system for fusion plasma analysis. The objective of the PLUTO-R code is to sort the reference materials of the codes in the TRITON code system. Particular emphasis is placed on ease of information registration. Since no special experience or skill in data registration is required, the code can also be used to construct general small-scale information systems. This report gives an overall description and the user's manual of the PLUTO-R code. (author)

  13. Code development incorporating environmental, safety, and economic aspects of fusion reactors (FY 92--94). Final report

    International Nuclear Information System (INIS)

    Ho, S.K.; Fowler, T.K.; Holdren, J.P.

    1994-01-01

    This is the Final Report for a three-year (FY 92--94) study of the Environmental, Safety, and Economic (ESE) aspects of fusion energy systems, emphasizing development of computerized approaches suitable for incorporation as modules in fusion system design codes. First, as is reported in Section 2, the authors now have a simplified but complete environment and safety evaluation code, BESAFE, in operation. The first tests of BESAFE as a module of the SUPERCODE, a design optimization systems code at LLNL, are reported in Section 3. Secondly, as reported in Section 4, the authors have maintained a strong effort in developing fast calculational schemes for activation inventory evaluation. In addition to these major accomplishments, considerable progress has been made on research on specific topics as follows. A tritium modeling code, TRIDYN, was developed in collaboration with the TSTA group at LANL and the Fusion Nuclear Technology group at UCLA. A simplified algorithm has been derived to calculate the transient temperature profiles in the blanket during accidents. The scheme iteratively solves a system of non-linear ordinary differential equations describing about 10 regions of the blanket while preserving energy balance. The authors have studied the physics and engineering aspects of divertor modeling for safety applications. Several modifications in the automation and characterization of environmental and safety indices have been made. They have applied this work to the environmental and safety comparisons of stainless steel with alternative structural materials for fusion reactors. A methodology in decision analysis utilizing influence and decision diagrams has been developed to model fusion reactor design problems. Most of the work during this funding period has been reported in 26 publications including theses, journal publications, conference papers, and technical reports, as listed in Section 11

  14. Quantum computing with Majorana fermion codes

    Science.gov (United States)

    Litinski, Daniel; von Oppen, Felix

    2018-05-01

    We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.

  15. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/
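
    To make concrete one of the tests the toolkit is said to offer, a comma-freeness check on a set of codons can be written in a few lines. This is an independent sketch of the standard definition for trinucleotide codes, not GCAT's own code or API.

```python
# Independent sketch of a comma-freeness test for a trinucleotide code
# (standard definition; this is NOT code taken from GCAT).

def is_comma_free(code):
    """A trinucleotide code X is comma-free if no codeword appears in a
    shifted reading frame inside the concatenation of any two codewords."""
    code = set(code)
    for x in code:
        for y in code:
            pair = x + y                           # 6 letters
            if pair[1:4] in code or pair[2:5] in code:
                return False
    return True

print(is_comma_free({"AAC", "GTC"}))   # True: no shifted occurrence of a codeword
print(is_comma_free({"AAA"}))          # False: AAA reappears in shifted frames of AAAAAA
```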

  16. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...

  17. Stability analysis by ERATO code

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Matsuura, Toshihiko; Azumi, Masafumi; Kurita, Gen-ichi

    1979-12-01

    Problems in MHD stability calculations by the ERATO code are described, concerning the convergence property of results, equilibrium codes, and machine optimization of the ERATO code. It is concluded that irregularity on a convergence curve is not due to a fault of the ERATO code itself but due to an inappropriate choice of the equilibrium calculation meshes. Also described are a code to calculate an equilibrium as a quasi-inverse problem and a code to calculate an equilibrium as a result of a transport process. Optimization of the code with respect to I/O operations reduced both CPU time and I/O time considerably. With the FACOM230-75 APU/CPU multiprocessor system, the performance is about 6 times as high as with the FACOM230-75 CPU, showing the effectiveness of a vector processing computer for this kind of MHD computation. This report is a summary of the material presented at the ERATO workshop 1979 (ORNL), supplemented with some details. (author)

  18. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely the finite field Fourier transform and the Euclidean algorithm f...

  19. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  20. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  1. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

    The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: A discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  2. GAMBARAN KONSUMSI ZAT GIZI ANAK UMUR 6 BULAN – 12 TAHUN DI INDONESIA

    Directory of Open Access Journals (Sweden)

    Yekti Widodo

    2013-09-01

    DIETARY INTAKE OF INDONESIAN CHILDREN 6 MONTHS - 12 YEARS OF AGE. The prevalence of undernutrition in Indonesia is still high. Stunting, one type of undernutrition with the highest prevalence, ranks number five in the world. The multisectoral causes of undernutrition include food, health, and caring practices. At the individual level, the immediate causes are inadequate and low-quality dietary intake and infectious disease. This SEANUTS study aimed to assess dietary intake among children in Indonesia. The study was conducted in 48 districts covering urban and rural areas and included 3,600 children 6 months-12 years of age. Dietary intake was assessed by a 1x24-hour dietary recall administered by trained nutritionists. Indonesian food composition tables were used to calculate nutrient contents, and the nutrient intakes were then compared to the Indonesian recommended dietary allowances (RDA) to assess their adequacy. The overall results showed that the average intakes of energy, vitamin A, vitamin C, folate, iron, calcium and phosphorus were still below the RDA (44-77%), while protein and phosphorus were above the RDA (106-114%). The inadequacy varies among age groups; the older the children, the larger the deficit in nutrient intake. The highest average intake was among children 6-11 months of age and the lowest among children 9-12 years of age. Using the cut-off point of the Indonesian RDA, there was still a high proportion of children with nutrient intakes in deficit. It is concluded that children of older age groups, with low maternal education, low socioeconomic status, and living in rural areas were at significantly higher risk of nutrient intake deficit below the RDA. Keywords: nutrient consumption, RDA, Indonesian children

  3. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

    Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist the interpretation of bioassay data, provide bioassay projections and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ dose and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, and consequently its operation requires a relatively long procedure with a lot of manual typing that can lead to human error. A new code has been developed at the NRCN, CINEX (Code for Internal Exposure), which is an Excel application and leads to a significant reduction in calculation time (on the order of 5-10 times) and in the risk of human error. The code uses a database containing tables constructed by CINDY that contain the bioassay values predicted by the ICRP 30 model after an intake of one activity unit of each isotope. Using the database, the code then calculates the appropriate intake and consequently the committed effective dose and organ dose. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and CINDY codes (for class Y uranium). CINEX is now used at the NRCN to calculate occupational intakes and doses to workers with radioactive materials.
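
    The calculation described, matching a measured bioassay value against tabulated per-unit-intake predictions and then scaling to a committed dose, can be sketched as follows. The retention table and dose coefficient below are placeholders, not the ICRP 30 tables used by CINDY or CINEX.

```python
# Illustrative sketch of the intake/dose calculation described in the abstract.
# Retention values and the dose coefficient are PLACEHOLDERS, not ICRP 30 data.

# Hypothetical table: predicted bioassay value (e.g. urinary excretion, Bq/day)
# per 1 Bq of intake, indexed by days after intake.
PREDICTED_PER_UNIT_INTAKE = {1: 1.2e-2, 7: 3.5e-3, 30: 8.0e-4}
DOSE_COEFFICIENT_SV_PER_BQ = 5.0e-7  # placeholder committed-dose coefficient

def estimate_intake(measured_bq_per_day, days_after_intake):
    """Scale the measured bioassay value by the tabulated per-unit-intake prediction."""
    predicted = PREDICTED_PER_UNIT_INTAKE[days_after_intake]
    return measured_bq_per_day / predicted          # Bq of intake

def committed_dose(intake_bq):
    return intake_bq * DOSE_COEFFICIENT_SV_PER_BQ   # Sv

intake = estimate_intake(measured_bq_per_day=0.7, days_after_intake=7)
print(f"Estimated intake: {intake:.1f} Bq, committed dose: {committed_dose(intake):.2e} Sv")
```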

  4. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  5. Quick response codes in Orthodontics

    Directory of Open Access Journals (Sweden)

    Moidin Shakil

    2015-01-01

    Full Text Available Quick response (QR) codes are two-dimensional barcodes which encode a large amount of information. QR codes in Orthodontics are an innovative approach in which patient details, radiographic interpretation, and treatment plan can be encoded. Implementing QR codes in Orthodontics will save time, reduce paperwork, and minimize manual effort in the storage and retrieval of patient information during subsequent stages of treatment.

  6. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-squares sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
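
    The amplitude-estimation step mentioned in the abstract, solving the syndrome equations in the least-squares sense once candidate error positions are hypothesized, can be illustrated with a generic linear-algebra sketch. The matrix, dimensions and values below are arbitrary and do not come from the paper.

```python
# Generic illustration of least-squares error-amplitude estimation from a syndrome.
# H is a syndrome-forming (parity-check) matrix of the frame expansion; the values
# and dimensions here are arbitrary, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, r = 12, 4                      # frame length and number of syndrome equations
H = rng.standard_normal((r, n))

e_true = np.zeros(n)
support = [3, 9]                  # hypothesized impulse-noise positions
e_true[support] = [1.5, -0.8]     # unknown amplitudes to recover
s = H @ e_true                    # observed syndrome of the corrupted frame

# Least-squares estimate of the amplitudes restricted to the hypothesized support.
amplitudes, *_ = np.linalg.lstsq(H[:, support], s, rcond=None)
print(amplitudes)                 # close to [1.5, -0.8]
```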

  7. Energy Code Enforcement Training Manual : Covering the Washington State Energy Code and the Ventilation and Indoor Air Quality Code.

    Energy Technology Data Exchange (ETDEWEB)

    Washington State Energy Code Program

    1992-05-01

    This manual is designed to provide building department personnel with specific inspection and plan review skills and information on provisions of the 1991 edition of the Washington State Energy Code (WSEC). It also provides information on provisions of the new stand-alone Ventilation and Indoor Air Quality (VIAQ) Code. The intent of the WSEC is to reduce the amount of energy used by requiring energy-efficient construction. Such conservation reduces energy requirements, and, as a result, reduces the use of finite resources, such as gas or oil. Lowering energy demand helps everyone by keeping electricity costs down. (It is less expensive to use existing electrical capacity efficiently than it is to develop new and additional capacity needed to heat or cool inefficient buildings.) The new VIAQ Code (effective July, 1991) is a natural companion to the energy code. Whether energy-efficient or not, all homes have potential indoor air quality problems. Studies have shown that indoor air is often more polluted than outdoor air. The VIAQ Code provides a means of exchanging stale air for fresh, without compromising energy savings, by setting standards for a controlled ventilation system. It also offers requirements meant to prevent indoor air pollution from building products or radon.

  8. Noncoherent Spectral Optical CDMA System Using 1D Active Weight Two-Code Keying Codes

    Directory of Open Access Journals (Sweden)

    Bih-Chyun Yeh

    2016-01-01

    Full Text Available We propose a new family of one-dimensional (1D) active weight two-code keying (TCK) codes in spectral amplitude coding (SAC) optical code division multiple access (OCDMA) networks. We use encoding and decoding transfer functions to operate the 1D active weight TCK. The proposed structure includes an optical line terminal (OLT) and optical network units (ONUs) to produce the encoding and decoding codes of the proposed OLT and ONUs, respectively. The proposed ONU uses the modified cross-correlation to remove interferences from other simultaneous users, that is, the multiuser interference (MUI). When the phase-induced intensity noise (PIIN) is the most important noise, the modified cross-correlation suppresses the PIIN. In the numerical results, we find that the bit error rate (BER) for the proposed system using the 1D active weight TCK codes outperforms that for two other systems using the 1D M-Seq codes and 1D balanced incomplete block design (BIBD) codes. The effective source power for the proposed system can achieve −10 dBm, which is less than that for the other systems.

  9. The reason for having a code of pharmaceutical ethics: Spanish Pharmacists Code of Ethics.

    Science.gov (United States)

    Barreda Hernández, Dolores; Mulet Alberola, Ana; González Bermejo, Diana; Soler Company, Enrique

    2017-05-01

    The pharmacist profession needs its own code of conduct set out in writing to serve as a stimulus to pharmacists in their day-to-day work in the different areas of pharmacy, in conjunction always with each individual pharmacist's personal commitment to their patients, to other healthcare professionals and to society. An overview is provided of the different codes of ethics for pharmacists on the national and international scale, the most up-to-date code for 2015 being presented as a set of principles which must guide pharmaceutical conduct from the standpoint of deliberative judgment. The difference between codes of ethics and codes of practice is discussed. In the era of massive-scale collaboration, this code is a project holding bright prospects for the future. Each individual pharmaceutical attitude in practicing the profession must be identified with the pursuit of excellence in one's own personal practice for the purpose of achieving the ethical and professional values above and beyond complying with regulations and codes of practice. Copyright AULA MEDICA EDICIONES 2017. Published by AULA MEDICA. All rights reserved.

  10. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  11. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module type plant system dynamics analysis code which involves a multi-dimensional thermal-hydraulics calculation module to analyze pool type of fast breeder reactors. Explanations of each module and the methods for the input data are described in this code manual. (author)

  12. Explicit MDS Codes with Complementary Duals

    DEFF Research Database (Denmark)

    Beelen, Peter; Jin, Lingfei

    2018-01-01

    In 1964, Massey introduced a class of codes with complementary duals which are called Linear Complementary Dual (LCD for short) codes. He showed that LCD codes have applications in communication systems, side-channel attacks (SCA) and so on. LCD codes have been extensively studied in the literature ... On the other hand, MDS codes form an optimal family of classical codes which have wide applications in both theory and practice. The main purpose of this paper is to give an explicit construction of several classes of LCD MDS codes, using tools from algebraic function fields. We exemplify this construction...
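
    For orientation, the two defining properties combined in the paper's title can be written out. These are the standard textbook definitions, not statements quoted from the article.

```latex
% Standard definitions (not quoted from the article).
% A linear [n, k, d] code C over F_q is MDS when it meets the Singleton bound:
\[
  d = n - k + 1 .
\]
% C is a linear complementary dual (LCD) code when it intersects its dual trivially:
\[
  C \cap C^{\perp} = \{\mathbf{0}\},
  \qquad\text{equivalently}\qquad
  \mathbb{F}_q^{\,n} = C \oplus C^{\perp}.
\]
```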

  13. Serial-data correlator/code translator

    Science.gov (United States)

    Morgan, L. E.

    1977-01-01

    System, consisting of sampling flip flop, memory (either RAM or ROM), and memory buffer, correlates sampled data with predetermined acceptance code patterns, translates acceptable code patterns to nonreturn-to-zero code, and identifies data dropouts.

  14. Circular codes revisited: a statistical approach.

    Science.gov (United States)

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and underwent a rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes with reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  16. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
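
    A minimal sketch of the syndrome-plus-doping-bits encoder described in the abstract is given below. The parity-check matrix and doping pattern are arbitrary toy choices, not the paper's construction, and the sum-product decoder itself is omitted.

```python
# Toy sketch of a syndrome-based distributed source encoder with doping bits.
# H and the doping pattern are arbitrary illustrations, not the paper's design.
import numpy as np

rng = np.random.default_rng(1)
n, r = 16, 8
H = rng.integers(0, 2, size=(r, n))        # binary parity-check matrix
x = rng.integers(0, 2, size=n)             # source block

syndrome = H.dot(x) % 2                    # compressed representation sent to decoder
doping_positions = np.arange(0, n, 4)      # every 4th source bit sent uncoded
doping_bits = x[doping_positions]

print("syndrome bits:", syndrome)
print("doping bits  :", doping_bits)
# A decoder would combine the syndrome, doping bits and side information in a
# sum-product (belief propagation) pass to recover x and the hidden dependence.
```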

  17. Quality control protocols for radiodiagnosis agents and radiopharmaceuticals

    International Nuclear Information System (INIS)

    Robles, A.; Condor, M.; Caballero, J.; Morote, M.; Garcia, C.; Benites, M.

    1997-01-01

    Based on the compilation of pharmacopoeia methods, literature, manuals and other information developed in our laboratory, protocols have been prepared to carry out quality controls for radiodiagnosis agents (RDA), better known as kits and RDA labelled with Tc99m. Quality control protocols cover physicochemical and biological controls. Physicochemical controls described for RDA include physical characteristics, particle size and number, pH, chemical identification, humidity, tin II; whereas biological controls include sterility, acute toxicity and bacterial endotoxin determination (LAL). Physicochemical controls described for radiopharmaceuticals labelled with Tc99m are pH and radiochemical purity; while biological distribution is described as a biological control

  18. Quality control protocols for radiodiagnosis agents and radiopharmaceuticals; Protocolos de control de calidad para agentes de radiodiagnostico y radiofarmacos

    Energy Technology Data Exchange (ETDEWEB)

    Robles, A; Condor, M; Caballero, J; Morote, M; Garcia, C; Benites, M

    1997-07-01

    Based on the compilation of pharmacopoeia methods, literature, manuals and other information developed in our laboratory, protocols have been prepared to carry out quality controls for radiodiagnosis agents (RDA), better known as kits and RDA labelled with Tc99m. Quality control protocols cover physicochemical and biological controls. Physicochemical controls described for RDA include physical characteristics, particle size and number, pH, chemical identification, humidity, tin II; whereas biological controls include sterility, acute toxicity and bacterial endotoxin determination (LAL). Physicochemical controls described for radiopharmaceuticals labelled with Tc99m are pH and radiochemical purity; while biological distribution is described as a biological control.

  19. Suitability Assessment of Satellite-Derived Drought Indices for Mongolian Grassland

    Directory of Open Access Journals (Sweden)

    Sheng Chang

    2017-06-01

    Full Text Available In Mongolia, drought is a major natural disaster that can affect and devastate large regions, reduce livestock production, cause economic damage, and accelerate desertification in association with destructive human activities. The objective of this article is to determine the optimal satellite-derived drought indices for accurate and real-time expression of grassland drought in Mongolia. First, an adaptability analysis was performed by comparing nine remote sensing-derived drought indices with reference indicators obtained from field observations using several methods (correlation, consistency percentage (CP), and time-space analysis). The reference information included environmental data, vegetation growth status, and region drought-affected (RDA) information at diverse scales (pixel, county, and region) for three types of land cover (forest steppe, steppe, and desert steppe). Second, a meteorological index (PED), a normalized biomass (NorBio) reference indicator, and the RDA-based drought CP method were adopted for describing Mongolian drought. Our results show that in forest steppe regions the normalized difference water index (NDWI) is most sensitive to NorBio (maximum correlation coefficient (MAX_R) up to 0.92) and RDA (maximum CP is 87%), and is most consistent with the RDA spatial distribution. The vegetation health index (VHI) and temperature condition index (TCI) are most correlated with the PED index (MAX_R: 0.75) and soil moisture (MAX_R: 0.58), respectively. In steppe regions, the NDWI is most closely related to soil moisture (MAX_R: 0.69) and the VHI is most related to the PED (MAX_R: 0.76), NorBio (MCC: 0.95), and RDA data (maximum CP is 89%), exhibiting the most consistency with the RDA spatial distribution. In desert steppe areas, the vegetation condition index (VCI) correlates best with NorBio (MAX_R: 0.92), soil moisture (MAX_R: 0.61), and the RDA spatial distribution, while the TCI correlates best with the PED (MAX_R: 0.75) and the RDA data (maximum CP is 79%).
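
    The statistics used throughout this comparison (maximum correlation coefficients and consistency percentages) can be reproduced with a few lines of array code. The Python sketch below uses made-up yearly values, and the simple median-threshold drought flag is only an assumed stand-in for the study's actual CP definition.

      import numpy as np

      # Hypothetical yearly series for one pixel/county (values are made up).
      ndwi   = np.array([0.31, 0.18, 0.25, 0.12, 0.28, 0.15])  # satellite index
      norbio = np.array([0.80, 0.40, 0.62, 0.30, 0.71, 0.35])  # normalized biomass
      rda    = np.array([0,    1,    0,    1,    0,    1   ])  # drought-affected flag

      # Pearson correlation (the MAX_R statistics in the study are maxima of
      # such correlations over regions and periods).
      r = np.corrcoef(ndwi, norbio)[0, 1]

      # One plausible "consistency percentage": how often a simple index-based
      # drought flag agrees with the reported drought-affected information.
      index_flag = (ndwi < np.median(ndwi)).astype(int)
      cp = 100.0 * np.mean(index_flag == rda)

      print(f"R = {r:.2f}, CP = {cp:.0f}%")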

  20. Australasian code for reporting of mineral resources and ore reserves (the JORC code)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The latest revision of the Code first published in 1989 becomes effective in September 1999. It was prepared by the Joint Ores Reserves Committee of the Australasian Institute of Mining and Metallurgy, Australian Institute of Geoscientists and Minerals Council of Australia (JORC). It sets out minimum standards, recommendations and guidelines for public reporting of exploration results, mineral resources and ore reserves in Australasia. In this edition, the guidelines, which were previously separated from the Code, have been placed after the respective Code clauses. The Code is applicable to all solid minerals, including diamonds, other gemstones and coal for which public reporting is required by the Australian and New Zealand Stock Exchanges.

  1. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  2. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. The spread of the results in the international calculations amounted to ± 12,000 pcm in the realistic fuel dissolver exercise No. 19 proposed by BNFL, and to ± 25,000 pcm in benchmark No. 20, in which fissile material in solid form is surrounded by fissile material in solution. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e. range of moderation, variation of pellet size and the fuel double heterogeneity effect. The APOLLO/PIC method, developed to treat the latter effect, permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown, based on three-group microscopic reaction rates solicited from the participants. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover, the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism (NITAWL in the international SCALE package) to account for 238U resonance mutual self-shielding in the pellet-fissile liquor interaction. Improvements in the updated 1990 contributions, as well as recent complementary reference calculations (MCNP, VIM, ultrafine slowing-down CGM calculations), confirm the need to use rigorous self-shielding methods in criticality design-oriented codes. 6 refs., 11 figs., 3 tabs

  3. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use, with very little effort.

  4. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M. Jr.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  5. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially-available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  6. A new code for predicting the thermo-mechanical and irradiation behavior of metallic fuels in sodium fast reactors

    Energy Technology Data Exchange (ETDEWEB)

    Karahan, Aydin, E-mail: karahan@mit.ed [Center for Advanced Nuclear Energy Systems, Nuclear Science and Engineering Department, Massachusetts Institute of Technology (United States); Buongiorno, Jacopo [Center for Advanced Nuclear Energy Systems, Nuclear Science and Engineering Department, Massachusetts Institute of Technology (United States)

    2010-01-31

    An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO{sub 2}-PuO{sub 2} mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior at both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE is described in a companion paper. With respect to the old Argonne National Laboratory code LIFE-METAL and other same-generation codes, FEAST-METAL emphasizes more mechanistic, less empirical models, whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium

  7. A new code for predicting the thermo-mechanical and irradiation behavior of metallic fuels in sodium fast reactors

    International Nuclear Information System (INIS)

    Karahan, Aydin; Buongiorno, Jacopo

    2010-01-01

    An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO 2 -PuO 2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior at both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE is described in a companion paper. With respect to the old Argonne National Laboratory code LIFE-METAL and other same-generation codes, FEAST-METAL emphasizes more mechanistic, less empirical models, whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium reactors

  8. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero dimensional code and the multi-dimensional physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform. • The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero dimensional code, a physics platform and an engineering platform. We use the zero dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimensional studies using the physics and engineering platforms for design, verification and validation. Effective data exchange between the zero dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study with those from the zero dimensional code is used to further refine the controlled radiation model. The data transfer between the zero dimensional code and the physics platform, including data iteration and validation, and justification for performance parameters, will be presented in this paper.
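
    The essence of the exchange described above is a fixed-point iteration: the zero dimensional code proposes an operating point, the detailed platform re-evaluates a quantity (here, core radiation), and the loop repeats until the two are consistent. The Python sketch below shows only that pattern; both functions and all numbers are hypothetical stand-ins, not the CFETR codes.

      # Minimal sketch of the data-exchange loop described above. The two
      # functions are hypothetical stand-ins for the zero dimensional code and
      # for the multi-dimensional physics platform; only the iteration pattern
      # is the point.

      def zero_d_code(radiated_power):
          """Return a candidate operating point given an assumed core radiation."""
          return {"P_fusion": 1000.0 - 2.0 * radiated_power, "P_rad": radiated_power}

      def physics_platform(operating_point):
          """Return the core radiation computed by the detailed transport codes."""
          return 0.05 * operating_point["P_fusion"]  # placeholder model

      p_rad, tol = 0.0, 1e-3
      for iteration in range(50):
          point = zero_d_code(p_rad)
          p_rad_new = physics_platform(point)
          if abs(p_rad_new - p_rad) < tol:   # converged: data are consistent
              break
          p_rad = p_rad_new
      print(f"converged after {iteration + 1} iterations, P_rad = {p_rad:.3f}")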

  9. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it to a network context. The method consists of adapting the information rate for each receiving node according to its channel status and independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of a slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information theoretical analysis of such an approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.
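
    The core idea, adapting each sink's information rate to its own channel state, can be illustrated in a few lines. The Python sketch below assigns each receiving node a rate from the Shannon capacity of its link; the SNR values are invented and the capacity rule is only an illustrative stand-in for the scheme's actual rate selection.

      import math

      # Sketch of the per-receiver rate adaptation idea: each sink is served at
      # a rate matched to its own channel state, independently of the others.
      sinks = {"A": 3.0, "B": 10.0, "C": 0.8}   # linear SNR per receiving node

      for node, snr in sinks.items():
          rate = math.log2(1.0 + snr)           # bits/s/Hz achievable on that link
          print(f"sink {node}: SNR = {snr:4.1f}  ->  information rate ~ {rate:.2f} b/s/Hz")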

  10. Code of Practice containing definitions for Safety Codes of Practice for nuclear power plants

    International Nuclear Information System (INIS)

    1979-01-01

    This Code provides definitions of the technical terms used in the licensing applications to be submitted to the Turkish Atomic Energy Commission (TAEC), in accordance with national licensing regulations. The Code is based mainly on the International Atomic Energy Agency's Code of Practice on the subject. (NEA) [fr

  11. The Minimum Distance of Graph Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2011-01-01

    We study codes constructed from graphs where the code symbols are associated with the edges and the symbols connected to a given vertex are restricted to be codewords in a component code. In particular we treat such codes from bipartite expander graphs coming from Euclidean planes and other geometries. We give results on the minimum distances of the codes.
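
    For small instances, the minimum distance such results are about can be checked directly by enumerating codewords. The Python sketch below does this brute-force check for any binary linear code given by a generator matrix, using the [7,4] Hamming code as a stand-in example (the graph-based constructions of the paper are far larger than this approach can handle).

      import itertools
      import numpy as np

      def minimum_distance(G):
          """Brute-force minimum distance of a binary linear code with generator
          matrix G (k x n). Only practical for small k."""
          k, _ = G.shape
          best = None
          for message in itertools.product((0, 1), repeat=k):
              if not any(message):
                  continue                      # skip the all-zero codeword
              codeword = np.mod(np.dot(message, G), 2)
              weight = int(codeword.sum())
              best = weight if best is None else min(best, weight)
          return best

      # Example: the [7,4] Hamming code, minimum distance 3.
      G = np.array([[1, 0, 0, 0, 1, 1, 0],
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])
      print(minimum_distance(G))  # -> 3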

  12. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  13. Quasi-cyclic unit memory convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Paaske, Erik; Ballan, Mark

    1990-01-01

    Unit memory convolutional codes with generator matrices, which are composed of circulant submatrices, are introduced. This structure facilitates the analysis of, and efficient search for, good codes. Equivalences among such codes and some of the basic structural properties are discussed. In particular, catastrophic encoders and minimal encoders are characterized and dual codes are treated. Further, various distance measures are discussed, and a number of good codes, some of which result from efficient computer search and some of which result from known block codes, are presented.
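
    The circulant building block mentioned above is simple to construct: every row is a cyclic shift of the first one. The Python sketch below builds such submatrices and stacks two of them side by side into a toy quasi-cyclic generator block; the first rows are arbitrary choices, not codes from the paper.

      import numpy as np

      def circulant(first_row):
          """Build a binary circulant matrix from its first row: each subsequent
          row is a cyclic right shift of the previous one."""
          first_row = np.asarray(first_row)
          n = len(first_row)
          return np.array([np.roll(first_row, shift) for shift in range(n)])

      # A (partial) generator matrix composed of circulant submatrices, as in
      # the quasi-cyclic structure discussed above.
      G0 = np.hstack([circulant([1, 0, 1, 0, 0]), circulant([1, 1, 0, 0, 0])])
      print(G0)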

  14. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  15. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, and simulated and tested QR code applications for medical information uses, including scanning QR code labels, URLs and authentication. Our results show possible applications for QR codes in medicine.
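
    Generating such a label takes only a couple of lines with the widely used third-party qrcode package (assumed installed together with Pillow); the URL below is a placeholder, not one of the tools from the study.

      import qrcode  # third-party package (pip install qrcode[pil]); assumed available

      # Encode a hypothetical link to a patient-information resource as a QR label.
      data = "https://example.org/med-info/record/12345"  # placeholder URL
      img = qrcode.make(data)          # returns a PIL image of the QR symbol
      img.save("med_info_label.png")   # print and scan with any QR reader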

  16. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    Science.gov (United States)

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  17. ESCADRE and ICARE code systems

    International Nuclear Information System (INIS)

    Reocreux, M.; Gauvain, J.

    1992-01-01

    The French severe accident code development program is following two parallel approaches: the first one deals with 'integral codes', which are designed to give immediate engineering answers; the second one follows a more mechanistic way in order to have the capability of detailed analysis of experiments, to get a better understanding of the scaling problem and to reach better confidence in plant calculations. In the first approach a complete system has been developed and is being used for practical cases: this is the ESCADRE system. In the second approach, a set of codes dealing first with the primary circuit is being developed: a mechanistic core degradation code, ICARE, has been issued and is being coupled with the advanced thermalhydraulic code CATHARE. Fission product codes have also been coupled to CATHARE. The 'integral' ESCADRE system and the mechanistic ICARE and associated codes are described. Their main characteristics are reviewed and the status of their development and assessment given. Future studies are finally discussed. 36 refs, 4 figs, 1 tab

  18. Clean Code - Why you should care

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Writing code is communication, not solely with the computer that executes it, but also with other developers and with oneself. A developer spends a lot of his working time reading and understanding code that was written by other developers or by himself in the past. The readability of the code is an important factor in the time needed to find a bug or add new functionality, which in turn has a big impact on productivity. Code that is difficult to understand, hard to maintain and refactor, and offers many spots for bugs to hide is not considered to be "clean code". But what can be considered "clean code", and what are the advantages of a strict application of its guidelines? In this presentation we will take a look at some typical "code smells" and proposed guidelines to improve your coding skills and write cleaner code that is less bug prone and easier to maintain.

  19. Burnup code for fuel assembly by Monte Carlo code. MKENO-BURN

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Suyama, Kenya; Masukawa, Fumihiro; Matsumoto, Kiyoshi; Kurosawa, Masayoshi; Kaneko, Toshiyuki.

    1996-12-01

    The evaluation of the neutron spectrum is very important for burnup calculations of heterogeneous geometries such as recent BWR fuel assemblies. MKENO-BURN is a multi-dimensional burnup code based on the three-dimensional Monte Carlo neutron transport code 'MULTI-KENO' and the burnup calculation routine of the one-dimensional burnup code 'UNITBURN'. MKENO-BURN analyzes the burnup of arbitrary regions after evaluating the neutron spectrum and generating one-group cross sections in three-dimensional geometry with MULTI-KENO. It enables us to perform three-dimensional burnup calculations. This report consists of a general description of MKENO-BURN and the input data. (author)
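
    Once the transport step has collapsed the spectrum to one-group cross sections, the depletion step itself reduces to integrating dN/dt = A N over the burnup interval. The Python sketch below does this with a matrix exponential for a hypothetical two-nuclide chain; the cross sections, flux and chain are illustrative numbers, not MKENO-BURN data.

      import numpy as np
      from scipy.linalg import expm

      # One-group depletion step for a tiny, invented two-nuclide chain:
      # nuclide 0 absorbs and (with yield 1 here) feeds nuclide 1, which only absorbs.
      sigma_a = np.array([2.0e-22, 6.0e-22])   # one-group absorption cross sections [cm^2]
      phi = 3.0e13                             # region-averaged flux [n/cm^2/s]
      A = np.array([[-sigma_a[0] * phi, 0.0],
                    [ sigma_a[0] * phi, -sigma_a[1] * phi]])

      N0 = np.array([1.0e22, 0.0])             # initial number densities [1/cm^3]
      dt = 30 * 24 * 3600.0                    # one 30-day burnup step [s]
      N = expm(A * dt) @ N0                    # densities at the end of the step
      print(N)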

  20. Burnup code for fuel assembly by Monte Carlo code. MKENO-BURN

    Energy Technology Data Exchange (ETDEWEB)

    Naito, Yoshitaka; Suyama, Kenya; Masukawa, Fumihiro; Matsumoto, Kiyoshi; Kurosawa, Masayoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Toshiyuki

    1996-12-01

    The evaluation of the neutron spectrum is very important for burnup calculations of heterogeneous geometries such as recent BWR fuel assemblies. MKENO-BURN is a multi-dimensional burnup code based on the three-dimensional Monte Carlo neutron transport code 'MULTI-KENO' and the burnup calculation routine of the one-dimensional burnup code 'UNITBURN'. MKENO-BURN analyzes the burnup of arbitrary regions after evaluating the neutron spectrum and generating one-group cross sections in three-dimensional geometry with MULTI-KENO. It enables us to perform three-dimensional burnup calculations. This report consists of a general description of MKENO-BURN and the input data. (author)

  1. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  2. An implicit Smooth Particle Hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Charles E. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-05-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian, meshless and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix. It uses sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, which include a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. For the single jet of gas it has been demonstrated that the implicit code can do a problem in much shorter time than the explicit code. The problem was, however, very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
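
    The Newton-Krylov pattern described here (a residual function handed to a Krylov-accelerated Newton solver) is available off the shelf in SciPy. The Python sketch below applies it to a single backward-Euler step of a trivial nonlinear problem, purely to show the pattern; it has nothing to do with the SPH equations themselves.

      import numpy as np
      from scipy.optimize import newton_krylov

      # One backward-Euler step of du/dt = -u**3 on a small "grid", solved
      # implicitly by handing the residual to a Newton-Krylov solver.
      u_old = np.linspace(0.5, 2.0, 8)
      dt = 0.1

      def residual(u_new):
          # Backward Euler: u_new - u_old + dt * u_new**3 = 0
          return u_new - u_old + dt * u_new**3

      u_new = newton_krylov(residual, u_old.copy(), f_tol=1e-10)
      print(u_new)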

  3. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  4. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  5. Polarization diversity scheme on spectral polarization coding optical code-division multiple-access network

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chang, Yao-Tang; Chen, Bo-Hau

    2010-12-01

    We present an experiment demonstrating a spectral-polarization coding optical code-division multiple-access system operated under nonideal state of polarization (SOP) matching conditions. In the proposed system, the encoding and double balanced-detection processes are implemented using a polarization-diversity scheme. Because of the quasiorthogonality of Hadamard codes combined with array waveguide grating routers and a polarization beam splitter, the proposed codec pair can encode-decode multiple code words of the Hadamard code while retaining the ability for multiple-access interference cancellation. The experimental results demonstrate that when the system is maintained with an orthogonal SOP for each user, an effective reduction in the phase-induced intensity noise is obtained. The analytical SNR values are found to overstate the experimental results by around 2 dB when the received effective power is large. This is mainly attributed to insertion losses of components and a nonflattened optical light source. Furthermore, the matching conditions can be improved by decreasing nonideal influences.
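
    The quasiorthogonality the system relies on is easy to verify numerically: after dropping the all-ones row, every pair of unipolar Hadamard code words overlaps in exactly half of its marked chips, which is what the balanced detection cancels. A minimal Python check, with the order 8 chosen arbitrarily:

      import numpy as np
      from scipy.linalg import hadamard

      H = hadamard(8)                    # +/-1 Hadamard matrix of order 8
      codes = (H[1:] + 1) // 2           # unipolar (0/1) code words, all-ones row dropped
      auto = codes[0] @ codes[0]         # code weight
      cross = codes[0] @ codes[1]        # in-phase cross correlation
      print(f"weight = {auto}, cross correlation = {cross}")  # -> 4 and 2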

  6. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that the authors will be a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  7. Tokamak plasma power balance calculation code (TPC code) outline and operation manual

    International Nuclear Information System (INIS)

    Fujieda, Hirobumi; Murakami, Yoshiki; Sugihara, Masayoshi.

    1992-11-01

    This report is a detailed description on the TPC code, that calculates the power balance of a tokamak plasma according to the ITER guidelines. The TPC code works on a personal computer (Macintosh or J-3100/ IBM-PC). Using input data such as the plasma shape, toroidal magnetic field, plasma current, electron temperature, electron density, impurities and heating power, TPC code can determine the operation point of the fusion reactor (Ion temperature is assumed to be equal to the electron temperature). Supplied flux (Volt · sec) and burn time are also estimated by coil design parameters. Calculated energy confinement time is compared with various L-mode scaling laws and the confinement enhancement factor (H-factor) is evaluated. Divertor heat load is predicted by using simple scaling models (constant-χ, Bohm-type-χ and JT-60U empirical scaling models). Frequently used data can be stored in a 'device file' and used as the default values. TPC code can generate 2-D mesh data and the POPCON plot is drawn by a contour line plotting program (CONPLT). The operation manual about CONPLT code is also described. (author)

  8. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes, aiming to line up analysis codes for heat-transferring flow and control characteristics, with HTGR plants as the main object. In order to model the flow when shock waves propagate into heating tubes, SALE-3D, which can analyze a complex system, was developed, and it is reported in this paper. Concerning the analysis code for control characteristics, the method of sensitivity analysis in a topological space, including an example of its application, is reported. The flow analysis code SALE-3D analyzes the flow of a compressible viscous fluid in a three-dimensional system over the velocity range from the incompressibility limit to supersonic velocity. The fundamental equations and fundamental algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings, and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis was added to the analysis code for control characteristics in a topological space, and blow-down phenomena were analyzed by its application. (Kako, I.)

  9. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
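
    The 'dynamic benchmarking' idea, archived cases re-run and checked against stored results every time the code is updated, can be expressed as a small regression harness. The Python sketch below shows one plausible shape for such a check; the driver callable, curve name and 2% tolerance are assumptions for illustration, not MAAP4 internals.

      import numpy as np

      # Sketch of a dynamic benchmark check: an archived transient is re-run and
      # compared against the stored reference curve whenever the code changes.
      def check_benchmark(run_case, reference_curve, tolerance=0.02):
          """run_case() re-runs one archived transient and returns the same curve
          (e.g. peak clad temperature vs. time) that was stored as the benchmark."""
          result = np.asarray(run_case())
          reference = np.asarray(reference_curve)
          drift = np.max(np.abs(result - reference) / np.abs(reference))
          if drift >= tolerance:
              raise AssertionError(f"benchmark drifted by {drift:.1%}")
          return drift

      # Toy usage with a fake transient standing in for a real archived case.
      stored = [900.0, 1150.0, 1300.0, 1210.0]
      print(check_benchmark(lambda: [901.0, 1149.0, 1302.0, 1212.0], stored))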

  10. A new two dimensional spectral/spatial multi-diagonal code for noncoherent optical code division multiple access (OCDMA) systems

    Science.gov (United States)

    Kadhim, Rasim Azeez; Fadhil, Hilal Adnan; Aljunid, S. A.; Razalli, Mohamad Shahrazel

    2014-10-01

    A new two-dimensional code family, namely two-dimensional multi-diagonal (2D-MD) codes, is proposed for spectral/spatial non-coherent OCDMA systems based on the one-dimensional MD code. Since the MD code has the property of zero cross correlation, the proposed 2D-MD code also has this property, so that multiple-access interference (MAI) is fully eliminated and the phase induced intensity noise (PIIN) is suppressed with the proposed code. Code performance is analyzed in terms of bit error rate (BER) while considering the effect of shot noise, PIIN, and thermal noise. The performance of the proposed code is compared with the related MD, modified quadratic congruence (MQC), two-dimensional perfect difference (2D-PD) and two-dimensional diluted perfect difference (2D-DPD) codes. The analytical and simulation results reveal that the proposed 2D-MD code outperforms the other codes. Moreover, a large number of simultaneous users can be accommodated at low BER and high data rate.
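
    BER figures in such analyses are typically obtained from a Gaussian approximation of the decision statistics, commonly via BER = (1/2) erfc(sqrt(SNR/8)) once the SNR (including shot, PIIN and thermal noise terms) has been computed. The short Python sketch below evaluates that commonly used relation for a few illustrative SNR values; it does not reproduce the paper's full noise model.

      import math

      def ber_from_snr(snr):
          """Gaussian-approximation BER commonly used in SAC-OCDMA analyses:
          BER = (1/2) * erfc( sqrt(SNR / 8) )."""
          return 0.5 * math.erfc(math.sqrt(snr / 8.0))

      for snr_db in (20, 23, 26):
          snr = 10 ** (snr_db / 10.0)
          print(f"SNR = {snr_db} dB  ->  BER ~ {ber_from_snr(snr):.2e}")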

  11. RH-TRU Waste Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2007-07-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  12. Remote-Handled Transuranic Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2006-12-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  13. Optical code division multiple access secure communications systems with rapid reconfigurable polarization shift key user code

    Science.gov (United States)

    Gao, Kaiqiang; Wu, Chongqing; Sheng, Xinzhi; Shang, Chao; Liu, Lanlan; Wang, Jian

    2015-09-01

    An optical code division multiple access (OCDMA) secure communications system scheme with a rapidly reconfigurable polarization shift key (Pol-SK) bipolar user code is proposed and demonstrated. Compared to fixed-code OCDMA, constantly changing the user code greatly improves the anti-eavesdropping performance. A Pol-SK OCDMA experiment with a 10 Gchip/s user code and a 1.25 Gb/s payload data rate has been realized, which means this scheme has good tolerance and could be easily implemented.

  14. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  15. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of this description, gives detailed explanations of operation method of FEMAXI-7 code and its related codes, methods of Input/Output, methods of source code modification, features of subroutine modules, and internal variables in a specific manner in order to facilitate users to perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  16. Developments of HTGR thermofluid dynamic analysis codes and HTGR plant dynamic simulation code

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1983-01-01

    In nuclear power plants as well as high temperature gas-cooled reactor plants, design is mostly performed on the basis of results obtained after the plant characteristics have been grasped through numerical simulation with analysis codes. At Kawasaki Heavy Industries Ltd., on the basis of the system engineering experience accumulated with gas-cooled reactors over several years, the preparation and systematization of analysis codes have been advanced, aiming to line up analysis codes for heat-transferring flow and control characteristics, with HTGR plants as the main object. In this report, a part of the results is described. An example is reported of applying the two-dimensional compressible flow analysis codes SOLA-VOF and SALE-2D, which were developed by Los Alamos National Laboratory in the USA and modified for use at Kawasaki, to an HTGR system. In addition, Kawasaki has developed the control characteristics analysis code DYSCO, with which the system composition can be changed easily and high versatility is available. The outline, fundamental equations, fundamental algorithms and examples of application of SOLA-VOF and SALE-2D, the present status of system characteristic simulation codes and the outline of DYSCO are described. (Kako, I.)

  17. Interrelations of codes in human semiotic systems.

    OpenAIRE

    Somov, Georgij

    2016-01-01

    Codes can be viewed as mechanisms that enable relations of signs and their components, i.e., semiosis is actualized. The combinations of these relations produce new relations as new codes are building over other codes. Structures appear in the mechanisms of codes. Hence, codes can be described as transformations of structures from some material systems into others. Structures belong to different carriers, but exist in codes in their "pure" form. Building of codes over other codes fosters t...

  18. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis by referring to the bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code, when used with the SDD technique, provides high transmission capacity, reduces the receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  19. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Velkov, K. [GRS, Garching (Germany); Lizorkin, M. [Kurchatov-Institute, Moscow (Russian Federation)] [and others

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER- and LWR-reactors is presented. After describing the basic features of the 3D neutronic codes BIPR-8 from Kurchatov-Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of coupled codes for different transient and accident scenarios are presented. The need of further investigations is discussed.

  20. R-matrix analysis code (RAC)

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Qi Huiquan

    1990-01-01

    A comprehensive R-matrix analysis code has been developed. It is based on the multichannel and multilevel R-matrix theory and runs on a VAX computer with FORTRAN-77. With this code, many kinds of experimental data for one nuclear system can be fitted simultaneously. Comparisons between the code RAC and the code EDA of LANL were made. The data show that both codes produce the same calculation results when one set of R-matrix parameters is used. The differential cross section of 10B(n,α)7Li for En = 0.4 MeV and the polarization of 16O(n,n)16O for En = 2.56 MeV are presented

  1. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  2. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  3. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.; Bensmail, H.; Yao, N.; Gao, Xin

    2013-01-01

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.
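
    As a point of reference for the discriminative variant described above, plain unsupervised sparse coding is readily available in scikit-learn. The Python sketch below learns a codebook and sparse codes without using any class labels (which is exactly the limitation the paper addresses); the data are random placeholders.

      import numpy as np
      from sklearn.decomposition import DictionaryLearning, sparse_encode

      # Unsupervised sparse coding: the codebook below ignores class labels,
      # in contrast to the class-conditioned codebooks proposed in the paper.
      rng = np.random.RandomState(0)
      X = rng.randn(60, 16)                      # 60 samples, 16 features

      dico = DictionaryLearning(n_components=8, alpha=1.0, max_iter=200,
                                random_state=0).fit(X)
      codes = sparse_encode(X, dico.components_, algorithm="omp",
                            n_nonzero_coefs=3)   # sparse codes, 3 atoms per sample
      print(codes.shape)                         # (60, 8)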

  4. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas

    2011-08-01

    Fixed-rate and rateless channel codes are generally treated separately in the related research literature and so a novice in the field inevitably gets the impression that these channel codes are unrelated. By contrast, in this treatise, we endeavor to further develop a link between the traditional fixed-rate codes and the recently developed rateless codes by delving into their underlying attributes. This joint treatment is beneficial for two principal reasons. First, it facilitates the task of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well-understood code-design tools, originally contrived for fixed-rate codes, to the realm of rateless codes. Indeed, these versatile tools proved to be vital in the design of diverse fixed-rate-coded communications systems, and thus our hope is that they will further elucidate the associated performance ramifications of the rateless coded schemes. © 2011 IEEE.
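
    As a minimal, generic illustration of what "rateless" means in practice (this is not a construction from the paper), a fountain-style encoder can emit an unbounded stream of coded packets, each the XOR of a randomly chosen subset of source symbols; the uniform degree choice below is a crude placeholder rather than a robust soliton distribution.

    # Minimal fountain-style (rateless) encoder sketch; the uniform degree
    # law is a crude placeholder, not a robust soliton distribution.
    import random

    def rateless_stream(source_symbols, seed=0):
        """Yield an unbounded stream of (neighbor_indices, xor_value) packets."""
        rng = random.Random(seed)
        k = len(source_symbols)
        while True:
            degree = rng.randint(1, k)                # placeholder degree law
            neighbors = rng.sample(range(k), degree)  # source symbols combined
            value = 0
            for i in neighbors:
                value ^= source_symbols[i]            # XOR of the chosen symbols
            yield neighbors, value

    data = [0x12, 0x34, 0x56, 0x78]
    stream = rateless_stream(data)
    packets = [next(stream) for _ in range(6)]        # take as many as the channel needs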

  5. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding) learn codebooks and codes in an unsupervised manner and neglect the class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutation identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. © 2013 The Authors. All rights reserved.

  6. A class of Sudan-decodable codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2000-01-01

    In this article, Sudan's algorithm is modified into an efficient method to list-decode a class of codes which can be seen as a generalization of Reed-Solomon codes. The algorithm is specialized into a very efficient method for unique decoding. The code construction can be generalized based on algebraic-geometry codes, and the decoding algorithms are generalized accordingly. Comparisons with Reed-Solomon and Hermitian codes are made.

  7. Light-water reactor safety analysis codes

    International Nuclear Information System (INIS)

    Jackson, J.F.; Ransom, V.H.; Ybarrondo, L.J.; Liles, D.R.

    1980-01-01

    A brief review of the evolution of light-water reactor safety analysis codes is presented. Included is a summary comparison of the technical capabilities of major system codes. Three recent codes are described in more detail to serve as examples of currently used techniques. Example comparisons between calculated results using these codes and experimental data are given. Finally, a brief evaluation of current code capability and future development trends is presented

  8. LFSC - Linac Feedback Simulation Code

    International Nuclear Information System (INIS)

    Ivanov, Valentin; Fermilab

    2008-01-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulation of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output

  9. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
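
    A schematic of the circulant (second-stage) lifting step might look as follows; the base matrix and shift values below are arbitrary illustrations, not those produced by the described design procedure.

    # Illustrative circulant lifting of a protograph base matrix; shifts are
    # arbitrary here, not those chosen by the described two-stage design.
    import numpy as np

    def circulant_permutation(size, shift):
        """Identity matrix with columns cyclically shifted by `shift`."""
        return np.roll(np.eye(size, dtype=int), shift, axis=1)

    def lift_protograph(base, lift_size, shifts):
        """Replace each edge (1-entry) of `base` with a Z x Z circulant block."""
        rows, cols = base.shape
        H = np.zeros((rows * lift_size, cols * lift_size), dtype=int)
        for r in range(rows):
            for c in range(cols):
                if base[r, c]:
                    block = circulant_permutation(lift_size, shifts[r][c])
                    H[r*lift_size:(r+1)*lift_size,
                      c*lift_size:(c+1)*lift_size] = block
        return H

    base = np.array([[1, 1, 1, 0],      # tiny example protograph (no parallel edges)
                     [0, 1, 1, 1]])
    shifts = [[0, 1, 2, 0],
              [0, 3, 1, 2]]
    H = lift_protograph(base, lift_size=4, shifts=shifts)   # 8 x 16 parity-check matrix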

  10. The CORSYS neutronics code system

    International Nuclear Information System (INIS)

    Caner, M.; Krumbein, A.D.; Saphier, D.; Shapira, M.

    1994-01-01

    The purpose of this work is to assemble a code package for LWR core physics including coupled neutronics, burnup and thermal hydraulics. The CORSYS system is built around the cell code WIMS (for group microscopic cross section calculations) and the 3-dimensional diffusion code CITATION (for burnup and fuel management). We are implementing such a system on an IBM RS-6000 workstation. The code was tested with a simplified model of the Zion Unit 2 PWR. (authors). 6 refs., 8 figs., 1 tabs

  11. WWER reactor physics code applications

    International Nuclear Information System (INIS)

    Gado, J.; Kereszturi, A.; Gacs, A.; Telbisz, M.

    1994-01-01

    The coupled steady-state reactor physics and thermohydraulic code system KARATE has been developed and applied for WWER-1000 and WWER-440 operational calculations. The 3D coupled kinetic code KIKO3D has been developed and validated for WWER-440 accident analysis applications. The coupled kinetic code SMARTA developed by VTT Helsinki has been applied for WWER-440 accident analysis. The paper gives a summary of the experience in code development and application. (authors). 10 refs., 2 tabs., 5 figs

  12. The LIONS code (version 1.0)

    International Nuclear Information System (INIS)

    Bertrand, P.

    1993-01-01

    The new LIONS code (Lancement d'IONS or Ion Launching), a dynamical code implemented in the SPIRaL project for the CIME cyclotron studies, is presented. The software involves a 3D magnetostatic code, 2D or 3D electrostatic codes for the generation of realistic field maps, and several dynamical codes for studying the behaviour of the reference particle from the cyclotron center up to ejection and for launching particle packets complying with given correlations. Its interactions with the other codes are described. The LIONS code, written in Fortran 90, is already used in studying the CIME cyclotron, from the center to the ejection. It is designed to be used, with minor modifications, in other contexts such as for the simulation of mass spectrometer facilities

  13. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code

    International Nuclear Information System (INIS)

    Ramsthaler, J.A.; Lime, J.F.; Sahota, M.S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A

  14. The code of ethics for nurses.

    Science.gov (United States)

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Nurses are ever-increasingly confronted with complex concerns in their practice. Codes of ethics are fundamental guidance for nursing, as for many other professions. Although there are authentic international codes of ethics for nurses, a national code provides additional assistance for clinical nurses in their complex roles in the care of patients, education, research and management of some parts of the health care system in the country. A national code can provide nurses with culturally-adapted guidance and help them to make ethical decisions more closely aligned with the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended and approved. The main concepts of the code will also be presented here. No doubt, development of the codes should be considered as an ongoing process. It is an overall responsibility to keep the codes current, updated with new progress in science and emerging challenges, and pertinent to nursing practice.

  15. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  16. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  17. The Coding Question.

    Science.gov (United States)

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Code package to analyse behavior of the WWER fuel rods in normal operation: TOPRA's code

    International Nuclear Information System (INIS)

    Scheglov, A.; Proselkov, V.

    2001-01-01

    This paper briefly describes the code package intended for analysis of WWER fuel rod characteristics. The package includes two computer codes, TOPRA-1 and TOPRA-2, for full-scale fuel rod analyses, and the MRZ and MKK codes for analyzing separate sections of fuel rods in r-z and r-φ geometry. The TOPRA codes are developed on the basis of the PIN-mod2 version and verified against experimental results obtained in the MR, MIR and Halden research reactors (in the framework of the SOFIT, FGR-2 and FUMEX experimental programs). A comparative analysis of calculation results and results from post-reactor examination of WWER-440 and WWER-1000 fuel rods is also made as additional verification of these codes. To avoid enlarging the uncertainties in fuel behavior prediction as a result of simplifying the fuel geometry, the MKK and MRZ codes are developed on the basis of the finite element method with the use of three nodal finite elements. Results obtained in the course of the code verification indicate the possibility of applying the method and the TOPRA codes for simplified engineering calculations of WWER fuel rod thermal-physical parameters. An analysis of maximum relative errors in predicting the fuel rod characteristics in the range of the accepted parameter values is also presented in the paper

  19. Nutrient intake among elderly in southern Peninsular Malaysia.

    Science.gov (United States)

    Suriah, A R; Zainorni, M; Shafawi, S; Mimie Suraya, S; Zarina, N; Wan Zainuddin, W; Zalifah, M

    1996-03-01

    Studies were conducted in selected areas in three states, namely Johor (n=117, male=55, female=62), Negeri Sembilan (n=130, male=52, female=78) and Malacca (n=97, male=33, female=64), involving free-living elderly (age range from 60 to 93 years old). Respondents were divided into three age cohort groups, that is, 60 to 69 years, 70 to 79 years and above 80 years old. Assessment of macro- and micronutrients was obtained from 24-hour diet recalls for three consecutive days. Household measurements were used to estimate the amount of food consumed. Mean energy intake for both sexes was lower than the Malaysian RDA. Mean energy intake was also found to decline with age increment. The percentage of carbohydrate from total calories is higher compared to fat and protein. No respondents were found to consume less than 1/3 RDA for protein. Although no significant difference in nutrient intake was noted among age cohort groups, there was a decline in the intake of protein, fat and carbohydrate. A significant proportion of the population studied consumed less than 2/3 RDA for vitamin A, thiamine, riboflavin, niacin and calcium. Very low intake of nutrients may lead to many health problems. Overall, mean energy intake indicates that the respondents consumed less than the Malaysian RDA for all three age cohort groups. Total mean energy intake was also found to decline with age increment for both sexes. Due to the low energy intake, a higher percentage of elderly were found consuming less than 2/3 RDA for thiamine (65%), riboflavin (63%) and niacin (90%). Other nutrients which were also consumed at less than 2/3 RDA by the respondents are vitamin A (67%) and calcium (65%). The intake of calcium, which was found to be extremely low (ranging from 277 to 303 mg), could lead to problems like osteoporosis.

  20. Quantum computation with Turaev-Viro codes

    International Nuclear Information System (INIS)

    Koenig, Robert; Kuperberg, Greg; Reichardt, Ben W.

    2010-01-01

    For a 3-manifold with triangulated boundary, the Turaev-Viro topological invariant can be interpreted as a quantum error-correcting code. The code has local stabilizers, identified by Levin and Wen, on a qudit lattice. Kitaev's toric code arises as a special case. The toric code corresponds to an abelian anyon model, and therefore requires out-of-code operations to obtain universal quantum computation. In contrast, for many categories, such as the Fibonacci category, the Turaev-Viro code realizes a non-abelian anyon model. A universal set of fault-tolerant operations can be implemented by deforming the code with local gates, in order to implement anyon braiding. We identify the anyons in the code space, and present schemes for initialization, computation and measurement. This provides a family of constructions for fault-tolerant quantum computation that are closely related to topological quantum computation, but for which the fault tolerance is implemented in software rather than coming from a physical medium.
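
    For the abelian special case mentioned above, the toric-code stabilizers take the standard textbook form (not quoted from this paper): for each vertex v and plaquette p of the qubit lattice,

    A_v = \prod_{e \ni v} X_e, \qquad B_p = \prod_{e \in \partial p} Z_e, \qquad
    \mathcal{H}_{\mathrm{code}} = \{\, |\psi\rangle : A_v|\psi\rangle = B_p|\psi\rangle = |\psi\rangle \ \ \forall v, p \,\}.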

  1. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  2. Dual Coding, Reasoning and Fallacies.

    Science.gov (United States)

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  3. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  4. EXAMINE THE RELATIONSHIP OF SOCIO-ECONOMIC STATUS (SES) WITH LEISURE TIME SPENDING OF GIRLS EMPHASIZING SPORTING ACTIVITIES

    OpenAIRE

    Bahyeh Zarei; Mozafar Yektayar

    2016-01-01

    The objective of this research was to examine the relationship between socio-economic status (SES) and leisure time spending among girls in Sanandaj city, with an emphasis on sporting activities. The research method was descriptive-correlational and was carried out as a field study. The population of the research consisted of all young girls of Sanandaj aged between 15 and 29 years old, from which 384 samples were selected using multi-stage cluster sampling. The research tools were the Godrat Nama...

  5. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  6. Emphasizing humanities in medical education: Promoting the integration of medical scientific spirit and medical humanistic spirit.

    Science.gov (United States)

    Song, Peipei; Tang, Wei

    2017-05-23

    In the era of the biological-psychological-social medicine model, an ideal of modern medicine is to enhance the humanities in medical education, to foster medical talents with humanistic spirit, and to promote the integration of scientific spirit and humanistic spirit in medicine. Throughout the United States (US), United Kingdom (UK), other Western countries, and some Asian countries like Japan, many medical universities have already integrated the learning of medical humanities in their curricula and recognized their value. In China, although medical education reform over the past decade has emphasized the topic of medical humanities to increase the professionalism of future physicians, the integration of medical humanities courses in medical universities has lagged behind the pace in Western countries. In addition, current courses in medical humanities were arbitrarily established due to a lack of organizational independence. For various reasons, such as a shortage of instructors, medical universities have failed to pay sufficient attention to medical humanities education given the urgent needs of society. The medical problems in contemporary Chinese society are not solely the purview of biomedical technology; what matters more is enhancing the humanities in medical education and fostering medical talents with humanistic spirit. Emphasizing the humanities in medical education and promoting the integration of medical scientific spirit and medical humanistic spirit have become one of the most pressing issues China must address. Greater attention should be paid to reasonable integration of humanities into the medical curriculum, creation of medical courses related to humanities and optimization of the curriculum, and actively allocating abundant teaching resources and exploring better methods of instruction.

  7. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN is also discussed. Two dimensional problems such as the DOT code are given. Finally, an overview of the Monte-Carlo methods and codes are elaborated on
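
    For reference, the one-dimensional plane-geometry form of the steady-state transport equation that codes such as ANISN discretize can be written (standard form, not quoted from the chapter) as

    \mu \frac{\partial \psi(x,\mu,E)}{\partial x} + \Sigma_t(x,E)\,\psi(x,\mu,E)
    = \int_0^{\infty}\! dE' \int_{-1}^{1}\! d\mu'\, \Sigma_s(x;\, E'\to E,\, \mu'\to\mu)\,\psi(x,\mu',E') + S(x,\mu,E),

    where ψ is the angular flux, μ the direction cosine, Σ_t and Σ_s the total and scattering cross sections, and S the source.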

  8. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  9. The 1996 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1996-01-01

    The codes are named 'the Pre-processing' codes, because they are designed to pre-process ENDF/B data, for later, further processing for use in applications. This is a modular set of computer codes, each of which reads and writes evaluated nuclear data in the ENDF/B format. Each code performs one or more independent operations on the data, as described below. These codes are designed to be computer independent, and are presently operational on every type of computer from large mainframe computer to small personal computers, such as IBM-PC and Power MAC. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)

  10. Tentative design basis new 100 Area water plant embodying a close cooling water circuit

    Energy Technology Data Exchange (ETDEWEB)

    1951-11-14

    The attached document includes a plot plan, flow diagram and delineation of basic assumptions upon which the report was developed. It summarizes the work which has been accomplished to date under RDA No. DC-6 in developing a recirculating water system to serve a new reactor. In order to proceed with the work under RDA No. DC-6 it has been necessary to make certain basic assumptions relative to the primary circuit requirements of RDA No. DC-3. These assumptions are explained in the report and are presented in the exhibits contained therein. Subsequent to the compilation of the basic report, certain additional considerations have come to the author's attention and are included in the addendum.

  11. User manual of UNF code

    International Nuclear Information System (INIS)

    Zhang Jingshang

    2001-01-01

    The UNF code (2001 version), written in FORTRAN-90, is developed for calculating fast neutron reaction data of structural materials with incident energies from about 1 keV up to 20 MeV. The code consists of the spherical optical model and the unified Hauser-Feshbach and exciton model. A manual of the UNF code is available for users. The format of the input parameter files and the output files, as well as the functions of the flags used in the UNF code, are introduced in detail, and examples of the format of the input parameter files are given

  12. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  13. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    A computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world's 'becoming' and in how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  14. Using Ethical Reasoning to Amplify the Reach and Resonance of Professional Codes of Conduct in Training Big Data Scientists.

    Science.gov (United States)

    Tractenberg, Rochelle E; Russell, Andrew J; Morgan, Gregory J; FitzGerald, Kevin T; Collmann, Jeff; Vinsel, Lee; Steinmann, Michael; Dolling, Lisa M

    2015-12-01

    The use of Big Data--however the term is defined--involves a wide array of issues and stakeholders, thereby increasing numbers of complex decisions around issues including data acquisition, use, and sharing. Big Data is becoming a significant component of practice in an ever-increasing range of disciplines; however, since it is not a coherent "discipline" itself, specific codes of conduct for Big Data users and researchers do not exist. While many institutions have created, or will create, training opportunities (e.g., degree programs, workshops) to prepare people to work in and around Big Data, insufficient time, space, and thought have been dedicated to training these people to engage with the ethical, legal, and social issues in this new domain. Since Big Data practitioners come from, and work in, diverse contexts, neither a relevant professional code of conduct nor specific formal ethics training are likely to be readily available. This normative paper describes an approach to conceptualizing ethical reasoning and integrating it into training for Big Data use and research. Our approach is based on a published framework that emphasizes ethical reasoning rather than topical knowledge. We describe the formation of professional community norms from two key disciplines that contribute to the emergent field of Big Data: computer science and statistics. Historical analogies from these professions suggest strategies for introducing trainees and orienting practitioners both to ethical reasoning and to a code of professional conduct itself. We include two semester course syllabi to strengthen our thesis that codes of conduct (including and beyond those we describe) can be harnessed to support the development of ethical reasoning in, and a sense of professional identity among, Big Data practitioners.

  15. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  16. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy machinery of algebraic geometry.
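
    As a reminder of the construction referred to here (a standard definition from the evaluation-code framework, not quoted from the paper): given points P_1, ..., P_n and functions f_1, f_2, ... ordered by an order (weight) function, the evaluation map and codes are

    \mathrm{ev}(f) = \bigl(f(P_1), \dots, f(P_n)\bigr) \in \mathbb{F}_q^{\,n}, \qquad
    E_l = \mathrm{span}\,\{\mathrm{ev}(f_1), \dots, \mathrm{ev}(f_l)\},

    and generalized Reed-Muller and one-point algebraic geometry codes arise from particular choices of the functions f_i and the points P_j.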

  17. New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin

    2017-03-01

    This paper concentrates on construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over finite field 𝔽q2 (q ≥ 3 is an odd prime power). By a careful analysis on properties of cyclotomic cosets in defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given according to Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in previous literature.
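
    The standard Hermitian construction underlying results of this kind (a textbook statement, not a claim specific to this paper) is: if C is an [n, k, d]_{q^2} linear code with C^{\perp_H} \subseteq C (Hermitian dual-containing), then there exists a q-ary quantum code with parameters

    [[\, n,\; 2k - n,\; \ge d \,]]_q .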

  18. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made on pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflicts with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report

  19. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  20. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  1. User's manual for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the User's Manual for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to make it easier to interpret moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, also can be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code then will interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data
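
    A minimal sketch of the interpolate-and-match idea described above follows; the grid axes, benchmark values, detector names and the least-squares matching criterion are illustrative assumptions, not the actual TMAD library or algorithm.

    # Illustrative interpolate-and-match sketch; the grids, benchmark values
    # and matching criterion are assumptions, not the TMAD implementation.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    anomaly_size = np.array([0.0, 5.0, 10.0, 20.0])    # assumed axis (arbitrary units)
    moisture     = np.array([0.0, 2.0, 5.0, 10.0])     # assumed axis (wt%)

    # One benchmark response table per detector: response[size_i, moisture_j].
    rng = np.random.default_rng(0)                      # placeholder benchmark data
    library = {
        "detector_A": RegularGridInterpolator((anomaly_size, moisture), rng.random((4, 4))),
        "detector_B": RegularGridInterpolator((anomaly_size, moisture), rng.random((4, 4))),
    }

    def best_match(measured, n_grid=101):
        """Find the (size, moisture) pair whose interpolated responses best fit `measured`."""
        sizes = np.linspace(anomaly_size[0], anomaly_size[-1], n_grid)
        moistures = np.linspace(moisture[0], moisture[-1], n_grid)
        best, best_err = None, np.inf
        for s in sizes:
            for m in moistures:
                err = sum((library[d]([s, m])[0] - r) ** 2 for d, r in measured.items())
                if err < best_err:
                    best, best_err = (s, m), err
        return best

    size, moisture_content = best_match({"detector_A": 0.4, "detector_B": 0.7})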

  2. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  3. Differentially Encoded LDPC Codes—Part II: General Case and Code Optimization

    Directory of Open Access Journals (Sweden)

    Jing Li (Tiffany

    2008-04-01

    Full Text Available This two-part series of papers studies the theory and practice of differentially encoded low-density parity-check (DE-LDPC) codes, especially in the context of noncoherent detection. Part I showed that a special class of DE-LDPC codes, product accumulate codes, perform very well with both coherent and noncoherent detection. The analysis here reveals that a conventional LDPC code, however, is not well suited to differential coding and does not, in general, deliver a desirable performance when detected noncoherently. Through extrinsic information transfer (EXIT) analysis and a modified “convergence-constraint” density evolution (DE) method developed here, we provide a characterization of the type of LDPC degree profiles that work in harmony with differential detection (or a recursive inner code in general), and demonstrate how to optimize these LDPC codes. The convergence-constraint method provides a useful extension to the conventional “threshold-constraint” method, and can match an outer LDPC code to any given inner code with the imperfectness of the inner decoder taken into consideration.

  4. GAMERA - The New Magnetospheric Code

    Science.gov (United States)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project GAMERA, Grid Agnostic MHD for Extended Research Applications, has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure - multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter and Saturn. We present example results for the Earth's magnetosphere including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  5. Effects of maternal micronutrient supplementation on fetal loss and under-2-years child mortality

    DEFF Research Database (Denmark)

    Andersen, Gregers Stig; Friis, Henrik; Michaelsen, Kim F.

    2010-01-01

    mortality, as compared with iron-folic acid supplementation among 2,100 pregnant women in Guinea-Bissau. Women receiving a 1xRDA MMS preparation (consisting of 14 vitamins and minerals) had a marginally reduced risk of fetal loss (Relative risk (RR) 0.65, 95% CI 0.40; 1.05), and women receiving a 2xRDA MMS...

  6. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated; each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  7. Optimal Codes for the Burst Erasure Channel

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
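
    The block-interleaving argument can be illustrated with a single parity check (SPC) code: if D codewords are interleaved column-wise, a burst of up to D consecutive erasures removes at most one symbol from each codeword, which an SPC code recovers by XOR. The sketch below is a generic illustration with arbitrary parameters, not code or parameters from the report.

    # Generic illustration of block-interleaved SPC burst-erasure recovery;
    # parameters are arbitrary, not taken from the report.
    import random

    K, D = 7, 5                      # K data symbols per SPC codeword, interleaver depth D
    random.seed(1)
    codewords = []
    for _ in range(D):
        data = [random.randrange(256) for _ in range(K)]
        parity = 0
        for s in data:
            parity ^= s
        codewords.append(data + [parity])        # SPC codeword of length K + 1

    # Interleave column-wise: transmit symbol j of every codeword before symbol j + 1.
    stream = [codewords[i][j] for j in range(K + 1) for i in range(D)]

    # Erase a burst of D consecutive symbols -> at most one erasure per codeword.
    burst_start = 9
    received = [None if burst_start <= t < burst_start + D else s
                for t, s in enumerate(stream)]

    # De-interleave and repair each codeword with its single parity check.
    for i in range(D):
        cw = [received[j * D + i] for j in range(K + 1)]
        if cw.count(None) == 1:
            missing = cw.index(None)
            value = 0
            for s in cw:
                if s is not None:
                    value ^= s
            cw[missing] = value                  # XOR of the survivors restores the erasure
        assert cw == codewords[i]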

  8. Applications guide to the RSIC-distributed version of the MCNP code (coupled Monte Carlo neutron-photon Code)

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-09-01

    An overview of the RSIC-distributed version of the MCNP code (a coupled Monte Carlo neutron-photon code) is presented. All general features of the code, from machine hardware requirements to theoretical details, are discussed. The current nuclide cross-section and other libraries available in the standard code package are specified, and a realistic example of the flexible geometry input is given. Standard and nonstandard source, estimator, and variance-reduction procedures are outlined. Examples of correct usage and possible misuse of certain code features are presented graphically and in standard output listings. Finally, itemized summaries of sample problems, various MCNP code documentation, and future work are given

  9. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to have a more transportable code. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option to compute low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed

  10. Selection on Coding and Regulatory Variation Maintains Individuality in Major Urinary Protein Scent Marks in Wild Mice.

    Directory of Open Access Journals (Sweden)

    Michael J Sheehan

    2016-03-01

    Full Text Available Recognition of individuals by scent is widespread across animal taxa. Though animals can often discriminate chemical blends based on many compounds, recent work shows that specific protein pheromones are necessary and sufficient for individual recognition via scent marks in mice. The genetic nature of individuality in scent marks (e.g. coding versus regulatory variation and the evolutionary processes that maintain diversity are poorly understood. The individual signatures in scent marks of house mice are the protein products of a group of highly similar paralogs in the major urinary protein (Mup gene family. Using the offspring of wild-caught mice, we examine individuality in the major urinary protein (MUP scent marks at the DNA, RNA and protein levels. We show that individuality arises through a combination of variation at amino acid coding sites and differential transcription of central Mup genes across individuals, and we identify eSNPs in promoters. There is no evidence of post-transcriptional processes influencing phenotypic diversity as transcripts accurately predict the relative abundance of proteins in urine samples. The match between transcripts and urine samples taken six months earlier also emphasizes that the proportional relationships across central MUP isoforms in urine is stable. Balancing selection maintains coding variants at moderate frequencies, though pheromone diversity appears limited by interactions with vomeronasal receptors. We find that differential transcription of the central Mup paralogs within and between individuals significantly increases the individuality of pheromone blends. Balancing selection on gene regulation allows for increased individuality via combinatorial diversity in a limited number of pheromones.

  11. Zebra: An advanced PWR lattice code

    Energy Technology Data Exchange (ETDEWEB)

    Cao, L.; Wu, H.; Zheng, Y. [School of Nuclear Science and Technology, Xi' an Jiaotong Univ., No. 28, Xianning West Road, Xi' an, ShannXi, 710049 (China)

    2012-07-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong Univ. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, which is a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that this code has good precision and high efficiency. (authors)

  12. Code of Ethics for Electrical Engineers

    Science.gov (United States)

    Matsuki, Junya

    The Institute of Electrical Engineers of Japan (IEEJ) has established the rules of practice for its members recently, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the IEEJ 1998 ethical code are explained in detail compared to the other ethical codes for other fields of engineering. Secondly, the contents which shall be included in the modern code of ethics for electrical engineers are discussed. Thirdly, the newly-established rules of practice and the modified code of ethics are presented. Finally, results of questionnaires on the new ethical code and rules which were answered on May 23, 2007, by 51 electrical and electronic students of the University of Fukui are shown.

  13. Zebra: An advanced PWR lattice code

    International Nuclear Information System (INIS)

    Cao, L.; Wu, H.; Zheng, Y.

    2012-01-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong Univ. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, which is a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that this code has good precision and high efficiency. (authors)

  14. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)
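
    The underlying bookkeeping in such inventory codes is the solution of linear production/decay balance equations. A minimal sketch for a two-member decay chain under constant production is given below; it is illustrative only and is not the RIBD-II algorithm, its yield data or its nuclide library, and the daughter decay constant and production rate are placeholder assumptions.

    # Minimal two-member decay-chain inventory sketch (illustrative only; not
    # the RIBD-II algorithm, yield data or nuclide library).
    import numpy as np
    from scipy.integrate import solve_ivp

    LAMBDA_1 = np.log(2) / (8.02 * 86400)   # parent decay constant, 1/s (8.02-day half-life)
    LAMBDA_2 = np.log(2) / 2.3e6            # placeholder daughter decay constant, 1/s
    PRODUCTION_1 = 1.0e10                   # assumed constant production rate, atoms/s

    def balance(t, N):
        N1, N2 = N
        dN1 = PRODUCTION_1 - LAMBDA_1 * N1          # production minus decay
        dN2 = LAMBDA_1 * N1 - LAMBDA_2 * N2         # fed by parent decay
        return [dN1, dN2]

    t_irr = 30 * 86400                              # 30-day irradiation
    sol = solve_ivp(balance, (0.0, t_irr), [0.0, 0.0], rtol=1e-8)
    N1, N2 = sol.y[:, -1]
    activity_1 = LAMBDA_1 * N1                      # parent activity (Bq) at end of irradiation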

  15. Lattice polytopes in coding theory

    Directory of Open Access Journals (Sweden)

    Ivan Soprunov

    2015-05-01

    Full Text Available In this paper we discuss combinatorial questions about lattice polytopes motivated by recent results on minimum distance estimation for toric codes. We also include a new inductive bound for the minimum distance of generalized toric codes. As an application, we give new formulas for the minimum distance of generalized toric codes for special lattice point configurations.

  16. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample will be given, followed by a discussion of the present status and future development plans

  17. New binary linear codes which are dual transforms of good codes

    NARCIS (Netherlands)

    Jaffe, D.B.; Simonis, J.

    1999-01-01

    If C is a binary linear code, one may choose a subset S of C, and form a new code CST which is the row space of the matrix having the elements of S as its columns. One way of picking S is to choose a subgroup H of Aut(C) and let S be some H-stable subset of C. Using (primarily) this method for

  18. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coders are used to code any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
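
    The threshold-driven choice among block sizes can be sketched as a recursive split: code a block at the current size with a truncated DCT, and split into four sub-blocks whenever the reconstruction distortion exceeds a threshold. The sketch below is only a schematic of that idea; it omits the vector quantization and rate control of the actual MBC method, and all parameter values are arbitrary.

    # Schematic of threshold-driven variable-block-size DCT coding; it omits
    # the vector quantization and rate control of the actual MBC scheme.
    import numpy as np
    from scipy.fft import dctn, idctn

    def code_block(block, keep):
        """Keep only the `keep` x `keep` lowest-frequency DCT coefficients."""
        coeffs = dctn(block, norm="ortho")
        truncated = np.zeros_like(coeffs)
        truncated[:keep, :keep] = coeffs[:keep, :keep]
        return idctn(truncated, norm="ortho")

    def mixture_code(image, block, threshold, min_block=4, keep=2):
        """Recursively split blocks whose coarse DCT approximation is too poor."""
        out = np.empty_like(image, dtype=float)
        n = image.shape[0]
        for r in range(0, n, block):
            for c in range(0, n, block):
                tile = image[r:r+block, c:c+block].astype(float)
                approx = code_block(tile, keep)
                mse = np.mean((tile - approx) ** 2)
                if mse > threshold and block > min_block:
                    approx = mixture_code(tile, block // 2, threshold, min_block, keep)
                out[r:r+block, c:c+block] = approx
        return out

    image = np.random.randint(0, 256, (32, 32))
    reconstructed = mixture_code(image, block=16, threshold=50.0)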

  19. Code Package to Analyze Parameters of the WWER Fuel Rod. TOPRA-2 Code - Verification Data

    International Nuclear Information System (INIS)

    Scheglov, A.; Proselkov, V.; Passage, G.; Stefanova, S.

    2009-01-01

    Presented are the data for computer codes to analyze WWER fuel rods, used in the WWER department of RRC 'Kurchatov Institute'. Presented is the description of TOPRA-2 code intended for the engineering analysis of thermophysical and strength parameters of the WWER fuel rod - temperature distributions along the fuel radius, gas pressures under the cladding, stresses in the cladding, etc. for the reactor operation in normal conditions. Presented are some results of the code verification against test problems and the data obtained in the experimental programs. Presented are comparison results of the calculations with TOPRA-2 and TRANSURANUS (V1M1J06) codes. Results obtained in the course of verification demonstrate possibility of application of the methodology and TOPRA-2 code for the engineering analysis of the WWER fuel rods

  20. Rate-Compatible Protograph LDPC Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.