WorldWideScience

Sample records for codes fram mga

  1. The FRAM code: Description and some comparisons with MGA

    International Nuclear Information System (INIS)

    Sampson, T.E.; Kelley, T.A.

    1994-01-01

    The authors describe the initial development of the FRAM gamma-ray spectrometry code for analyzing plutonium isotopics, discuss its methodology, and present some comparisons with MGA on identical items. They also present some of the features of a new Windows 3.1-based version (PC/FRAM) and describe some current measurement problems. Development of the FRAM code began in about 1985, growing out of the need at the Los Alamos TA-55 Plutonium Facility for an isotopic analysis code to give accurate results for the effective specific power of heterogeneous (Am/Pu) pyrochemical residues. These residues present a difficult challenge because the americium is present mostly in a low-Z salt matrix (AmCl3) with fines and small pieces of plutonium metal dispersed throughout the salt. Plutonium gamma rays suffer different attenuation than americium gamma rays of the same energy; this makes conventional analysis with a single relative efficiency function inaccurate for Am/Pu ratios and affects the analysis in other subtle ways.

  2. Achievements in testing of the MGA and FRAM isotopic software codes under the DOE/NNSA-IRSN cooperation of gamma-ray isotopic measurement systems

    International Nuclear Information System (INIS)

    Vo, Duc; Wang, Tzu-Fang; Funk, Pierre; Weber, Anne-Laure; Pepin, Nicolas; Karcher, Anna

    2009-01-01

    DOE/NNSA and IRSN collaborated on a study of gamma-ray instruments and analysis methods used to perform isotopic measurements of special nuclear materials. The two agencies agreed to collaborate on the project in response to inconsistencies that were found in the various versions of software and hardware used to determine the isotopic abundances of uranium and plutonium. IRSN used software developed internally to test the MGA and FRAM isotopic analysis codes for criteria used to stop data acquisition. The stop-criterion test revealed several unusual behaviors in both the MGA and FRAM software codes.

  3. Comparison of three gamma ray isotopic determination codes: FRAM, MGA, and TRIFID

    International Nuclear Information System (INIS)

    Cremers, T.L.; Malcom, J.E.; Bonner, C.A.

    1994-01-01

    The determination of the isotopic distribution of plutonium and the americium concentration is required for the assay of nuclear material by calorimetry or neutron coincidence counting. The isotopic information is used in calorimetric assay to compute the effective specific power from the measured isotopic fractions and the known specific power of each isotope. The effective specific power is combined with the heat measurement to obtain the mass of plutonium in the assayed nuclear material. The response of neutron coincidence counters is determined by the 240Pu isotopic fraction with contributions from the other even plutonium isotopes. The effects of the 240Pu isotopic fraction and the other neutron-contributing isotopes are combined as 240Pu effective. This is used to calculate the mass of nuclear material from the neutron counting data in a manner analogous to the effective specific power in calorimetry. Comparisons of the precision and accuracy of calorimetric assay and neutron coincidence counting often focus only on the precision and accuracy of the heat measurement (calorimetry) compared to the precision and accuracy of the neutron coincidence counting statistics. The major source of uncertainty for both calorimetric assay and neutron coincidence counting often lies in the determination of the plutonium isotopic distribution as determined by gamma-ray spectroscopy. Thus, the selection of the appropriate isotopic distribution code is of paramount importance to good calorimetric assay and neutron coincidence counting. Three gamma-ray isotopic distribution codes, FRAM, MGA, and TRIFID, have been compared at the Los Alamos Plutonium Facility under carefully controlled conditions of similar count rates, count times, and 240Pu isotopic fraction.
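    The arithmetic summarized in this abstract (effective specific power for calorimetry, 240Pu-effective for coincidence counting) can be illustrated with a minimal sketch. The specific-power values below are approximate literature values and the isotopic fractions are made up for illustration; none of the numbers come from the paper.

```python
# Illustrative values only: approximate specific powers (W/g) and made-up
# isotopic mass fractions; not data from the paper.
SPECIFIC_POWER = {
    "Pu238": 0.5676, "Pu239": 0.00193, "Pu240": 0.00708,
    "Pu241": 0.00341, "Pu242": 0.00012, "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """P_eff = sum_i f_i * P_i over the measured isotopic mass fractions."""
    return sum(f * SPECIFIC_POWER[iso] for iso, f in mass_fractions.items())

def plutonium_mass_from_heat(measured_power_w, mass_fractions):
    """Calorimetric assay: m(Pu) = measured thermal power / P_eff."""
    return measured_power_w / effective_specific_power(mass_fractions)

def pu240_effective(f238, f240, f242):
    """Conventional weighting of the even isotopes for coincidence counting:
    240Pu-effective = 2.52*f(238Pu) + f(240Pu) + 1.68*f(242Pu)."""
    return 2.52 * f238 + f240 + 1.68 * f242

fractions = {"Pu238": 0.0002, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.003, "Pu242": 0.0008, "Am241": 0.002}
print(plutonium_mass_from_heat(2.5, fractions), "g Pu from 2.5 W")
print(pu240_effective(0.0002, 0.058, 0.0008), "240Pu-effective fraction")
```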

  4. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple efficiency. This software was developed at Los Alamos National Laboratory, originally for plutonium isotopic analysis. Later it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on self-calibration, using several gamma-ray peaks to determine the isotopic ratios. A versatile parameter database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type.
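    The self-calibration idea described above can be sketched as forming an atom ratio of two isotopes from peak count rates, branching ratios, half-lives, and a relative efficiency curve evaluated at each peak energy. The efficiency model and all numerical inputs below are placeholders, not FRAM's actual parameter values.

```python
import math

def decay_constant(half_life_years):
    return math.log(2) / half_life_years

def relative_efficiency(energy_kev, coeffs):
    """A simple log-polynomial relative-efficiency model (one common choice);
    the coefficients would be fitted from peaks of a single isotope."""
    x = math.log(energy_kev)
    return math.exp(sum(c * x**k for k, c in enumerate(coeffs)))

def atom_ratio(rate_i, br_i, e_i, hl_i, rate_j, br_j, e_j, hl_j, coeffs):
    """N_i/N_j = [C_i/(BR_i*eff(E_i)*lambda_i)] / [C_j/(BR_j*eff(E_j)*lambda_j)]."""
    n_i = rate_i / (br_i * relative_efficiency(e_i, coeffs) * decay_constant(hl_i))
    n_j = rate_j / (br_j * relative_efficiency(e_j, coeffs) * decay_constant(hl_j))
    return n_i / n_j

# Placeholder inputs (count rates, branching ratios, energies in keV, half-lives in years).
coeffs = [-3.0, 1.2, -0.2]
print(atom_ratio(120.0, 6.0e-5, 129.3, 2.4e4,
                 15.0, 2.0e-6, 148.6, 14.4, coeffs))
```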

  5. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dreyer, Jonathan G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wang, Tzu-Fang [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vo, Duc T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, Pierre F. [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France); Weber, Anne-Laure [Inst. for Radiological Protection and Nuclear Safety (IRSN), Fontenay-aux-Roses (France)

    2017-07-20

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  6. Varyasyong Leksikal sa mga Dayalektong Mandaya

    Directory of Open Access Journals (Sweden)

    Dr. Raymund M. Pasion

    2014-12-01

    The general aim of this study is to explore lexical variation in the Mandaya language found in the province of Davao Oriental. As a basis for collecting data, cultural livelihood terms such as farming, hunting, fishing, and animal husbandry, drawn from Indigenous Knowledge Systems and Practices (IKSP), were used. The study sought to answer the question: what lexical variations are found in the Mandaya cultural livelihood terms in the municipalities of Caraga, Manay, Baganga, and Cateel? A qualitative design was used, and indigenous and descriptive methods were employed from data collection through data analysis. The informants were selected through a combination of purposive and snowball sampling. It was found that the Mandaya language exhibits lexical variation in terms that differ in form, terms that are similar in form, and terms that are identical in form but differ in pronunciation. Nevertheless, it is believed that, owing to the geographical, psychological, and sociological factors at work in their culture, this lexical variation was inevitable.

  7. Measurement of Plutonium Isotopic Composition - MGA

    Energy Technology Data Exchange (ETDEWEB)

    Vo, Duc Ta [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-21

    In this module, we will use the Canberra InSpector-2000 Multichannel Analyzer with a high-purity germanium detector (HPGe) and the MGA isotopic analysis software to assay a variety of plutonium samples. The module provides an understanding of the MGA method, its attributes and limitations. You will assess the system performance by measuring a range of materials similar to those you may assay in your work. During the final verification exercise, the results from MGA will be combined with the 240Pueff results from neutron coincidence or multiplicity counters so that measurements of the plutonium mass can be compared with the operator-declared (certified) values.

  8. The Fram Strait integrated ocean observatory

    Science.gov (United States)

    Fahrbach, E.; Beszczynska-Möller, A.; Rettig, S.; Rohardt, G.; Sagen, H.; Sandven, S.; Hansen, E.

    2012-04-01

    A long-term oceanographic moored array has been operated since 1997 to measure the ocean water column properties and oceanic advective fluxes through Fram Strait. While the mooring line along 78°50'N is devoted to monitoring variability of the physical environment, the AWI Hausgarten observatory, located north of it, focuses on ecosystem properties and benthic biology. Under the EU DAMOCLES and ACOBAR projects, the oceanographic observatory has been extended towards an innovative integrated observing system, combining the deep ocean moorings, a multipurpose acoustic system and a network of gliders. The main aim of this system is long-term environmental monitoring in Fram Strait, combining satellite data, acoustic tomography, oceanographic measurements at moorings and glider sections with high-resolution ice-ocean circulation models through data assimilation. As a future perspective, a cable connection between the Hausgarten observatory and a land base on Svalbard is planned as the implementation of the ESONET Arctic node. To take advantage of the planned cabled node, different technologies for underwater data transmission were reviewed and partially tested under the ESONET DM AOEM. The main focus was to design and evaluate available technical solutions for collecting data from different components of the Fram Strait ocean observing system, and an integration of available data streams for optimal delivery to the future cabled node. The main components of the Fram Strait integrated observing system will be presented and the current status of available technologies for underwater data transfer will be reviewed. In the long term, an initiative of Helmholtz observatories foresees the interdisciplinary Earth-Observing-System FRAM, which combines observatories such as the long-term deep-sea ecological observatory HAUSGARTEN, the oceanographic Fram Strait integrated observing system and the Svalbard coastal stations maintained by the Norwegian ARCTOS network. A vision

  9. Recent improvements in plutonium gamma-ray analysis using MGA

    International Nuclear Information System (INIS)

    Ruhter, W.D.; Gunnink, R.

    1992-06-01

    MGA is a gamma-ray spectrum analysis program for determining relative plutonium isotopic abundances. It can determine plutonium isotopic abundances to better than 1% using a high-resolution, low-energy, planar germanium detector and measurement times of ten minutes or less. We have modified MGA to allow determination of absolute plutonium isotopic abundances in solutions. With calibration of a detector using a known solution concentration in a well-defined sample geometry, plutonium solution concentrations can be determined. MGA can also analyze a second, high-energy spectrum to determine fission product abundances relative to total plutonium. For the high-energy gamma-ray measurements we have devised a new hardware configuration in which both the low- and high-energy gamma-ray detectors are mounted in a single cryostat, thereby reducing the weight and volume of the detector systems. We describe the detector configuration and the performance of the MGA program for determining plutonium concentrations in solutions and fission product abundances.

  10. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

    The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers face tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for ordinary users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and inability to configure pipelines. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for tasks such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, and functional annotation. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  11. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Background: The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers face tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for ordinary users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and inability to configure pipelines. Results: We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for tasks such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, and functional annotation. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions: WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  12. The mGA1.0: A common LISP implementation of a messy genetic algorithm

    Science.gov (United States)

    Goldberg, David E.; Kerzic, Travis

    1990-01-01

    Genetic algorithms (GAs) are finding increased application in difficult search, optimization, and machine learning problems in science and engineering. Increasing demands are being placed on algorithm performance, and the remaining challenges of genetic algorithm theory and practice are becoming increasingly unavoidable. Perhaps the most difficult of these challenges is the so-called linkage problem. Messy GAs were created to overcome the linkage problem of simple genetic algorithms by combining variable-length strings, gene expression, messy operators, and a nonhomogeneous phasing of evolutionary processing. Results on a number of difficult deceptive test functions are encouraging, with the mGA always finding global optima in a polynomial number of function evaluations. Theoretical and empirical studies are continuing, and a first version of a messy GA is ready for testing by others. A Common LISP implementation called mGA1.0 is documented and related to the basic principles and operators developed by Goldberg et al. (1989, 1990). Although the code was prepared with care, it is not a general-purpose code, only a research version. Important data structures and global variables are described. Thereafter, brief function descriptions are given, and sample input data are presented together with sample program output. A source listing with comments is also included.
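    A toy rendition of the messy-GA ingredients named above (variable-length chromosomes of (locus, allele) pairs, gene expression against a template, and cut-and-splice recombination) is sketched below; it is not the mGA1.0 Common LISP code, and the operators are deliberately simplified.

```python
import random

def random_chromosome(n_loci, length):
    """A variable-length list of (locus, allele) pairs; loci may repeat or be absent."""
    return [(random.randrange(n_loci), random.randint(0, 1)) for _ in range(length)]

def express(chromosome, template):
    """Left-to-right gene expression: the first occurrence of a locus wins;
    unspecified loci are filled in from a competitive template."""
    bits = list(template)
    seen = set()
    for locus, allele in chromosome:
        if locus not in seen:
            bits[locus] = allele
            seen.add(locus)
    return bits

def cut_and_splice(parent_a, parent_b, p_cut=0.5):
    """Cut each parent at a random point with probability p_cut, then splice
    the resulting pieces in a shuffled order into a single child."""
    def cut(ch):
        if len(ch) > 1 and random.random() < p_cut:
            point = random.randrange(1, len(ch))
            return [ch[:point], ch[point:]]
        return [list(ch)]
    pieces = cut(parent_a) + cut(parent_b)
    random.shuffle(pieces)
    return [gene for piece in pieces for gene in piece]

parent_a = random_chromosome(8, 6)
parent_b = random_chromosome(8, 6)
child = cut_and_splice(parent_a, parent_b)
print(express(child, template=[0] * 8))
```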

  13. FRAM (FRontiers in Arctic marine Monitoring: The FRAM Ocean Observing System) planned efforts for integrated water column biogeochemistry

    Science.gov (United States)

    Nielsdóttir, Maria; Salter, Ian; Kanzow, Torsten; Boetius, Antje

    2015-04-01

    The Arctic is a region undergoing rapid environmental change and will be subject to multiple stressors in the coming decades. Reductions in sea ice concentration; warming, increased terrigenous inputs and Atlantification are all expected to exert a significant impact on the structure and function of Arctic ecosystems. The Fram Strait is a particularly important region because it acts as a gateway in the exchange of Atlantic and Arctic water masses. The logistical constraints in conducting year round biogeochemical measurements in such areas impose a significant limitation to our understanding of these complicated ecosystems. To address these important challenges the German ministry of research has funded a multi-million Euro infrastructure project (FRAM). Over the next five years FRAM will develop a remote access and autonomous sampling infrastructure to improve the temporal and spatial resolution of biogeochemical measurements in the Fram Strait and central Arctic. Here we present a summary of sampling strategies, technological innovations and biogeochemical parameters that will be addressed over the duration of the project. Specific emphasis will be placed on platforms for monitoring nutrient dynamics, carbonate chemistry, organic carbon flux and the development of a sustained microbial observatory.

  14. Mga2 transcription factor regulates an oxygen-responsive lipid homeostasis pathway in fission yeast

    DEFF Research Database (Denmark)

    Burr, Risa; Stewart, Emerson V; Shao, Wei

    2016-01-01

    Sterol regulatory element-binding protein (SREBP) transcription factors regulate lipid homeostasis. In mammals, SREBP-2 controls cholesterol biosynthesis, whereas SREBP-1 controls triacylglycerol and glycerophospholipid biosynthesis. In the fission yeast Schizosaccharomyces pombe, the SREBP-2 homolog Sre1 regulates sterol homeostasis. [...] In the absence of mga2, fission yeast exhibited growth defects under both normoxia and low oxygen conditions. Mga2 transcriptional targets were enriched for lipid metabolism genes, and mga2Δ cells showed disrupted triacylglycerol and glycerophospholipid homeostasis, most notably with an increase in fatty acid [...]

  15. A practical MGA-ARIMA model for forecasting real-time dynamic rain-induced attenuation

    Science.gov (United States)

    Gong, Shuhong; Gao, Yifeng; Shi, Houbao; Zhao, Ge

    2013-05-01

    A novel and practical modified genetic algorithm (MGA)-autoregressive integrated moving average (ARIMA) model for forecasting real-time dynamic rain-induced attenuation has been established by combining genetic algorithm ideas with the ARIMA model. It is proved that, due to the introduction of the MGA into the ARIMA(1,1,7) model, the MGA-ARIMA model has the potential to be conveniently applied in every country or area by creating a parameter database used by the ARIMA(1,1,7) model. The parameter database is given in this paper based on attenuation data measured in Xi'an, China. The methods to create the parameter databases in other countries or areas are offered as well. Based on the experimental results, the MGA-ARIMA model has been proved practical for forecasting dynamic rain-induced attenuation in real time. The novel model given in this paper is significant for developing adaptive fade mitigation technologies at millimeter wave bands.
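    A minimal sketch of the base ARIMA(1,1,7) forecasting step named above is given below, using the statsmodels library and a synthetic attenuation series; the genetic-algorithm modification and the measured Xi'an parameter database from the paper are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA  # requires statsmodels

# Synthetic stand-in for a measured rain-attenuation time series (dB).
rng = np.random.default_rng(0)
rain_attenuation_db = np.cumsum(rng.normal(0.0, 0.05, size=600)) + 3.0

# Fit the base ARIMA(1,1,7) model and forecast the next sample.
model = ARIMA(rain_attenuation_db, order=(1, 1, 7))
fitted = model.fit()
next_sample_db = fitted.forecast(steps=1)[0]
print(f"forecast attenuation for next sample: {next_sample_db:.2f} dB")
```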

  16. FRAM telescope - monitoring of atmospheric extinction and variable star photometry

    Science.gov (United States)

    Jurysek, J.; Honkova, K.; Masek, M.

    2015-02-01

    The FRAM (F/(Ph)otometric Robotic Atmospheric Monitor) telescope is a part of the Pierre Auger Observatory (PAO) located near the town of Malargüe in Argentina. The main task of the FRAM telescope is the continuous night-time monitoring of the atmospheric extinction and its wavelength dependence. The current methodology of the atmospheric extinction measurements and the properties of the instrumentation also allow simultaneous observation of other interesting astronomical targets. The current observations of the FRAM telescope are focused on the photometry of eclipsing binaries, positional refinement of minor bodies of the Solar System, and observations of optical counterparts of gamma-ray bursts. In this contribution, we briefly describe the main purpose of the FRAM telescope for the PAO and we also present its current astronomical observing program.

  17. Coordinate Regulation of Yeast Sterol Regulatory Element-binding Protein (SREBP) and Mga2 Transcription Factors.

    Science.gov (United States)

    Burr, Risa; Stewart, Emerson V; Espenshade, Peter J

    2017-03-31

    The Mga2 and Sre1 transcription factors regulate oxygen-responsive lipid homeostasis in the fission yeast Schizosaccharomyces pombe in a manner analogous to the mammalian sterol regulatory element-binding protein (SREBP)-1 and SREBP-2 transcription factors. Mga2 and SREBP-1 regulate triacylglycerol and glycerophospholipid synthesis, whereas Sre1 and SREBP-2 regulate sterol synthesis. In mammals, a shared activation mechanism allows for coordinate regulation of SREBP-1 and SREBP-2. In contrast, distinct pathways activate fission yeast Mga2 and Sre1. Therefore, it is unclear whether and how these two related pathways are coordinated to maintain lipid balance in fission yeast. Previously, we showed that Sre1 cleavage is defective in the absence of mga2. Here, we report that this defect is due to deficient unsaturated fatty acid synthesis, resulting in aberrant membrane transport. This defect is recapitulated by treatment with the fatty acid synthase inhibitor cerulenin and is rescued by addition of exogenous unsaturated fatty acids. Furthermore, sterol synthesis inhibition blocks Mga2 pathway activation. Together, these data demonstrate that Sre1 and Mga2 are each regulated by the lipid product of the other transcription factor pathway, providing a source of coordination for these two branches of lipid synthesis. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  18. Technology development for nuclear material measurement and accountability

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Jong Sook; Lee, Byung Doo; Cha, Hong Ryul; Lee, Yong Duk; Choi, Hyung Nae; Nah, Won Woo; Park, Hoh Joon; Lee, Yung Kil [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-12-01

    The measurement techniques for Pu samples and spent fuel assemblies were developed in support of the implementation of national inspection responsibility under the Atomic Energy Act promulgated in 1994, and a computer program was also developed to assess the total nuclear material balance from facility-declared records. The results of plutonium isotopic determination by gamma-ray spectrometry with a high-resolution germanium detector and peak-analysis codes (FRAM and MGA) agreed to within 1%-2% of the chemical analysis values obtained by mass spectrometry. A gamma-ray measurement system for underwater spent nuclear fuels was developed and tested successfully. The falsification of facility and state records can be traced with the help of the developed computer code against declared reports submitted by the concerned state. This activity eventually resulted in finding discrepancies in accountability records. 18 figs, 20 tabs, 27 refs. (Author).

  19. Technology development for nuclear material measurement and accountability

    International Nuclear Information System (INIS)

    Hong, Jong Sook; Lee, Byung Doo; Cha, Hong Ryul; Lee, Yong Duk; Choi, Hyung Nae; Nah, Won Woo; Park, Hoh Joon; Lee, Yung Kil

    1994-12-01

    The measurement techniques for Pu samples and spent fuel assemblies were developed in support of the implementation of national inspection responsibility under the Atomic Energy Act promulgated in 1994, and a computer program was also developed to assess the total nuclear material balance from facility-declared records. The results of plutonium isotopic determination by gamma-ray spectrometry with a high-resolution germanium detector and peak-analysis codes (FRAM and MGA) agreed to within 1%-2% of the chemical analysis values obtained by mass spectrometry. A gamma-ray measurement system for underwater spent nuclear fuels was developed and tested successfully. The falsification of facility and state records can be traced with the help of the developed computer code against declared reports submitted by the concerned state. This activity eventually resulted in finding discrepancies in accountability records. 18 figs, 20 tabs, 27 refs. (Author)

  20. FRAM Modelling Complex Socio-technical Systems

    CERN Document Server

    Hollnagel, Erik

    2012-01-01

    There has not yet been a comprehensive method that goes behind 'human error' and beyond the failure concept, and various complicated accidents have accentuated the need for it. The Functional Resonance Analysis Method (FRAM) fulfils that need. This book presents a detailed and tested method that can be used to model how complex and dynamic socio-technical systems work, and understand both why things sometimes go wrong but also why they normally succeed.

  1. Conceptual compression discussion on a multi-linear (FTA) and systematic (FRAM) method in an offshore operation's accident modeling.

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza

    2016-12-01

    Risk assessment can be classified into two broad categories: traditional and modern. This paper is aimed at contrasting the functional resonance analysis method (FRAM) as a modern approach with the fault tree analysis (FTA) as a traditional method, regarding assessing the risks of a complex system. Applied methodology by which the risk assessment is carried out, is presented in each approach. Also, FRAM network is executed with regard to nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for lifting structures deep offshore. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment, could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.

  2. Splitting of Atlantic water transport towards the Arctic Ocean into the Fram Strait and Barents Sea Branches - mechanisms and consequences

    Science.gov (United States)

    Beszczynska-Möller, Agnieszka; Skagseth, Øystein; von Appen, Wilken-Jon; Walczowski, Waldemar; Lien, Vidar

    2016-04-01

    The heat content in the Arctic Ocean is to a large extent determined by oceanic advection from the south. During the last two decades the extraordinarily warm Atlantic water (AW) inflow has been reported to progress through the Nordic Seas into the Arctic Ocean. Warm anomalies can result from higher air temperatures (smaller heat loss) in the Nordic Seas, and/or from increased oceanic advection. But the ultimate fate of warm anomalies of Atlantic origin depends strongly on their two possible pathways towards the Arctic Ocean. The AW temperature changes from 7-10°C at the entrance to the Nordic Seas, to 6-6.5°C in the Barents Sea opening and 3-3.5°C as the AW leaving Fram Strait enters the Arctic Ocean. When AW passes through the shallow Barents Sea, nearly all its heat is lost due to atmospheric cooling and the AW loses its signature. In the deep Fram Strait the upper part of the Atlantic water becomes transformed into a less saline and colder surface layer and thus the AW preserves its warm core. A significant warming and high variability of AW volume transport was observed in the two recent decades in the West Spitsbergen Current, representing the Fram Strait Branch of the Atlantic inflow. The AW inflow through Fram Strait carries between 26 and 50 TW of heat into the Arctic Ocean. While the oceanic heat influx to the Barents Sea is of a similar order, the heat leaving it through the northern exit into the Arctic Ocean is negligible. The relative strength of the two Atlantic water branches through Fram Strait and the Barents Sea governs the oceanic heat transport into the Arctic Ocean. According to a recently proposed mechanism, the Atlantic water flow in the Barents Sea Branch is controlled by the strength of the atmospheric low over the northern Barents Sea, acting through a wind-induced Ekman divergence, which intensifies the eastward AW flow. The Atlantic water transport in the Fram Strait Branch is mainly forced by the large-scale low-pressure system over the eastern Norwegian and

  3. Evaluation of Data Retention and Imprint Characteristics of FRAMs Under Environmental Stresses for NASA Applications

    Science.gov (United States)

    Sharma, Ashok K.; Teverovsky, Alexander; Dowdy, Terry W.; Hamilton, Brett

    2002-01-01

    A major reliability issue for all advanced nonvolatile memory (NVM) technology devices, including FRAMs, is their data retention characteristics over extended periods of time, under environmental stresses and exposure to total ionizing dose (TID) radiation effects. For this testing, 256 Kb FRAMs in 28-pin plastic DIPs, rated for the industrial-grade temperature range of -40 C to +85 C, were procured. These are two-transistor, two-capacitor (2T-2C) design FRAMs. In addition to data retention characteristics, the parts were also evaluated for imprint failures, which are defined as the failure of cells to change from a "preferred" state, where they have been for a significant period of time, to the opposite state (e.g., from 1 to 0, or 0 to 1). These 256 Kb FRAMs were subjected to scanning acoustic microscopy (C-SAM); 1,000 temperature cycles from -65 C to +150 C; high-temperature aging at 150 C, 175 C, and 200 C for 1,000 hours; a highly accelerated stress test (HAST) for 500 hours; 1,000 hours of operational life test at 125 C; and total ionizing dose radiation testing. As preconditioning, 10 K read/write cycles were performed on all devices. Interim electrical measurements were performed throughout this characterization, including special imprint testing and final electrical testing. Some failures were observed during the high-temperature aging test at 200 C, during HAST testing, and during the 1,000 hours of operational life at 125 C. The parts passed 10 krad exposure, but began showing power supply current increases during the dose increment from 10 krad to 30 krad, and at 40 krad severe data retention and parametric failures were observed. Failures from the various environmental test groups are currently being analyzed.

  4. Evaluation of Data Retention Characteristics for Ferroelectric Random Access Memories (FRAMs)

    Science.gov (United States)

    Sharma, Ashok K.; Teverovsky, Alexander

    2001-01-01

    Data retention and fatigue characteristics of 64 Kb lead zirconate titanate (PZT)-based Ferroelectric Random Access Memory (FRAM) microcircuits manufactured by Ramtron were examined over a temperature range from -85 C to +310 C for ceramic-packaged parts and from -85 C to +175 C for plastic parts, during retention periods of up to several thousand hours. Intrinsic failures, which were caused by thermal degradation of the ferroelectric cells, occurred in ceramic parts after tens or hundreds of hours of aging at temperatures above 200 C. The activation energy of the retention test failures was 1.05 eV and the extrapolated mean time to failure (MTTF) at room temperature was estimated to be more than 280 years. Multiple write-read cycling (up to 3x10^7 cycles) during the fatigue testing of plastic and ceramic parts did not result in any parametric or functional failures. However, operational currents decreased linearly with the logarithm of the number of cycles, indicating a fatigue process in the PZT films. The plastic parts, which had a more recent date code than the ceramic parts, appeared to use dies with an improved process technology and showed significantly smaller changes in operational currents and data access times.

  5. Impact of recirculation on the East Greenland Current in Fram Strait: Results from moored current meter measurements between 1997 and 2009

    NARCIS (Netherlands)

    de Steur, L.; Hansen, E.; Mauritzen, C.; Beszczynska-Möller, A.; Fahrbach, E.

    2014-01-01

    Transports of total volume and water masses obtained from a mooring array in the East Greenland Current (EGC) in Fram Strait are presented for the period 1997-2009. The array in the EGC was moved along isobaths from 79°N to 78°50'N in 2002 to line up with moorings in the eastern Fram Strait.

  6. TRISTEN/FRAM II Cruise Report, East Arctic, April 1980.

    Science.gov (United States)

    1981-04-13

    ... is not readily accessible by air from Alaska. The Eurasia Basin contains the Arctic Midoceanic Ridge, which extends in a straight line for 2000 km ... (Figure titles from the report: "Bottom Refraction - Shot Lines Overlain on FRAM II Positions"; "Waterfall Display of Successive Spectral Estimates of Single ...") The northeast leg of the array was oriented 341°T and the NW leg 304°T. After a windstorm and floe break-up on 16 April, hydrophones 11 and 12 and 21-24 were ...

  7. Optimization of Nuclear Reactor power Distribution using Genetic Algorithm

    International Nuclear Information System (INIS)

    Kim, Hyu Chan

    1996-02-01

    The main purpose of this study is to develop a computer code, named 'MGA-SCOUPE', which can determine an optimal fuel-loading pattern for a nuclear reactor. The developed code, MGA-SCOUPE, automatically performs extensive searches for globally optimal solutions based on a modified Genetic Algorithm (MGA). The optimization goals of MGA-SCOUPE are (1) minimization of the deviations in the power peaking factors at both BOC and EOC, and (2) maximization of the average burnup ratio of the total fuel assemblies at EOC. For the reactor core calculation module in MGA-SCOUPE, the SCOUPE code was partially modified and used; it was originally developed at MIT and is currently used at Kyung Hee University. Application of MGA-SCOUPE to the KORI 4-4 cycle model shows several satisfactory results. Among them, two dominant improvements over the SCOUPE code can be summarized as follows: MGA-SCOUPE removes the user-dependency problem of SCOUPE in optimal loading-pattern searches, so the searching process can be easily automated; and the final fuel-loading pattern obtained by MGA-SCOUPE shows 25.8% and 18.7% reductions in the standard deviations of the power peaking factors at BOC and EOC, respectively, and a 45% increase in average burnup ratio at EOC compared with those of the SCOUPE code.
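    A hypothetical sketch of the kind of fitness function described above (penalizing power-peaking deviations at BOC and EOC while rewarding average burnup at EOC) is shown below; the weights, the target peaking factor, and the core-simulation call are placeholders, not the actual MGA-SCOUPE/SCOUPE coupling.

```python
def fitness(loading_pattern, simulate_core, w_peak=1.0, w_burnup=0.1,
            target_peaking=1.0):
    """simulate_core(loading_pattern) is assumed to return
    (peaking_boc, peaking_eoc, avg_burnup_eoc); all names are hypothetical."""
    peaking_boc, peaking_eoc, avg_burnup_eoc = simulate_core(loading_pattern)
    peak_penalty = ((peaking_boc - target_peaking) ** 2
                    + (peaking_eoc - target_peaking) ** 2)
    return w_burnup * avg_burnup_eoc - w_peak * peak_penalty

def fake_simulate_core(pattern):
    """Stand-in for the SCOUPE core calculation; returns made-up values."""
    return 1.42, 1.38, 31.5  # peaking at BOC, peaking at EOC, avg burnup

print(fitness([3, 1, 2, 0], fake_simulate_core))
```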

  8. Mga Lente sa Likod ng Lente: Isang Panimulang Pag-aaral ng Ilang Litratong Kuha ni Xander Angeles

    Directory of Open Access Journals (Sweden)

    Moreal Nagarit Camba

    2011-12-01

    The name Xander Angeles is known as one of the most in-demand names in the world of fashion and advertising in the Philippines. Attached to this name are a variety of local and international awards, as well as an advertising company, a modelling agency, and a fashion and photography school.

  9. Delivery and installation of PC/FRAM at the PNC Tokai Works

    International Nuclear Information System (INIS)

    Sampson, T.E.; Kelley, T.A.; Kroncke, K.E.; Menlove, H.O.; Baca, J.; Asano, Takashi; Terakado, Shigeru; Goto, Yasushi; Kogawa, Noboru

    1997-11-01

    The authors report on the assembly, testing, delivery, installation, and initial testing of three PC/FRAM plutonium isotopic analysis systems at the Power Reactor and Nuclear Fuel Development Corporation's Tokai Works. These systems are intended to measure the plutonium isotopic composition and the 235U/plutonium ratio of mixed oxide (MOX) waste in 200-L waste drums. These systems provide the capability for performing measurements on lead-lined drums.

  10. Results from On-Orbit Testing of the Fram Memory Test Experiment on the Fastsat Micro-Satellite

    Science.gov (United States)

    MacLeod, Todd C.; Sims, W. Herb; Varnavas, Kosta A.; Ho, Fat D.

    2011-01-01

    NASA is planning on going beyond Low Earth orbit with manned exploration missions. The radiation environment for most Low Earth orbit missions is harsher than at the Earth's surface but much less harsh than deep space. Development of new electronics is needed to meet the requirements of high performance, radiation tolerance, and reliability. The need for both Volatile and Non-volatile memory has been identified. Emerging Non-volatile memory technologies (FRAM, C-RAM,M-RAM, R-RAM, Radiation Tolerant FLASH, SONOS, etc.) need to be investigated for use in Space missions. An opportunity arose to fly a small memory experiment on a high inclination satellite (FASTSAT). An off-the-shelf 512K Ramtron FRAM was chosen to be tested in the experiment.

  11. Contrasting optical properties of surface waters across the Fram Strait and its potential biological implications

    DEFF Research Database (Denmark)

    Pavlov, Alexey K.; Granskog, Mats A.; Stedmon, Colin A.

    2015-01-01

    Underwater light regime is controlled by the distribution and optical properties of colored dissolved organic matter (CDOM) and particulate matter. The Fram Strait is a region where two contrasting water masses are found. Polar water in the East Greenland Current (EGC) and Atlantic water in the West Spitsbergen Current (WSC) differ with regard to temperature, salinity and optical properties. We present data on absorption properties of CDOM and particles across the Fram Strait (along 79° N), comparing Polar and Atlantic surface waters in September 2009 and 2010. CDOM absorption of Polar water in the EGC [...] radiation (PAR, 400-700 nm), but does result in notable differences in ultraviolet (UV) light penetration, with higher attenuation in the EGC. Future changes in the Arctic Ocean system will likely affect the EGC through diminishing sea-ice cover and potentially increasing CDOM export due to an increase in river [...]

  12. PC/FRAM, Version 3.2 User Manual

    International Nuclear Information System (INIS)

    Kelley, T.A.; Sampson, T.E.

    1999-01-01

    This manual describes the use of version 3.2 of the PC/FRAM plutonium isotopic analysis software developed in the Safeguards Science and Technology Group, NE-5, Nonproliferation and International Security Division, Los Alamos National Laboratory. The software analyzes the gamma-ray spectrum from plutonium-bearing items and determines the isotopic distribution of the plutonium, the 241Am content, and the concentration of other isotopes in the item. The software can also determine the isotopic distribution of uranium isotopes in items containing only uranium. The body of this manual describes the generic version of the code. Special facility-specific enhancements, if they apply, are described in the appendices. The information in this manual applies equally well to version 3.3, which has been licensed to ORTEC. The software can analyze data that is stored in a file on disk. It understands several storage formats, including Canberra's S100 format, ORTEC's 'chn' and 'SPC' formats, and several ASCII text formats. The software can also control data acquisition using an MCA and then store the results in a file on disk for later analysis, or analyze the spectrum directly after the acquisition. The software currently supports control of ORTEC MCBs only. Support for Canberra's Genie-2000 Spectroscopy Systems will be added in the future. Support for reading and writing CAM files will also be forthcoming. A versatile parameter file database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type. This manual is intended for the system supervisor or the local user who is to be the resident expert. Excerpts from this manual may also be appropriate for the system operator who will routinely use the instrument.

  13. The solar and interplanetary causes of the recent minimum in geomagnetic activity (MGA23): a combination of midlatitude small coronal holes, low IMF BZ variances, low solar wind speeds and low solar magnetic fields

    Directory of Open Access Journals (Sweden)

    B. T. Tsurutani

    2011-05-01

    Minima in geomagnetic activity (MGA) at Earth at the ends of SC23 and SC22 have been identified. The two MGAs (called MGA23 and MGA22, respectively) were present in 2009 and 1997, delayed from the sunspot number minima in 2008 and 1996 by ~1/2-1 years. Part of the solar and interplanetary causes of the MGAs were exceptionally low solar (and thus low interplanetary) magnetic fields. Another important factor in MGA23 was the disappearance of equatorial and low-latitude coronal holes and the appearance of midlatitude coronal holes. The location of the holes relative to the ecliptic plane led to low solar wind speeds and low IMF Bz variances (σBz²) and normalized variances (σBz²/B0²) at Earth, with concomitantly reduced solar wind-magnetospheric energy coupling. One result was the lowest ap indices in the history of ap recording. The results presented here are used to comment on the possible solar and interplanetary causes of the low geomagnetic activity that occurred during the Maunder Minimum.
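    The two interplanetary quantities named above, the IMF Bz variance and the normalized variance, can be computed from a field time series as in the following sketch; the data and window length are synthetic and purely illustrative.

```python
import numpy as np

def bz_variances(bz, b_total, window):
    """Return sigma_Bz^2 and sigma_Bz^2 / B0^2 for consecutive windows,
    with B0 the mean field magnitude in each window."""
    var, norm_var = [], []
    for start in range(0, len(bz) - window + 1, window):
        seg_bz = bz[start:start + window]
        b0 = np.mean(b_total[start:start + window])
        v = np.var(seg_bz)
        var.append(v)
        norm_var.append(v / b0**2)
    return np.array(var), np.array(norm_var)

rng = np.random.default_rng(1)
bz = rng.normal(0.0, 1.5, size=3600)              # nT, synthetic
b_mag = np.abs(rng.normal(5.0, 0.5, size=3600))   # nT, synthetic
print(bz_variances(bz, b_mag, window=600)[1])     # normalized variances
```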

  14. Recirculation in the Fram Strait and transports of water in and north of the Fram Strait derived from CTD data

    Directory of Open Access Journals (Sweden)

    M. Marnela

    2013-05-01

    The volume, heat and freshwater transports in the Fram Strait are estimated from geostrophic computations based on summer hydrographic data from 1984, 1997, 2002 and 2004. In these years, in addition to the usually sampled section along 79° N, a section between Greenland and Svalbard was sampled further north. Quasi-closed boxes bounded by the two sections and Greenland and Svalbard can then be formed. Applying conservation constraints on these boxes provides barotropic reference velocities. The net volume flux is southward and varies between 2 and 4 Sv. The recirculation of Atlantic water is about 2 Sv. Heat is lost to the atmosphere and the heat loss from the area between the sections averaged over the four years is about 10 TW. The net heat (temperature) transport is 20 TW northward into the Arctic Ocean, with large interannual differences. The mean net freshwater added between the sections is 40 mSv and the mean freshwater transport southward across 79° N is less than 60 mSv, indicating that most of the liquid freshwater leaving the Arctic Ocean through Fram Strait in summer is derived from sea ice melt in the northern vicinity of the strait. In 1997, 2001 and 2003 meridional sections along 0° longitude were sampled and in 2003 two smaller boxes can be formed, and the recirculation of Atlantic water in the strait is estimated by geostrophic computations and continuity constraints. The recirculation is weaker close to 80° N than close to 78° N, indicating that the recirculation is mainly confined to the south of 80° N. This is supported by the observations in 1997 and 2001, when only the northern part of the meridional section, from 79° N to 80° N, can be computed with the constraints applied. The recirculation is found strongest close to 79° N.
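    The constraint step mentioned above (conservation applied to a quasi-closed box to provide barotropic reference velocities) can be illustrated, under strong simplifications, by choosing a single barotropic offset so that the net volume transport matches a target; the geometry and numbers below are invented, and the paper's actual constraints are more elaborate.

```python
import numpy as np

def add_barotropic_reference(v_relative, cell_area_m2, net_transport_target=0.0):
    """v_relative: array (levels x station pairs) of relative velocities [m/s],
    cell_area_m2: matching cell areas.  A single barotropic offset is added so
    the net transport through the section equals the target."""
    relative_transport = np.nansum(v_relative * cell_area_m2)
    total_area = np.nansum(cell_area_m2)
    offset = (net_transport_target - relative_transport) / total_area
    return v_relative + offset

v_rel = np.random.default_rng(2).normal(0.0, 0.05, size=(40, 25))  # m/s, synthetic
areas = np.full((40, 25), 50.0 * 5000.0)                           # m^2 per cell
v_abs = add_barotropic_reference(v_rel, areas)
print(np.nansum(v_abs * areas) / 1e6, "Sv net (should be ~0)")
```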

  15. Impacts of Changed Extratropical Storm Tracks on Arctic Sea Ice Export through Fram Strait

    Science.gov (United States)

    Wei, J.; Zhang, X.; Wang, Z.

    2017-12-01

    Studies have indicated a poleward shift of extratropical storm tracks and intensification of Arctic storm activities, in particular on the North Atlantic side of the Arctic Ocean. To improve understanding of dynamic effect on changes in Arctic sea ice mass balance, we examined the impacts of the changed storm tracks and activities on Arctic sea ice export through Fram Strait through ocean-sea ice model simulations. The model employed is the high-resolution Massachusetts Institute of Technology general circulation model (MITgcm), which was forced by the Japanese 25-year Reanalysis (JRA-25) dataset. The results show that storm-induced strong northerly wind stress can cause simultaneous response of daily sea ice export and, in turn, exert cumulative effects on interannual variability and long-term changes of sea ice export. Further analysis indicates that storm impact on sea ice export is spatially dependent. The storms occurring southeast of Fram Strait exhibit the largest impacts. The weakened intensity of winter storms in this region after 1994/95 could be responsible for the decrease of total winter sea ice export during the same time period.

  16. Impacts of extratropical storm tracks on Arctic sea ice export through Fram Strait

    Science.gov (United States)

    Wei, Jianfen; Zhang, Xiangdong; Wang, Zhaomin

    2018-05-01

    Studies have indicated regime shifts in atmospheric circulation, and associated changes in extratropical storm tracks and Arctic storm activity, in particular on the North Atlantic side of the Arctic Ocean. To improve understanding of changes in Arctic sea ice mass balance, we examined the impacts of the changed storm tracks and cyclone activity on Arctic sea ice export through Fram Strait by using a high resolution global ocean-sea ice model, MITgcm-ECCO2. The model was forced by the Japanese 25-year Reanalysis (JRA-25) dataset. The results show that storm-induced strong northerly wind stress can cause simultaneous response of daily sea ice export and, in turn, exert cumulative effects on interannual variability and long-term changes of sea ice export. Further analysis indicates that storm impact on sea ice export is spatially dependent. The storms occurring southeast of Fram Strait exhibit the largest impacts. The weakened intensity of winter (in this study winter is defined as October-March and summer as April-September) storms in this region after 1994/95 could be responsible for the decrease of total winter sea ice export during the same time period.

  17. FRAM-2012: Norwegians return to the High Arctic with a Hovercraft for Marine Geophysical Research

    Science.gov (United States)

    Hall, J. K.; Kristoffersen, Y.; Brekke, H.; Hope, G.

    2012-12-01

    After four years of testing methods, craft reliability, and innovative equipment, the R/H SABVABAA has embarked on its first FRAM-201x expedition to the highest Arctic. Named after the Inupiaq word for 'flows swiftly over it', the 12 m by 6 m hovercraft has been home-based in Longyearbyen, Svalbard since June 2008. In this, its fifth summer of work on the ice pack north of 81°N, the craft is supported by the Norwegian Petroleum Directorate (NPD) via the Nansen Environmental and Remote Sensing Center (NERSC) in Bergen, and the Norwegian Scientific Academy for Polar Research. FRAM-2012 represents renewed Norwegian interest in returning to the highest Arctic some 116 years after the 1893-96 drift of Fridtjof Nansen's ship FRAM, the first serious scientific investigation of the Arctic. When replenished by air or icebreaker, the hovercraft Sabvabaa offers a hospitable scientific platform with a crew of two, capable of marine geophysical, geological and oceanographic observations over long periods with relative mobility on the ice pack. FRAM-2012 is the first step towards this goal, accompanying the Swedish icebreaker ODEN to the Lomonosov Ridge, north of Greenland, as part of the LOMROG III expedition. The science plan called for an initial drive from the ice edge to the Gakkel Ridge at 85°N, where micro-earthquakes would be monitored, and then to continue north to a geological sampling area on the Lomonosov Ridge at about 88°N, 65°W. The micro-earthquake monitoring is part of Gaute Hope's MSc thesis and entails five hydrophones in a WiFi-connected hydrophone array deployed over the Gakkel Rift Valley, drifting with the ice at up to 0.4 knots. On August 3 the hovercraft was refueled from the icebreaker ODEN at 84°21'N and both vessels proceeded north. The progress of the hovercraft was hampered by insufficient visibility for safe driving and time-consuming maneuvering in and around larger fields of rubble ice impassable by the hovercraft, but of little concern to the icebreaker. It

  18. The use of Functional Resonance Analysis Method (FRAM) in a mid-air collision to understand some characteristics of the air traffic management system resilience

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues de Carvalho, Paulo Victor, E-mail: paulov@ien.gov.br [National Nuclear Energy Commission/Nuclear Engineering Institute, Cidade Universitaria-Ilha do Fundao, Rio de Janeiro, RJ 21945-970 (Brazil)

    2011-11-15

    The Functional Resonance Analysis Model (FRAM) defines a systemic framework to model complex systems for accident analysis purposes. We use FRAM in the mid-air collision between flight GLO1907, a commercial aircraft Boeing 737-800, and flight N600XL, an executive jet EMBRAER E-145, to investigate key resilience characteristics of the Air Traffic Management System (ATM). This ATM-system-related accident occurred at 16:56 Brazilian time on September 29, 2006 in the Amazonian sky. FRAM analysis of the flight monitoring functions showed system constraints (equipment, training, time, and supervision) that produce variability in system behavior, creating demand-resource mismatches in an attempt to perceive and control the developing situation. This variability also included control and coordination breakdowns and automation surprises (TCAS functioning). The analysis showed that under normal variability conditions (without catastrophic failures) the ATM system (pilots, controllers, supervisors, and equipment) was not able to close the control loops of the flight monitoring functions using feedback or feedforward strategies to achieve adequate control of an aircraft flying in the controlled air space. Our findings shed some light on the resilience of the Brazilian ATM system operation and indicate that there is a need for a deeper understanding of how the system actually functions. Highlights: The Functional Resonance Analysis Model (FRAM) was used in a mid-air collision over the Amazon; the aim was to understand key resilience characteristics of the Air Traffic Management System (ATM); the analysis showed how, under normal conditions, the system was not able to control flight functions; the findings shed some light on the resilience of the Brazilian ATM system operation.

  19. The use of Functional Resonance Analysis Method (FRAM) in a mid-air collision to understand some characteristics of the air traffic management system resilience

    International Nuclear Information System (INIS)

    Rodrigues de Carvalho, Paulo Victor

    2011-01-01

    The Functional Resonance Analysis Model (FRAM) defines a systemic framework to model complex systems for accident analysis purposes. We use FRAM in the mid-air collision between flight GLO1907, a commercial aircraft Boeing 737-800, and flight N600XL, an executive jet EMBRAER E-145, to investigate key resilience characteristics of the Air Traffic Management System (ATM). This ATM-system-related accident occurred at 16:56 Brazilian time on September 29, 2006 in the Amazonian sky. FRAM analysis of the flight monitoring functions showed system constraints (equipment, training, time, and supervision) that produce variability in system behavior, creating demand-resource mismatches in an attempt to perceive and control the developing situation. This variability also included control and coordination breakdowns and automation surprises (TCAS functioning). The analysis showed that under normal variability conditions (without catastrophic failures) the ATM system (pilots, controllers, supervisors, and equipment) was not able to close the control loops of the flight monitoring functions using feedback or feedforward strategies to achieve adequate control of an aircraft flying in the controlled air space. Our findings shed some light on the resilience of the Brazilian ATM system operation and indicate that there is a need for a deeper understanding of how the system actually functions. Highlights: The Functional Resonance Analysis Model (FRAM) was used in a mid-air collision over the Amazon; the aim was to understand key resilience characteristics of the Air Traffic Management System (ATM); the analysis showed how, under normal conditions, the system was not able to control flight functions; the findings shed some light on the resilience of the Brazilian ATM system operation.

  20. Atlantic water heat transfer through the Arctic Gateway (Fram Strait) during the Last Interglacial

    Science.gov (United States)

    Zhuravleva, Anastasia; Bauch, Henning A.; Spielhagen, Robert F.

    2017-10-01

    The Last Interglacial in the Arctic region is often described as a time with warmer conditions and significantly less summer sea ice than today. The role of Atlantic water (AW) as the main oceanic heat flux agent into the Arctic Ocean remains, however, unclear. Using high-resolution stable isotope and faunal records from the only deep Arctic Gateway, the Fram Strait, we note for the upper water column a diminished influence of AW and generally colder-than-Holocene surface ocean conditions. After the main Saalian deglaciation had terminated, a first intensification of northward-advected AW happened (~124 ka). However, an intermittent sea surface cooling, triggered by meltwater release at 122 ka, caused a regional delay in the further development towards peak interglacial conditions. Maximum AW heat advection occurred during late MIS 5e (118.5-116 ka) and interrupted a longer-term cooling trend at the sea surface that started from about 120 ka on. Such a late occurrence of the major AW-derived near-surface warming in the Fram Strait - this is in stark contrast to an early warm peak in the Holocene - compares well in time with upstream records from the Norwegian Sea, altogether implying a coherent development of south-to-north ocean heat transfer through the eastern Nordic Seas and into the high Arctic during the Last Interglacial.

  1. Development of a code for the isotopic analysis of Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2013-05-15

    To strengthen the national nuclear nonproliferation regime through the establishment of a nuclear forensic system, techniques for nuclear material analysis and the categorization of important domestic nuclear materials are being developed. MGAU and FRAM are commercial software packages for the isotopic analysis of uranium by γ-spectroscopy, but the diversity of detection geometries and effects such as self-attenuation and coincidence summing call for an analysis tool under continual improvement and modification. Hence, the development of another code for HPGe γ- and x-ray spectrum analysis was started in this study. The analysis of the 87-101 keV region of the uranium spectrum is attempted, based on isotopic responses similar to those developed in MGAU. The code for the isotopic analysis of uranium starts from peak fitting.
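
    The record describes an analysis that starts from fitting the 87-101 keV region of a uranium spectrum. The sketch below is not the authors' code; it only illustrates, under assumed peak energies and a shared Gaussian width, the kind of peak fitting from which isotopic ratios are later formed.

```python
# Minimal sketch (not the authors' code): fit two Gaussian peaks on a linear
# background in the 87-101 keV region, the kind of step an isotopic-analysis
# code performs before forming peak-area ratios.  Energies and amplitudes
# below are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def model(e, a1, e1, a2, e2, sigma, b0, b1):
    """Two Gaussians sharing one width, plus a linear continuum."""
    g1 = a1 * np.exp(-0.5 * ((e - e1) / sigma) ** 2)
    g2 = a2 * np.exp(-0.5 * ((e - e2) / sigma) ** 2)
    return g1 + g2 + b0 + b1 * e

# Synthetic spectrum standing in for a measured HPGe region (87-101 keV).
energy = np.linspace(87.0, 101.0, 280)
rng = np.random.default_rng(0)
true = model(energy, 1200, 92.4, 800, 98.4, 0.45, 50, -0.2)
counts = rng.poisson(np.clip(true, 0, None)).astype(float)

p0 = [1000, 92.0, 600, 98.0, 0.5, 40, 0.0]           # rough initial guesses
popt, pcov = curve_fit(model, energy, counts, p0=p0,
                       sigma=np.sqrt(np.clip(counts, 1, None)))

sigma_fit = popt[4]
area1 = popt[0] * sigma_fit * np.sqrt(2 * np.pi)      # net peak areas
area2 = popt[2] * sigma_fit * np.sqrt(2 * np.pi)
print(f"peak areas: {area1:.0f}, {area2:.0f}; ratio {area1 / area2:.3f}")
```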

  2. Aerosol Measurements with the FRAM Telescope

    Directory of Open Access Journals (Sweden)

    Ebr Jan

    2017-01-01

    Full Text Available Precision stellar photometry using a telescope equipped with a CCD camera is an obvious way to measure the total aerosol content of the atmosphere, as the apparent brightness of every star is affected by scattering. Achieving high precision in the vertical aerosol optical depth (at the level of 0.01) presents a series of interesting challenges. Using 3.5 years of data taken by the FRAM instrument at the Pierre Auger Observatory, we have developed a set of methods and tools to overcome most of these challenges. We use a wide-field camera and measure stars over a large span in airmass to eliminate the need for absolute calibration of the instrument. The main issues for data processing include camera calibration, source identification in a curved field, catalog deficiencies, automated aperture photometry in rich fields with lens distortion, and corrections for star color. In the next step, we model the airmass dependence of the extinction and subtract the Rayleigh component of scattering, using laboratory measurements of the spectral sensitivity of the device. In this contribution, we focus on the caveats and solutions found during the development of the methods, as well as several issues yet to be solved. Finally, future outlooks, such as the possibility of precision measurements of the wavelength dependence of the extinction, are discussed.
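
    A hedged sketch of the core idea in this record: when stars are measured over a wide airmass range, the slope of (instrumental minus catalog) magnitude versus airmass gives the total extinction without absolute calibration, and subtracting a Rayleigh optical depth leaves the aerosol part. The zero point, extinction value, and Rayleigh optical depth below are illustrative assumptions, not FRAM results.

```python
# Hedged sketch of a Langley-style extinction fit: regress (instrumental -
# catalog) magnitude differences against airmass, so the slope gives the total
# extinction in mag/airmass and the intercept absorbs the unknown zero point
# (no absolute calibration needed).  The Rayleigh optical depth used below is
# a generic illustrative value, not an Auger-site measurement.
import numpy as np

def fit_extinction(airmass, dmag):
    """Least-squares fit dmag = zero_point + k * airmass; returns (zp, k)."""
    A = np.vstack([np.ones_like(airmass), airmass]).T
    (zp, k), *_ = np.linalg.lstsq(A, dmag, rcond=None)
    return zp, k

rng = np.random.default_rng(1)
airmass = rng.uniform(1.0, 3.0, 500)                  # wide airmass span
true_k = 0.25                                         # mag per airmass (total)
dmag = 21.3 + true_k * airmass + rng.normal(0, 0.02, airmass.size)

zp, k_total = fit_extinction(airmass, dmag)

tau_total = k_total / (2.5 / np.log(10))              # mag/airmass -> optical depth
tau_rayleigh = 0.18                                   # illustrative Rayleigh VOD
tau_aerosol = tau_total - tau_rayleigh
print(f"k = {k_total:.3f} mag/airmass, VAOD ~ {tau_aerosol:.3f}")
```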

  3. The spectral optical properties and relative radiant heating contribution of dissolved and particulate matter in the surface waters across the Fram Strait

    DEFF Research Database (Denmark)

    Pavlov, A.K.; Granskog, M.A.; Stedmon, Colin

    In the autumns of 2009 and 2010, comprehensive observations were performed on transects along 79°N across the Fram Strait. Samples for chromophoric dissolved organic matter (CDOM) and particulate absorption were collected and analyzed together with the distribution of temperature and salinity in surface waters (0-100 m). Large spatial variations in the distribution of CDOM and particulate matter, as well as in their relative contributions to total absorption, were apparent, with high contrast between waters of Arctic and Atlantic origin. In addition, estimates of underwater light profiles and radiant heating rate (RHR) of the upper layer were obtained using a simplistic exponential RHR model. This is one of the first detailed overviews of sea water optical properties across the northern Fram Strait, and might have potential implications for biological, biogeochemical and physical processes in the region.
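
    The "simplistic exponential RHR model" mentioned above can be illustrated as follows: downwelling irradiance is assumed to decay exponentially with depth, and the radiant heating rate of a layer is the flux absorbed in that layer divided by the layer's heat capacity. All numbers in this sketch are generic assumptions, not values from the cruises.

```python
# Illustrative sketch of a "simplistic exponential" radiant-heating-rate
# estimate: downwelling irradiance decays as E(z) = E0*exp(-Kd*z), and the
# heating of a layer is the flux absorbed in it divided by rho*cp*thickness.
# E0, Kd, rho and cp below are generic values, not the cruise data.
import numpy as np

E0 = 150.0                 # surface downwelling irradiance, W m^-2 (illustrative)
Kd = 0.08                  # diffuse attenuation coefficient, m^-1 (illustrative)
rho, cp = 1027.0, 3990.0   # seawater density (kg m^-3) and heat capacity (J kg^-1 K^-1)

z = np.arange(0.0, 101.0, 1.0)          # 0-100 m, 1 m layers
E = E0 * np.exp(-Kd * z)                # irradiance profile

absorbed = E[:-1] - E[1:]               # flux absorbed in each 1 m layer, W m^-2
rhr = absorbed / (rho * cp * np.diff(z))        # heating rate, K s^-1
rhr_per_day = rhr * 86400.0

print(f"heating in the top metre: {rhr_per_day[0]:.3f} K/day")
```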

  4. Characteristics of Milk Fermented by Streptococcus thermophilus MGA45-4 and the Profiles of Associated Volatile Compounds during Fermentation and Storage.

    Science.gov (United States)

    Dan, Tong; Jin, Rulin; Ren, Weiyi; Li, Ting; Chen, Haiyan; Sun, Tiansong

    2018-04-11

    The lactic acid bacterium Streptococcus thermophilus is a major starter culture for the production of dairy products. In this study, the physiochemical characteristics of milk fermented by the MGA45-4 isolate of S. thermophilus were analyzed. Our data indicate that milk fermented using S. thermophilus MGA45-4 maintained a high viable cell count (8.86 log10 colony-forming units/mL), and a relatively high pH (4.4), viscosity (834.33 mPa·s), and water holding capacity (40.85%) during 14 days of storage. By analyzing the volatile compound profile using solid-phase microextraction and gas chromatography/mass spectrometry, we identified 73 volatile compounds in the fermented milk product, including five carboxylic acids, 21 aldehydes, 13 ketones, 16 alcohols, five esters, and 13 aromatic carbohydrates. According to the odor activity values, 11 of these volatile compounds were found to play a key role in producing the characteristic flavor of fermented milk, particularly octanal, nonanal, hexanal, 2,3-butanedione, and 1-octen-3-ol, which had the highest odor activity values among all compounds analyzed. These findings thus provide more insight into the chemical/molecular characteristics of milk fermented using S. thermophilus, which may provide a basis for improving dairy product flavor/odor during the process of fermentation and storage.

  5. Characteristics of Milk Fermented by Streptococcus thermophilus MGA45-4 and the Profiles of Associated Volatile Compounds during Fermentation and Storage

    Directory of Open Access Journals (Sweden)

    Tong Dan

    2018-04-01

    Full Text Available The lactic acid bacterium Streptococcus thermophilus is a major starter culture for the production of dairy products. In this study, the physiochemical characteristics of milk fermented by the MGA45-4 isolate of S. thermophilus were analyzed. Our data indicate that milk fermented using S. thermophilus MGA45-4 maintained a high viable cell count (8.86 log10 colony-forming units/mL), and a relatively high pH (4.4), viscosity (834.33 mPa·s), and water holding capacity (40.85%) during 14 days of storage. By analyzing the volatile compound profile using solid-phase microextraction and gas chromatography/mass spectrometry, we identified 73 volatile compounds in the fermented milk product, including five carboxylic acids, 21 aldehydes, 13 ketones, 16 alcohols, five esters, and 13 aromatic carbohydrates. According to the odor activity values, 11 of these volatile compounds were found to play a key role in producing the characteristic flavor of fermented milk, particularly octanal, nonanal, hexanal, 2,3-butanedione, and 1-octen-3-ol, which had the highest odor activity values among all compounds analyzed. These findings thus provide more insight into the chemical/molecular characteristics of milk fermented using S. thermophilus, which may provide a basis for improving dairy product flavor/odor during the process of fermentation and storage.

  6. Restrictions in Mg/Ca-Paleotemperature Estimations in High-Latitude Bottom Waters: Evidence from the Fram Strait and the Nordic Seas

    Science.gov (United States)

    Werner, K.; Marchitto, T. M., Jr.; Not, C.; Spielhagen, R. F.; Husum, K.

    2014-12-01

    Mg to Ca ratios of the benthic foraminifer species Cibicidoides wuellerstorfi provide a great potential for reconstructing bottom water temperatures, especially at the lower end of the temperature range between 0 and 6°C (Tisserand et al., 2013). A set of core top samples from the Fram Strait and the Norwegian margin has been studied for Mg/Ca ratios in C. wuellerstorfi in order to establish a calibration relationship to the environmental conditions. In this part of the northern North Atlantic, bottom water temperatures range between -0.5 and -1°C. For the calibration to modern water mass conditions, modern oceanographic data from both existing conductivity-temperature-depth (CTD) casts and the World Ocean Data Base 2013 (Boyer et al., 2013) have been used. Benthic Mg/Ca ratios are relatively high, suggesting a preference of C. wuellerstorfi for incorporating Mg below 0°C. Although no correlation has been found with existing temperature calibrations, the data are in line with earlier Mg/Ca data from C. wuellerstorfi in the area (Martin et al., 2002; Elderfield et al., 2006). The carbonate ion effect is most likely a main cause for the relatively high Mg/Ca ratios found in core top samples from the Fram Strait and the Nordic Seas; however, other factors may influence the values as well. Holocene records of benthic trace metal/Ca ratios from the eastern Fram Strait display trends similar to those found in other proxy indicators, despite the difficulties in constraining a temperature calibration for this low temperature range. In particular, the benthic B/Ca and Li/Ca records resemble trends in Holocene planktic foraminifer assemblages, suggesting that they are influenced by environmental factors, such as a carbonate ion effect consistent throughout the entire water column.

  7. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    Energy Technology Data Exchange (ETDEWEB)

    Hollnagel, Erik [MINES ParisTech Crisis and Risk Research Centre (CRC), Sophia Antipolis Cedex (France)

    2012-11-15

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. Resilience engineering more precisely proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment therefore should focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to the risk assessment of organisational changes. The report provides the reader with details to consider when making a decision on what analysis approach to use. The choice of which approach to use must reflect priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not so simple to make and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  8. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    2012-11-01

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. Resilience engineering more precisely proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment therefore should focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to the risk assessment of organisational changes. The report provides the reader with details to consider when making a decision on what analysis approach to use. The choice of which approach to use must reflect priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not so simple to make and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  9. Spatial and temporal scales of sea ice protists and phytoplankton distribution from the gateway Fram Strait into the Central Arctic Ocean

    Science.gov (United States)

    Peeken, I.; Hardge, K.; Krumpen, T.; Metfies, K.; Nöthig, E. M.; Rabe, B.; von Appen, W. J.; Vernet, M.

    2016-02-01

    The Arctic Ocean is currently one of the key regions where the effect of climate change is most pronounced. Sea ice is an important interface in this region, representing a unique habitat for many organisms. The massive reduction in sea ice thickness and extent recorded over the last twenty years is anticipated to cause large cascading changes in the entire Arctic ecosystem. Most sea ice is formed on the Eurasian shelves and transported via the Transpolar Drift to the western Fram Strait and out of the Arctic Ocean with the cold East Greenland Current (EGC). Warm Atlantic water enters the Arctic Ocean with the West Spitsbergen Current (WSC) via the eastern Fram Strait. Here, we focus on the spatial spreading of protists from the Atlantic water masses, their occurrence over the deep basins of the Central Arctic, and the relationships among them in water and sea ice. Communities were analyzed using pigments, flow cytometry, and ARISA fingerprints during several cruises with the RV Polarstern to the Fram Strait, the Greenland Sea and the Central Arctic Ocean. By comparing these data sets we are able to demonstrate that the origin of the studied sea ice floes is more important for the biodiversity found in the sea ice communities than the respective underlying water mass. In contrast, biodiversity in the water column is mainly governed by the occurring water masses and the presence or absence of sea ice. However, overall, the development of standing stocks in both biomes was governed by the availability of nutrients. To put the recent results in a temporal perspective, the study will be embedded in a long-term data set of phytoplankton biomass obtained during several cruises over the last twenty years.

  10. FRAM - the robotic telescope for the monitoring of the wavelength dependence of the extinction: description of hardware, data analysis, and results

    Czech Academy of Sciences Publication Activity Database

    Prouza, Michael; Jelínek, M.; Kubánek, P.; Ebr, Jan; Trávníček, Petr; Šmída, Radomír

    2010-01-01

    Roč. 2010, - (2010), 849382/1-849382/5 ISSN 1687-7969 R&D Projects: GA MŠk LC527; GA MŠk(CZ) LA08016 Institutional research plan: CEZ:AV0Z10100502; CEZ:AV0Z10100523 Keywords : FRAM * wavelength dependence * light extinction * cosmic ray showers Subject RIV: BF - Elementary Particles and High Energy Physics

  11. The compositional change of Fluorescent Dissolved Organic Matter across Fram Strait assessed with use of a multi channel in situ fluorometer.

    Science.gov (United States)

    Raczkowska, A.; Kowalczuk, P.; Sagan, S.; Zabłocka, M.; Pavlov, A. K.; Granskog, M. A.; Stedmon, C. A.

    2016-02-01

    Observations of Colored Dissolved Organic Matter (CDOM) absorption and fluorescence (FDOM) from water samples and an in situ fluorometer, and of Inherent Optical Properties (IOP; light absorption and scattering), were carried out along a section across Fram Strait at 79°N. A 3-channel Wetlabs Wetstar fluorometer with channels for humic- and protein-like DOM was deployed and used to assess the distribution of different FDOM fractions. A relationship between the fluorescence intensity of the protein-like fraction of FDOM and chlorophyll a fluorescence was found and indicated the importance of phytoplankton biomass in West Spitsbergen Current waters as a significant source of protein-like FDOM. East Greenland Current waters had low concentrations of chlorophyll a and were characterized by high humic-like FDOM fluorescence. An empirical relationship between humic-like FDOM fluorescence intensity and CDOM absorption was derived and confirms the dominance of terrigenous CDOM in the composition of DOM in the East Greenland Current. These high-resolution profile data offer a simple approach to fractionate the contribution of these two DOM sources to DOM across the Fram Strait and may help refine estimates of DOC fluxes in and out of the Arctic through this region.
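
    The "empirical relationship" referred to above is typically a simple regression between discrete-sample CDOM absorption and co-located humic-like fluorescence, which can then be applied to the continuous fluorometer profile. The sketch below illustrates that step with invented numbers; it is not the relationship derived in the study.

```python
# Minimal sketch of deriving an empirical FDOM-fluorescence vs CDOM-absorption
# relationship: fit a line between discrete-sample CDOM absorption and the
# co-located humic-like fluorescence, then apply it to the continuous in situ
# profile.  All numbers and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
fdom_humic = rng.uniform(5, 60, 40)                                 # fluorometer counts
a_cdom_350 = 0.012 * fdom_humic + 0.05 + rng.normal(0, 0.02, 40)   # absorption, m^-1

slope, intercept = np.polyfit(fdom_humic, a_cdom_350, 1)

# Predict absorption along a continuous in situ profile from fluorescence alone.
profile_fdom = np.array([8.0, 15.0, 42.0, 55.0])
predicted_a350 = slope * profile_fdom + intercept
print(f"a_CDOM(350) = {slope:.4f} * F_humic + {intercept:.3f}")
print(predicted_a350.round(3))
```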

  12. MGA trajectory planning with an ACO-inspired algorithm

    Science.gov (United States)

    Ceriotti, Matteo; Vasile, Massimiliano

    2010-11-01

    Given a set of celestial bodies, the problem of finding an optimal sequence of swing-bys, deep space manoeuvres (DSM) and transfer arcs connecting the elements of the set is combinatorial in nature. The number of possible paths grows exponentially with the number of celestial bodies. Therefore, the design of an optimal multiple gravity assist (MGA) trajectory is an NP-hard mixed combinatorial-continuous problem. Its automated solution would greatly improve the design of future space missions, allowing the assessment of a large number of alternative mission options in a short time. This work proposes to formulate the complete automated design of a multiple gravity assist trajectory as an autonomous planning and scheduling problem. The resulting scheduled plan will provide the optimal planetary sequence and a good estimation of the set of associated optimal trajectories. The trajectory model consists of a sequence of celestial bodies connected by two-dimensional transfer arcs containing one DSM. For each transfer arc, the positions of the planet and the spacecraft at the time of arrival are matched by varying the pericentre of the preceding swing-by, or the magnitude of the launch excess velocity for the first arc. For each departure date, this model generates a full tree of possible transfers from the departure to the destination planet. Each leaf of the tree represents a planetary encounter and a possible way to reach that planet. An algorithm inspired by ant colony optimization (ACO) is devised to explore the space of possible plans. The ants explore the tree from departure to destination adding one node at a time: every time an ant is at a node, a probability function is used to select a feasible direction. This approach to automatic trajectory planning is applied to the design of optimal transfers to Saturn and among the Galilean moons of Jupiter. Solutions are compared to those found through more traditional genetic-algorithm techniques.
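
    A highly simplified, hedged sketch of the search structure described above: artificial ants grow a planetary sequence one node at a time, choosing each next body with probabilities weighted by pheromone deposited on previously successful sequences. The real method evaluates each candidate with a swing-by/DSM trajectory model; here a toy cost function stands in for that model, so only the ACO-style bookkeeping is shown.

```python
# Highly simplified, ACO-inspired search over planetary sequences.  A toy cost
# function replaces the real trajectory evaluation (swing-bys + DSMs).
import math
import random

BODIES = ["Venus", "Earth", "Mars", "Jupiter", "Saturn"]
START, TARGET = "Earth", "Saturn"
MAX_LEGS = 4

def toy_cost(sequence):
    """Stand-in for the trajectory model: mildly prefers short sequences that
    pass by Jupiter before Saturn (purely illustrative)."""
    cost = 2.0 * len(sequence)
    if "Jupiter" in sequence[:-1]:
        cost -= 3.0
    if sequence.count("Venus") > 1:
        cost += 2.0
    return cost

def run_aco(n_ants=50, n_iters=30, evaporation=0.8, seed=0):
    random.seed(seed)
    pheromone = {}                      # (partial-sequence, next-body) -> weight
    best, best_cost = None, math.inf
    for _ in range(n_iters):
        solutions = []
        for _ in range(n_ants):
            seq = [START]
            while seq[-1] != TARGET and len(seq) <= MAX_LEGS:
                options = [b for b in BODIES if b != seq[-1]]
                weights = [pheromone.get((tuple(seq), b), 1.0) for b in options]
                seq.append(random.choices(options, weights=weights)[0])
            if seq[-1] == TARGET:
                solutions.append((toy_cost(seq), seq))
        for key in pheromone:
            pheromone[key] *= evaporation          # evaporate old trails
        for cost, seq in solutions:
            if cost < best_cost:
                best, best_cost = seq, cost
            for i in range(1, len(seq)):           # deposit pheromone on used edges
                key = (tuple(seq[:i]), seq[i])
                pheromone[key] = pheromone.get(key, 1.0) + 1.0 / cost
    return best, best_cost

print(run_aco())
```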

  13. Water mass distribution in Fram Strait and over the Yermak Plateau in summer 1997

    Directory of Open Access Journals (Sweden)

    B. Rudels

    Full Text Available The water mass distribution in northern Fram Strait and over the Yermak Plateau in summer 1997 is described using CTD data from two cruises in the area. The West Spitsbergen Current was found to split: one part recirculated towards the west, while the other part, on entering the Arctic Ocean, separated into two branches. The main inflow of Atlantic Water followed the Svalbard continental slope eastward, while a second, narrower, branch stayed west and north of the Yermak Plateau. The water column above the southeastern flank of the Yermak Plateau was distinctly colder and less saline than the two inflow branches. Immediately west of the outer inflow branch, comparatively high temperatures in the Atlantic Layer suggested that a part of the extraordinarily warm Atlantic Water, observed in the boundary current in the Eurasian Basin in the early 1990s, was now returning, within the Eurasian Basin, toward Fram Strait. The upper layer west of the Yermak Plateau was cold, deep and comparably saline, similar to what has recently been observed in the interior Eurasian Basin. Closer to the Greenland continental slope the salinity of the upper layer became much lower, and the temperature maximum of the Atlantic Layer was occasionally below 0.5 °C, indicating water masses mainly derived from the Canadian Basin. This implies that the warm pulse of Atlantic Water had not yet made a complete circuit around the Arctic Ocean. The Atlantic Water of the West Spitsbergen Current recirculating within the strait did not extend as far towards Greenland as in the 1980s, leaving a broader passage for waters from the Atlantic and intermediate layers exiting the Arctic Ocean. A possible interpretation is that the circulation pattern alternates between a strong recirculation of the West Spitsbergen Current in the strait, and a larger exchange of Atlantic Water between the Nordic Seas and the inner parts of the Arctic Ocean.

    Key words: Oceanography: general

  14. Characteristics of colored dissolved organic matter (CDOM) in the Arctic outflow in the Fram Strait: Assessing the changes and fate of terrigenous CDOM in the Arctic Ocean

    NARCIS (Netherlands)

    Granskog, M.A.; Stedmon, C.A.; Dodd, P.A.; Amon, R.M.W.; Pavlov, A.K.; de Steur, L.; Hansen, E.

    2012-01-01

    Absorption coefficients of colored dissolved organic matter (CDOM) were measured together with salinity, delta O-18, and inorganic nutrients across the Fram Strait. A pronounced CDOM absorption maximum between 30 and 120 m depth was associated with river and sea ice brine enriched water, characteristic of the Arctic mixed layer and upper halocline waters in the East Greenland Current (EGC).

  15. The coccolithophores Emiliania huxleyi and Coccolithus pelagicus: Extant populations from the Norwegian-Iceland Seas and Fram Strait

    Science.gov (United States)

    Dylmer, C. V.; Giraudeau, J.; Hanquiez, V.; Husum, K.

    2015-04-01

    The distributions of the coccolithophore species Emiliania huxleyi and Coccolithus pelagicus (heterococcolith-bearing phase) in the northern North Atlantic were investigated along two zonal transects crossing Fram Strait and the Norwegian-Iceland Sea, respectively, each conducted during both July 2011 and September-October 2007. Remote-sensing images as well as CTD and ARGO profiles were used to constrain the physico-chemical state of the surface water and surface mixed layer at the time of sampling. Strong seasonal differences in bulk coccolithophore standing stocks characterized the northern and southern transects, where the maximum values of 53×10³ cells/l (fall) and 70×10³ cells/l (summer), respectively, were essentially explained by E. huxleyi. This pattern confirms previous findings of a summer-to-fall northwestward shift in peak coccolithophore cell densities within the Nordic Seas. While depicting an overall zonal shift in high cell densities between the summer (Norwegian Sea) and fall (northern Iceland Sea) conditions, the southern transects were additionally characterized by local peak coccolithophore concentrations associated with a geographically and temporally restricted convective process (Lofoten Gyre, summer), as well as an island mass effect (in the vicinity of Jan Mayen Island, fall). Maximum coccolithophore abundances within Fram Strait were found during both seasons close to the western frontal zone (Polar and Arctic Fronts), an area of strong density gradients where physical and chemical properties of the surface mixed layer are prone to enhance phytoplankton biomass and productivity. Here, changes in species dominance from E. huxleyi in summer to C. pelagicus in fall were related to the strengthened influence of surface AW during summer, as well as to high July solar irradiance, within an area usually characterized by C. pelagicus-dominated low-density populations.

  16. Biogeographic patterns of bacterial microdiversity in Arctic deep-sea sediments (HAUSGARTEN, Fram Strait).

    Science.gov (United States)

    Buttigieg, Pier Luigi; Ramette, Alban

    2014-01-01

    Marine bacteria colonizing deep-sea sediments beneath the Arctic Ocean, a rapidly changing ecosystem, have been shown to exhibit significant biogeographic patterns along transects spanning tens of kilometers and across water depths of several thousand meters (Jacob et al., 2013). Jacob et al. (2013) adopted what has become a classical view of microbial diversity - based on operational taxonomic units clustered at the 97% sequence identity level of the 16S rRNA gene - and observed a very large microbial community replacement at the HAUSGARTEN Long Term Ecological Research station (Eastern Fram Strait). Here, we revisited these data using the oligotyping approach and aimed to obtain new insight into ecological and biogeographic patterns associated with bacterial microdiversity in marine sediments. We also assessed the level of concordance of these insights with previously obtained results. Variation in oligotype dispersal range, relative abundance, co-occurrence, and taxonomic identity was related to environmental parameters such as water depth, biomass, and sedimentary pigment concentration. This study assesses ecological implications of the new microdiversity-based technique using a well-characterized dataset of high relevance for global change biology.

  17. The human intrinsic factor-vitamin B12 receptor, cubilin: molecular characterization and chromosomal mapping of the gene to 10p within the autosomal recessive megaloblastic anemia (MGA1) region

    DEFF Research Database (Denmark)

    Kozyraki, R; Kristiansen, M; Silahtaroglu, A

    1998-01-01

    -5445 on the short arm of chromosome 10. This is within the autosomal recessive megaloblastic anemia (MGA1) 6-cM region harboring the unknown recessive-gene locus of juvenile megaloblastic anemia caused by intestinal malabsorption of cobalamin (Imerslund-Gräsbeck's disease). In conclusion, the present molecular and genetic information on human cubilin now provides circumstantial evidence that an impaired synthesis, processing, or ligand binding of cubilin is the molecular background of this hereditary form of megaloblastic anemia. Publication date: 1998-May-15.

  18. Introduction to the use of FRAM on the effectiveness assessment of a radiopharmaceutical dispatches process

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Ana G.A.A., E-mail: agaap@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2013-07-01

    This article aims to make an introduction to the use of the Functional Resonance Analysis Method (FRAM) in the effectiveness assessment of a specific radiopharmaceutical dispatching process. The main purpose was to provide a didactic view of the method's application for further in-depth analysis. The investigation also provided a relevant body of knowledge on radiopharmaceutical dispatching processes. This work uses the term 'effectiveness assessment' instead of 'risk assessment' due to the broader meaning the former provides. The radiopharmaceutical dispatching process is the final task of a dynamic system designed to attend several medical facilities. It comprises functions involving mostly human activities, such as checking and packaging the product and measuring the radiopharmaceutical nuclear activity. Although the dispatch process has well-known steps for its completion, the human factor is the fundamental mechanism of work and control, being susceptible to irregular and unstable performance. As a socio-technical system, the risk assessment provided by FRAM may be of importance for safety and quality improvements, even more so considering the nuclear nature of the product, which makes risk assessment critical and mandatory. A system is safe if it is resistant and resilient to perturbations. Identification and assessment of possible risks is, therefore, an essential prerequisite for system safety. Although this seems obvious, most risk assessments are conducted under relative ignorance of the full behavior of the system. Such a condition has led to an approach for assessing the risks of intractable systems (i.e., systems that are incompletely described or underspecified), namely Resilience Engineering. In this area, the Functional Resonance Analysis Method has been developed in order to provide concepts, terminology and a set of methods capable of dealing with such systems. The study was conducted following the Functional Resonance Analysis Method.

  19. Introduction to the use of FRAM on the effectiveness assessment of a radiopharmaceutical dispatches process

    International Nuclear Information System (INIS)

    Pereira, Ana G.A.A.

    2013-01-01

    This article aims to make an introduction to the use of the Functional Resonance Analysis Method (FRAM) in the effectiveness assessment of a specific radiopharmaceutical dispatching process. The main purpose was to provide a didactic view of the method's application for further in-depth analysis. The investigation also provided a relevant body of knowledge on radiopharmaceutical dispatching processes. This work uses the term 'effectiveness assessment' instead of 'risk assessment' due to the broader meaning the former provides. The radiopharmaceutical dispatching process is the final task of a dynamic system designed to attend several medical facilities. It comprises functions involving mostly human activities, such as checking and packaging the product and measuring the radiopharmaceutical nuclear activity. Although the dispatch process has well-known steps for its completion, the human factor is the fundamental mechanism of work and control, being susceptible to irregular and unstable performance. As a socio-technical system, the risk assessment provided by FRAM may be of importance for safety and quality improvements, even more so considering the nuclear nature of the product, which makes risk assessment critical and mandatory. A system is safe if it is resistant and resilient to perturbations. Identification and assessment of possible risks is, therefore, an essential prerequisite for system safety. Although this seems obvious, most risk assessments are conducted under relative ignorance of the full behavior of the system. Such a condition has led to an approach for assessing the risks of intractable systems (i.e., systems that are incompletely described or underspecified), namely Resilience Engineering. In this area, the Functional Resonance Analysis Method has been developed in order to provide concepts, terminology and a set of methods capable of dealing with such systems. The study was conducted following the Functional Resonance Analysis Method. At first, the

  20. Discovery of a new ELL variable star in the constellation Centaurus and the possibility of detecting new exoplanets using the FRAM telescope

    Czech Academy of Sciences Publication Activity Database

    Pintr, Pavel; Vápenka, David; Mašek, M.

    2015-01-01

    Roč. 60, č. 2 (2015), s. 65-68 ISSN 0447-6441 R&D Projects: GA MŠk(CZ) LO1206; GA ČR GA13-10365S Institutional support: RVO:61389021 Keywords : variable star * light curve * FRAM * period analysis * exoplanet transit Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics http://jmo.fzu.cz/

  1. Biogeographic patterns of bacterial microdiversity in Arctic deep-sea sediments (Hausgarten, Fram Strait)

    Directory of Open Access Journals (Sweden)

    Pier Luigi eButtigieg

    2015-01-01

    Full Text Available Marine bacteria colonising deep-sea sediments beneath the Arctic Ocean, a rapidly changing ecosystem, have been shown to exhibit significant biogeographic patterns along transects spanning tens of kilometres and across water depths reaching several thousands of metres (Jacob et al., 2013). Jacob et al. adopted what has become a classical view of microbial diversity based on operational taxonomic units clustered at the 97% sequence identity level of the 16S rRNA gene and observed a very large microbial community replacement at the Hausgarten Long-Term Ecological Research station (Eastern Fram Strait). Here, we revisited these data using the oligotyping approach with the aims of obtaining new insights into ecological and biogeographic patterns associated with bacterial microdiversity in marine sediments and of assessing the level of concordance of these insights with previously obtained results. Variation in oligotype dispersal range, relative abundance, co-occurrence, and taxonomic identity was related to environmental parameters such as water depth, biomass, and sedimentary pigment concentration. This study assesses ecological implications of the new microdiversity-based technique using a well-characterised dataset of high relevance for global change biology.

  2. Innovations in the Assay of Un-Segregated Multi-Isotopic Grade TRU Waste Boxes with SuperHENC and FRAM Technology

    International Nuclear Information System (INIS)

    Simpson, A. P.; Barber, S.; Abdurrahman, N. M.

    2006-01-01

    The Super High Efficiency Neutron Coincidence Counter (SuperHENC) was originally developed by BIL Solutions Inc., Los Alamos National Laboratory (LANL) and the Rocky Flats Environmental Technology Site (RFETS) for assay of transuranic (TRU) waste in Standard Waste Boxes (SWB) at Rocky Flats. This mobile system was a key component in the shipment of over 4,000 SWBs to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. The system was WIPP certified in 2001 and operated at the site for four years. The success of this system, a passive neutron coincidence counter combined with high-resolution gamma spectroscopy, led to the order of two new units, delivered to Hanford in 2004. Several new challenges were faced at Hanford. For example, the original RFETS system was calibrated for segregated waste streams such that metals, plastics, wet combustibles and dry combustibles were separated by 'Item Description Codes' prior to assay. Furthermore, the RFETS mission of handling only weapons-grade plutonium enabled the original SuperHENC to benefit from the use of known Pu isotopics. Operations at Hanford, as with most other DOE sites, generate un-segregated waste streams with a wide diversity of Pu isotopics. Consequently, the new SuperHENCs are required to deal with new technical challenges. The neutron system's software and calibration methodology have been modified to encompass these new requirements. In addition, PC-FRAM software has been added to the gamma system, providing a robust isotopic measurement capability. Finally, a new software package has been developed that integrates the neutron and gamma data to provide final assay results and an analysis report. The new system's performance has been rigorously tested and validated against WIPP quality requirements. These modifications, together with the mobile platform, make the new SuperHENC far more versatile in handling diverse waste streams and allow for rapid redeployment around the DOE complex. (authors)
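
    The integration of neutron and gamma data that the record describes ultimately comes down to familiar assay arithmetic: the coincidence counter yields a 240Pu-effective mass, and the gamma-measured isotopic fractions (e.g. from PC-FRAM) convert it to total plutonium. The sketch below shows only that conversion; the weighting factors are the commonly quoted spontaneous-fission coefficients, and the input numbers are invented for illustration.

```python
# Hedged sketch of the neutron/gamma integration step: 240Pu-effective from the
# neutron counter plus isotopic fractions from the gamma system gives total Pu.
# 240Pu-eff weighting = 2.52*f238 + f240 + 1.68*f242 (commonly quoted values).
def pu240_effective_fraction(f238, f240, f242):
    """Weight fraction of 240Pu-effective from the measured isotopic fractions."""
    return 2.52 * f238 + f240 + 1.68 * f242

def total_pu_mass(m240_eff_grams, isotopic_fractions):
    """Total Pu mass from the neutron-measured 240Pu-effective mass."""
    f = isotopic_fractions
    f240_eff = pu240_effective_fraction(f["Pu238"], f["Pu240"], f["Pu242"])
    return m240_eff_grams / f240_eff

# Illustrative isotopic fractions (mass fractions of total Pu) and neutron result.
isotopics = {"Pu238": 0.0002, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.003, "Pu242": 0.0008}
m240_eff = 12.5   # grams of 240Pu-effective from the coincidence measurement

print(f"total Pu ~ {total_pu_mass(m240_eff, isotopics):.1f} g")
```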

  3. Submarine Mass Wasting on Hovgaard Ridge, Fram Strait, European Arctic

    Science.gov (United States)

    Forwick, M.; Laberg, J. S.; Husum, K.; Gales, J. A.

    2015-12-01

    Hovgaard Ridge is an 1800 m high bathymetric high in the Fram Strait, the only deep-water gateway between the Arctic Ocean and the world's other oceans. The slopes of the ridge provide evidence of various types of sediment reworking, including 1) up to 12 km wide single and merged slide scars with maximum ~30 m high headwalls and some secondary escarpments; 2) maximum 3 km wide and 130 m deep slide scars with irregular internal morphology, partly narrowing towards the foot of the slope; 3) up to 130 m deep, 1.5 km wide and maximum 8 km long channels/gullies originating from areas of increasing slope angle at the margins of a plateau on top of the ridge. Most slide scars presumably result from retrogressive failure related to weak layers in contourites or ash. The most likely trigger mechanism is seismicity related to tectonic activity within the nearby mid-ocean fracture zone. Gully/channel formation is suggested to result from cascading water masses and/or from sediment gravity flows originating from failure at the slope break after winnowing on the plateau of the ridge.

  4. Characteristics of colored dissolved organic matter (CDOM) in the Arctic outflow in the Fram Strait: Assessing the changes and fate of terrigenous CDOM in the Arctic Ocean

    OpenAIRE

    Granskog, M.A.; Stedmon, C.A.; Dodd, P.A.; Amon, R.M.W.; Pavlov, A.K.; de Steur, L.; Hansen, E.

    2012-01-01

    Absorption coefficients of colored dissolved organic matter (CDOM) were measured together with salinity, delta O-18, and inorganic nutrients across the Fram Strait. A pronounced CDOM absorption maximum between 30 and 120 m depth was associated with river and sea ice brine enriched water, characteristic of the Arctic mixed layer and upper halocline waters in the East Greenland Current (EGC). The lowest CDOM concentrations were found in the Atlantic inflow. We show that the salinity-CDOM relati...

  5. How can technology create new ways of teaching and learning in foreign language education towards 2030?

    Directory of Open Access Journals (Sweden)

    Eli-Marie Danbolt Drange

    2014-09-01

    Full Text Available In this article I discuss how technology can create new ways of teaching and learning in foreign language education towards 2030. I begin by sketching a possible future scenario in the form of a blog post written by a 15-year-old in the year 2030. In the rest of the article I take this future scenario as a starting point, compare it with the current situation, and discuss what is required for the scenario to be fulfilled. The teacher plays a key role in the development of new ways of teaching and learning, and it is only when the teacher integrates technology into his or her teaching that new practices arise. I show some concrete examples of using technology in new ways, together with reflections on developments going forward.

  6. Mathematical efficiency calibration with uncertain source geometries using smart optimization

    International Nuclear Information System (INIS)

    Menaa, N.; Bosko, A.; Bronson, F.; Venkataraman, R.; Russ, W. R.; Mueller, W.; Nizhnik, V.; Mirolo, L.

    2011-01-01

    The In Situ Object Counting Software (ISOCS), a mathematical method developed by CANBERRA, is a well-established technique for computing High Purity Germanium (HPGe) detector efficiencies for a wide variety of source shapes and sizes. In the ISOCS method, the user needs to input the geometry-related parameters such as the source dimensions, matrix composition and density, along with the source-to-detector distance. In many applications, the source dimensions, the matrix material and density may not be well known. Under such circumstances, the efficiencies may not be very accurate, since the modeled source geometry may not be very representative of the measured geometry. CANBERRA developed an efficiency optimization software known as 'Advanced ISOCS' that varies the poorly known parameters within user-specified intervals and determines the optimal efficiency shape and magnitude based on available benchmarks in the measured spectra. The benchmarks could be results from isotopic codes such as MGAU, MGA, IGA, or FRAM, activities from multi-line nuclides, and multiple counts of the same item taken in different geometries (from the side, bottom, top, etc.). The efficiency optimization is carried out using either a random search based on standard probability distributions, or numerical techniques that carry out a more directed (referred to as 'smart' in this paper) search. Measurements were carried out using representative source geometries and radionuclide distributions. The radionuclide activities were determined using the optimum efficiency and compared against the true activities. The 'Advanced ISOCS' method has many applications, among which are safeguards, decommissioning and decontamination, non-destructive assay systems and nuclear reactor outage maintenance. (authors)
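
    The optimisation idea described above can be illustrated with a toy example: treat a poorly known parameter (here, the matrix density) as a free variable within user bounds, and pick the value for which the activities inferred from several gamma lines of the same nuclide agree best. The efficiency and attenuation models in the sketch are invented stand-ins, not the ISOCS calculation, and all numerical values are assumptions.

```python
# Toy sketch of benchmark-driven efficiency optimisation: vary the matrix
# density within bounds so that per-line activities of a multi-line nuclide
# agree as closely as possible.  Efficiency/attenuation models are invented.
import numpy as np
from scipy.optimize import minimize_scalar

lines_keV = np.array([121.8, 344.3, 778.9, 1408.0])   # multi-line nuclide (assumed)
yields    = np.array([0.284, 0.266, 0.129, 0.210])    # emission probabilities
net_rates = np.array([3.10, 2.95, 1.35, 1.80])        # counts/s (illustrative)

def efficiency(energy_keV, density, thickness_cm=5.0):
    """Toy model: intrinsic efficiency falling with energy, times matrix
    self-attenuation that grows with density."""
    intrinsic = 0.02 * (energy_keV / 100.0) ** -0.7
    mu = 0.08 * (energy_keV / 100.0) ** -0.5          # cm^2/g, toy shape
    return intrinsic * np.exp(-mu * density * thickness_cm)

def spread_of_activities(density):
    """Relative spread of the per-line activities; 0 means perfect agreement."""
    activities = net_rates / (efficiency(lines_keV, density) * yields)
    return np.std(activities) / np.mean(activities)

result = minimize_scalar(spread_of_activities, bounds=(0.2, 3.0), method="bounded")
best_density = result.x
activity = np.mean(net_rates / (efficiency(lines_keV, best_density) * yields))
print(f"optimised density ~ {best_density:.2f} g/cm^3, activity ~ {activity:.0f} Bq")
```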

  7. Order-disorder transition and electrical conductivity of the brownmillerite solid-solutions system Ba2(In, M)2O5 (M=Ga, Al)

    International Nuclear Information System (INIS)

    Yamamura, Hiroshi; Hamazaki, Hirohumi; Kakinuma, Katsuyoshi; Mori, Toshiyuki; Haneda, Hajime

    1999-01-01

    The brownmillerite solid-solution systems Ba2(In1-xMx)2O5 (M = Ga, Al) were investigated by means of high-temperature X-ray diffraction (XRD), dilatometry, and electrical-conductivity measurements. XRD showed that the Ba2(In1-xGax)2O5 system had orthorhombic symmetry in the composition range 0.0≤x≤0.2 and cubic symmetry in the range 0.3≤x. The Al system also changed from orthorhombic to cubic symmetry in the range 0.2≤x. While the orthorhombic phase showed an order-disorder transition in the electrical conductivity measurements, the transition temperature decreased with increasing M content. The order-disorder transition temperature and the crystal-structure transition temperature were very different. Such a transition was not observed in the cubic phases, and their electrical conductivities were fairly low compared to those of the disordered cubic phase after the transition due to the heating process. These phenomena are discussed in terms of disordering of the tetrahedral site in the brownmillerite structure, which is occupied by the smaller Ga3+ or Al3+ rather than by In3+.

  8. Metaplastic Carcinoma with Chondroid Differentiation Arising in Microglandular Adenosis

    Directory of Open Access Journals (Sweden)

    Ga-Eon Kim

    2017-07-01

    Full Text Available Microglandular adenosis (MGA) of the breast is a rare, benign proliferative lesion but with a significant rate of associated carcinoma. Herein, we report an unusual case of metaplastic carcinoma with chondroid differentiation associated with typical MGA. Histologically, MGA showed a direct transition to metaplastic carcinoma without an intervening atypical MGA or ductal carcinoma in situ component. The immunohistochemical profile of the metaplastic carcinoma was mostly similar to that of MGA. In both areas, all the epithelial cells were positive for S-100 protein, but negative for estrogen receptor, progesterone receptor, HER2/neu, and epidermal growth factor receptor. An increase in the Ki-67 and p53 labelling index was observed from MGA to invasive carcinoma. To the best of our knowledge, this is the first case of metaplastic carcinoma with chondroid differentiation arising in MGA in Korea. This case supports the hypothesis that a subset of MGA may be a non-obligate morphologic precursor of breast carcinoma, especially the triple-negative subtype.

  9. Lingon sa Iskolarsyip sa Dulaan (1948-2007) / Looking Back on Theater Scholarship (1948-2007)

    Directory of Open Access Journals (Sweden)

    Apolonio B. Chua

    2012-12-01

    Full Text Available From 1948 to 2007, the academic community produced about a hundred titles of theses and dissertations on drama and theater, or made use of data from drama and theater for larger spheres of studies. The current article looked into identifying trends and points of emphasis as the research and studies progressed through roughly half a century of research production in the academic setting. Inductive in approach and tentative and exploratory in its analysis, the study identified four trends and points of emphasis in research production. In the fifties and sixties, the emphasis was more on studying the play text or drama; studies veered towards a literary reading and orientation. Eventually, this trend gave way to studying the larger phenomenon of mounting, and the mise en scène and the spectator became additional units of concern for research. Studies began to have sections on props, costumes, and staging techniques. In the eighties, a larger concern for looking at theater as social production followed. Participant observation, field work and ethnography gave equal emphasis to the social context of theater. Marxism and other perspectives from the social sciences framed theater studies then; correlations between theater and society became useful. Towards the last decade of the century, theater studies aimed at a more conceptual approach, emphasizing core concepts like panata and other related or equivalent terms, elevating and defining the study of theater as a study of culture itself. Using more than a hundred theses and dissertations on drama and theater, or works drawing on them, produced in the academy from 1948 to 2007, at the University of the Philippines and elsewhere, "Looking Back on Theater Scholarship (1948-2007)" aimed to take the pulse of the overall flow, emphases, directions or trends in the conduct of the studies. Introductory and exploratory in its inductive approach, the study identified four points of emphasis in the flow of

  10. Order-disorder transition and electrical conductivity of the brownmillerite solid-solutions system Ba2(In, M)2O5 (M=Ga, Al)

    CERN Document Server

    Yamamura, H; Kakinuma, K; Mori, T; Haneda, H

    1999-01-01

    The brownmillerite solid-solution systems Ba2(In1-xMx)2O5 (M=Ga, Al) were investigated by means of high-temperature X-ray diffraction (XRD), dilatometry, and electrical-conductivity measurements. XRD showed that the Ba2(In1-xGax)2O5 system had orthorhombic symmetry in the composition range 0.0≤x≤0.2 and cubic symmetry in the range 0.3≤x. The Al system also changed from orthorhombic to cubic symmetry in the range 0.2≤x. While the orthorhombic phase showed an order-disorder transition in the electrical conductivity measurements, the transition temperature decreased with increasing M content. The order-disorder transition temperature and the crystal-structure transition temperature were very different. Such a transition was not observed in the cubic phases, and their electrical conductivities were fairly low compared to those of the disordered cubic phase after the transition due to the heating process. These p...

  11. Findings from the 2012 EBRI/MGA Consumer Engagement in Health Care Survey.

    Science.gov (United States)

    Fronstin, Paul

    2012-12-01

    The 2012 EBRI/MGA Consumer Engagement in Health Care Survey finds continued slow growth in consumer-driven health plans: 10 percent of the population was enrolled in a CDHP, up from 7 percent in 2011. Enrollment in HDHPs remained at 16 percent. Overall, 18.6 million adults ages 21-64 with private insurance, representing 15.4 percent of that market, were either in a CDHP or were in an HDHP that was eligible for an HSA. When their children were counted, about 25 million individuals with private insurance, representing about 14.6 percent of the market, were either in a CDHP or an HSA-eligible plan. This study finds evidence that adults in a CDHP and those in an HDHP were more likely than those in a traditional plan to exhibit a number of cost-conscious behaviors. While CDHP enrollees, HDHP enrollees, and traditional-plan enrollees were about equally likely to report that they made use of quality information provided by their health plan, CDHP enrollees were more likely to use cost information and to try to find information about their doctors' costs and quality from sources other than the health plan. CDHP enrollees were more likely than traditional-plan enrollees to take advantage of various wellness programs, such as health-risk assessments, health-promotion programs, and biometric screenings. In addition, financial incentives mattered more to CDHP enrollees than to traditional-plan enrollees. It is clear that the underlying characteristics of the populations enrolled in these plans are different: Adults in a CDHP were significantly more likely to report being in excellent or very good health. Adults in a CDHP and those in a HDHP were significantly less likely to smoke than were adults in a traditional plan, and they were significantly more likely to exercise. CDHP and HDHP enrollees were also more likely than traditional-plan enrollees to be highly educated. As the CDHP and HDHP markets continue to expand and more enrollees are enrolled for longer periods of time

  12. Diversity and population structure of Marine Group A bacteria in the Northeast subarctic Pacific Ocean.

    Science.gov (United States)

    Allers, Elke; Wright, Jody J; Konwar, Kishori M; Howes, Charles G; Beneze, Erica; Hallam, Steven J; Sullivan, Matthew B

    2013-02-01

    Marine Group A (MGA) is a candidate phylum of Bacteria that is ubiquitous and abundant in the ocean. Despite being prevalent, the structural and functional properties of MGA populations remain poorly constrained. Here, we quantified MGA diversity and population structure in relation to nutrients and O2 concentrations in the oxygen minimum zone (OMZ) of the Northeast subarctic Pacific Ocean using a combination of catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) and 16S small subunit ribosomal RNA (16S rRNA) gene sequencing (clone libraries and 454-pyrotags). Estimates of MGA abundance as a proportion of total bacteria were similar across all three methods, although estimates based on CARD-FISH were consistently lower in the OMZ (5.6%±1.9%) than estimates based on 16S rRNA gene clone libraries (11.0%±3.9%) or pyrotags (9.9%±1.8%). Five previously defined MGA subgroups were recovered in 16S rRNA gene clone libraries and five novel subgroups were defined (HF770D10, P262000D03, P41300E03, P262000N21 and A714018). Rarefaction analysis of pyrotag data indicated that the ultimate richness of MGA was very nearly sampled. Spearman's rank analysis of MGA abundances by CARD-FISH and O2 concentrations resulted in a significant correlation. Analyzed in more detail by 16S rRNA pyrotag sequencing, MGA operational taxonomic units affiliated with subgroups Arctic95A-2 and A714018 comprised 0.3-2.4% of total bacterial sequences and displayed strong correlations with decreasing O2 concentration. This study is the first comprehensive description of MGA diversity using complementary techniques. These results provide a phylogenetic framework for interpreting future studies on ecotype selection among MGA subgroups, and suggest a potentially important role for MGA in the ecology and biogeochemistry of OMZs.
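
    The statistical step mentioned above (Spearman's rank correlation between MGA abundance and O2 concentration) is straightforward to reproduce in outline; the sketch below uses invented values purely to show the test, not the study's data.

```python
# Tiny sketch of the statistical test the record mentions: Spearman's rank
# correlation between MGA relative abundance and dissolved O2.  Values are
# invented for illustration.
from scipy.stats import spearmanr

o2_umol_kg  = [250, 180, 120, 60, 20, 5, 2, 15, 90, 210]
mga_percent = [3.1, 3.8, 4.9, 6.2, 7.5, 8.1, 7.9, 7.0, 5.5, 3.5]

rho, p_value = spearmanr(mga_percent, o2_umol_kg)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")   # expect a strong negative rho
```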

  13. Biogeography of Deep-sea benthic bacteria at regional scale (LTER HAUSGARTEN, Fram Strait, Arctic.

    Directory of Open Access Journals (Sweden)

    Marianne Jacob

    Full Text Available Knowledge on spatial scales of the distribution of deep-sea life is still sparse, but highly relevant to the understanding of dispersal, habitat ranges and ecological processes. We examined regional spatial distribution patterns of the benthic bacterial community and covarying environmental parameters such as water depth, biomass and energy availability at the Arctic Long-Term Ecological Research (LTER) site HAUSGARTEN (Eastern Fram Strait). Samples from 13 stations were retrieved from a bathymetric (1,284-3,535 m water depth; 54 km in length) and a latitudinal transect (∼2,500 m water depth; 123 km in length). 454 massively parallel tag sequencing (MPTS) and automated ribosomal intergenic spacer analysis (ARISA) were combined to describe both abundant and rare types shaping the bacterial community. This spatial sampling scheme allowed detection of up to 99% of the estimated richness on phylum and class levels. At the resolution of operational taxonomic units (97% sequence identity; OTU3%), only 36% of the Chao1 estimated richness was recovered, indicating a high diversity, mostly due to rare types (62% of all OTU3%). Accordingly, a high turnover of the bacterial community was also observed between any two sampling stations (average replacement of 79% of OTU3%), yet no direct correlation with spatial distance was observed within the region. Bacterial community composition and structure differed significantly with increasing water depth along the bathymetric transect. The relative sequence abundance of Verrucomicrobia and Planctomycetes decreased significantly with water depth, and that of Deferribacteres increased. Energy availability, estimated from phytodetrital pigment concentrations in the sediments, partly explained the variation in community structure. Overall, this study indicates a high proportion of unique bacterial types on relatively small spatial scales (tens of kilometers), and supports the sampling design of the LTER site HAUSGARTEN to

  14. Characterization of Nuclear Materials Using Complex of Non-Destructive and Mass-Spectroscopy Methods of Measurements

    International Nuclear Information System (INIS)

    Gorbunova, A.; Kramchaninov, A.

    2015-01-01

    The Information and Analytical Centre for nuclear materials investigations was established in the Russian Federation on February 2, 2009 by the ROSATOM State Atomic Energy Corporation (order #80). Its purpose is to prevent unauthorized access to nuclear materials and to exclude their illicit traffic. The Information and Analytical Centre includes an analytical laboratory that determines the composition and properties of nuclear materials of unknown origin for their identification. According to its Regulation, the Centre deals with: · identification of nuclear materials of unknown origin to provide information about their composition and properties; · arbitration analyses of nuclear materials; · comprehensive research of nuclear and radioactive materials for developing materials-characterization techniques; · interlaboratory measurements; · measurements for control and accounting; · confirmatory measurements. A complex of non-destructive and mass-spectroscopy techniques was developed for the measurements. The complex consists of: · gamma-ray techniques based on the MGAU, MGA and FRAM codes for uranium and plutonium isotopic composition; · a gravimetric technique, supplemented by gamma spectroscopy, for uranium content; · a calorimetric technique for plutonium mass; · a neutron multiplicity technique for plutonium mass; · a mass-spectroscopy technique for uranium isotopic composition; · a mass-spectroscopy technique for metallic impurities. The complex satisfies the state regulatory requirements for ensuring the uniformity of measurements, including the Russian Federation Federal Law on Ensuring the Uniformity of Measurements #102-FZ, Interstate Standard GOST R ISO/IEC 17025-2006, National Standards of the Russian Federation GOST R 8.563-2009 and GOST R 8.703-2010, and Federal Regulations NRB-99/2009 and OSPORB 99/2010. The created complex is provided with reference materials, equipment and certified techniques. The complex is included in accredited
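
    The calorimetric route listed above relies on the same arithmetic that recurs throughout this collection: gamma-ray isotopics give the mass fractions, the fraction-weighted sum of per-isotope specific powers gives the effective specific power, and the plutonium mass is the measured heat divided by that value. The sketch below shows this arithmetic with commonly tabulated specific powers (verify against the current standard before real use) and an invented isotopic result.

```python
# Hedged sketch of calorimetric-assay arithmetic: effective specific power from
# isotopic fractions, then Pu mass = measured power / effective specific power.
# Specific-power values are the commonly tabulated ones (mW/g); verify against
# the current standard before real use.
SPECIFIC_POWER_MW_PER_G = {
    "Pu238": 567.57, "Pu239": 1.9288, "Pu240": 7.0824,
    "Pu241": 3.412,  "Pu242": 0.1159, "Am241": 114.2,
}

def effective_specific_power(fractions):
    """Fractions are mass fractions relative to total Pu (Am241 expressed as
    g Am per g Pu); result is in mW per g Pu."""
    return sum(SPECIFIC_POWER_MW_PER_G[iso] * f for iso, f in fractions.items())

# Illustrative isotopic result (e.g. from MGA or FRAM), not a real measurement.
fractions = {"Pu238": 0.0002, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.003, "Pu242": 0.0008, "Am241": 0.002}
measured_power_mw = 520.0          # calorimeter reading, mW

p_eff = effective_specific_power(fractions)
pu_mass_g = measured_power_mw / p_eff
print(f"P_eff = {p_eff:.4f} mW/g, Pu mass ~ {pu_mass_g:.1f} g")
```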

  15. Characteristics of colored dissolved organic matter (CDOM) in the Arctic outflow in Fram Strait: assessing the changes and fate of terrigenous CDOM in the Arctic Ocean

    DEFF Research Database (Denmark)

    Granskog, M.A.; Stedmon, Colin; Dodd, P.A.

    2012-01-01

    Absorption coefficients of colored dissolved organic matter (CDOM) were measured together with salinity, δ18O, and inorganic nutrients across the Fram Strait. A pronounced CDOM absorption maximum between 30 and 120 m depth was associated with river and sea ice brine enriched water, characteristic of the Arctic mixed layer and upper halocline waters in the East Greenland Current (EGC). The lowest CDOM concentrations were found in the Atlantic inflow. We show that the salinity-CDOM relationship is not suitable for evaluating conservative mixing of CDOM. The strong correlation between meteoric water and CDOM is indicative of the riverine/terrigenous origin of CDOM in the EGC. Based on CDOM absorption in Polar Water and comparison with an Arctic river discharge weighted mean, we estimate that a 49–59% integrated loss of CDOM absorption across 250–600 nm has occurred. A preferential removal...

  16. Microglandular adenosis: a prime suspect in triple-negative breast cancer development.

    Science.gov (United States)

    Tsang, Julia Ys; Tse, Gary Mk

    2016-06-01

    Microglandular adenosis (MGA) and atypical MGA (AMGA) are unusual lesions of the breast. They were once regarded as benign proliferative lesions and innocent bystanders. Several lines of evidence suggested that they could be neoplastic, clonal lesions and a non-obligate precursor for triple-negative breast cancers (TNBC). Recent work published in The Journal of Pathology by Guerini-Rocco and colleagues provided further evidence regarding the precursor-product relationship between MGA/AMGA and TNBC. Using a massively parallel sequencing approach, they demonstrated that MGA/AMGA, particularly those associated with TNBC, could be clonal neoplastic lesions showing clonal non-synonymous mutations, but none in pure MGA. Importantly, those alterations were observed in the associated TNBC. They were also able to identify recurrent alterations in TP53 in those MGA/AMGA cases as well as their associated TNBC. The findings, in conjunction with others, underscore the significance for MGA in clinical diagnosis. The potential of a benign lesion to progress into an aggressive malignant tumour implies that modification of the current management approach may be necessary. Copyright © 2016 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.

  17. Integration of radiation protection in occupational health and safety management systems - legal requirements and practical realization at the example of the Fraunhofer occupational health and safety management system FRAM

    International Nuclear Information System (INIS)

    Lambotte, S.; Severitt, S.; Weber, U.

    2002-01-01

    The protection of employees, the public and the environment from the effects of radiation is regulated by numerous laws and rules set by the government and the occupational accident insurers. These rules apply primarily to the responsible persons, normally the employer, as well as to the safety officers. Occupational safety management systems can support these people in carrying out their tasks and responsibilities effectively. A systematic organisational approach also ensures that the numerous duties regarding documentation, the checking of records and the observance of deadlines are fulfilled. Furthermore, legal certainty for the responsible persons and safety officers is increased, and occupational, environmental, radiation and health protection is promoted. Using the example of the Fraunhofer occupational safety management system (FRAM), it is demonstrated how radiation protection (ionizing radiation) can be integrated into a progressive, intranet-supported management system. (orig.)

  18. Pagsusuri sa mga Balyung Nakapaloob sa mga Salawikain ng mga Tiruray Sa South Upi, Maguindanao, Philippines

    Directory of Open Access Journals (Sweden)

    Maria Luz D. Calibayan

    2015-12-01

    Full Text Available This study sought to collect, record, translate, and analyze the Tiruray proverbs. Specifically, the study aimed to find out the following: (1) What are the existing proverbs of the Tiruray in South Upi, Maguindanao? (2) What values, as expressed in the proverbs of the Tiruray, have been successfully preserved from their ancestors? The scope of this study was confined to the collected proverbs of the Tiruray in South Upi, Maguindanao. The gathered proverbs were transcribed from Tiruray into the Filipino language, and the analysis of values from the translated proverbs was carried out in accordance with the ideas of Andres (1985) and Timbreza (2003). The research design adopted in the study is descriptive content analysis, because the main objective of the study was to analyse the values found in Tiruray proverbs. Findings revealed that the Tiruray have a rich body of oral literary pieces, such as proverbs, which are transmitted by word of mouth from generation to generation. The Tiruray proverbs contain different human values that teach or remind people how to live godly lives. Further, Tiruray proverbs convey a message about how they value peace and harmony in the community, for they are a peace-loving people. The study of Tiruray proverbs could increase the body of knowledge about the cultural traits of the Tiruray, who are a unique people with a rich cultural heritage.

  19. On basicity and composition of molybdogermanium heteropoly acid

    International Nuclear Information System (INIS)

    Mirzoyan, F.V.; Tarayan, V.M.; Petrosyam, A.A.

    1984-01-01

    Data on the number of singly charged cations of the basic dye (BD) associated with the anion of molybdogermanium heteropoly acid (equal to the effective basicity of the acid), together with data from chemical analysis of the solid-phase BD-MGA compounds formed, were used to establish that the composition and basicity of MGA depend on the nature of the dye precipitant. The use of acridine orange results in the stabilization and precipitation of the octasubstituted salt of 12MGA, whereas acriflavine gives the tetrasubstituted salt of 8MGA. The formation of these salts is independent of the acidity of the medium over a wide range: pH 0.45-4.80 and pH 0-3.8, respectively

  20. Findings from the 2009 EBRI/MGA Consumer Engagement in Health Care Survey.

    Science.gov (United States)

    Fronstin, Paul

    2009-12-01

    FIFTH ANNUAL SURVEY: This Issue Brief presents findings from the 2009 EBRI/MGA Consumer Engagement in Health Care Survey, which provides nationally representative data regarding the growth of consumer-driven health plans (CDHPs) and high-deductible health plans (HDHPs), and the impact of these plans and consumer engagement more generally on the behavior and attitudes of adults with private health insurance coverage. Findings from this survey are compared with four earlier annual surveys. ENROLLMENT LOW BUT GROWING: In 2009, 4 percent of the population was enrolled in a CDHP, up from 3 percent in 2008. Enrollment in HDHPs increased from 11 percent in 2008 to 13 percent in 2009. The 4 percent of the population with a CDHP represents 5 million adults ages 21-64 with private insurance, while the 13 percent with a HDHP represents 16.2 million people. Among the 16.2 million individuals with an HDHP, 38 percent (or 6.2 million) reported that they were eligible for a health savings account (HSA) but did not have such an account. Overall, 11.2 million adults ages 21-64 with private insurance, representing 8.9 percent of that market, were either in a CDHP or were in an HDHP that was eligible for an HSA, but had not opened the account. MORE COST-CONSCIOUS BEHAVIOR: Individuals in CDHPs were more likely than those with traditional coverage to exhibit a number of cost-conscious behaviors. They were more likely to say that they had checked whether the plan would cover care; asked for a generic drug instead of a brand name; talked to their doctor about prescription drug options, other treatments, and costs; asked their doctor to recommend a less costly prescription drug; developed a budget to manage health care expenses; checked prices before getting care; and used an online cost-tracking tool. CDHP MORE ENGAGED IN WELLNESS PROGRAMS: CDHP enrollees were more likely than traditional plan enrollees to report that they had the opportunity to fill out a health risk assessment

  1. Labeling RNAs in Live Cells Using Malachite Green Aptamer Scaffolds as Fluorescent Probes.

    Science.gov (United States)

    Yerramilli, V Siddartha; Kim, Kyung Hyuk

    2018-03-16

    RNAs mediate many different processes that are central to cellular function. The ability to quantify or image RNAs in live cells is very useful in elucidating such functions of RNA. RNA aptamer-fluorogen systems have been increasingly used to label RNAs in live cells. Here, we use the malachite green aptamer (MGA), an RNA aptamer that can specifically bind to the malachite green (MG) dye and induce it to emit far-red fluorescence signals. Previous studies on MGA showed a potential for the use of MGA for genetically tagging other RNA molecules in live cells. However, these studies also exhibited low fluorescence signals and high background noise. Here we constructed and tested RNA scaffolds containing multiple tandem repeats of MGA as a strategy to increase the brightness of the MGA aptamer-fluorogen system, as well as to make the system fluoresce when tagging various RNA molecules in live cells. We demonstrate that our MGA scaffolds can increase fluorescence signals by up to ∼20-fold compared to the basal level when used as a genetic tag for other RNA molecules. We also show that our scaffolds function reliably as genetically encoded fluorescent tags for mRNAs of fluorescent proteins and other RNA aptamers.

  2. Diversity and population structure of Marine Group A bacteria in the Northeast subarctic Pacific Ocean

    OpenAIRE

    Allers, Elke; Wright, Jody J; Konwar, Kishori M; Howes, Charles G; Beneze, Erica; Hallam, Steven J; Sullivan, Matthew B

    2012-01-01

    Marine Group A (MGA) is a candidate phylum of Bacteria that is ubiquitous and abundant in the ocean. Despite being prevalent, the structural and functional properties of MGA populations remain poorly constrained. Here, we quantified MGA diversity and population structure in relation to nutrients and O2 concentrations in the oxygen minimum zone (OMZ) of the Northeast subarctic Pacific Ocean using a combination of catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) and ...

  3. Function and dynamics of aptamers: A case study on the malachite green aptamer

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Tianjiao [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Aptamers are short single-stranded nucleic acids that can bind to their targets with high specificity and high affinity. To study aptamer function and dynamics, the malachite green aptamer was chosen as a model. Malachite green (MG) bleaching, in which an OH- attacks the central carbon (C1) of MG, was inhibited in the presence of the malachite green aptamer (MGA). The inhibition of MG bleaching by MGA could be reversed by an antisense oligonucleotide (AS) complementary to the MGA binding pocket. Computational cavity analysis of the NMR structure of the MGA-MG complex predicted that the OH- is sterically excluded from the C1 of MG. The prediction was confirmed experimentally using variants of the MGA with changes in the MG binding pocket. This work shows that molecular reactivity can be reversibly regulated by an aptamer-AS pair based on steric hindrance. In addition to demonstrating that aptamers can control molecular reactivity, aptamer dynamics was studied with a strategy combining molecular dynamics (MD) simulation and experimental verification. MD simulation predicted that the MG binding pocket of the MGA is largely pre-organized and that binding of MG involves reorganization of the pocket and a simultaneous twisting of the MGA terminal stems around the pocket. MD simulation also provided a 3D-structure model of unoccupied MGA that has not yet been obtained by biophysical measurements. These predictions were consistent with biochemical and biophysical measurements of the MGA-MG interaction, including RNase I footprinting, melting curves, and measurements of thermodynamic and kinetic constants. This work shows that MD simulation can be used to extend our understanding of the dynamics of aptamer-target interaction, which is not evident from static 3D-structures. To conclude, I have developed a novel concept to control molecular reactivity by an aptamer based on steric protection and a strategy to study the dynamics of aptamer-target interaction by combining MD

  4. Findings from the 2011 EBRI/MGA Consumer Engagement in Health Care Survey.

    Science.gov (United States)

    Fronstin, Paul

    2011-12-01

    SEVENTH ANNUAL SURVEY: This Issue Brief presents findings from the 2011 EBRI/MGA Consumer Engagement in Health Care Survey. This study is based on an online survey of 4,703 privately insured adults ages 21-64 to provide nationally representative data regarding the growth of consumer-driven health plans (CDHPs) and high-deductible health plans (HDHPs), and the impact of these plans and consumer engagement more generally on the behavior and attitudes of adults with private health insurance coverage. Findings from this survey are compared with EBRI's findings from earlier surveys. ENROLLMENT CONTINUES TO GROW: The survey finds continued growth in consumer-driven health plans: In 2011, 7 percent of the population was enrolled in a CDHP, up from 5 percent in 2010. Enrollment in HDHPs increased from 14 percent in 2010 to 16 percent in 2011. The 7 percent of the population with a CDHP represents 8.4 million adults ages 21-64 with private insurance, while the 16 percent with an HDHP represents 19.3 million people. Among the 19.3 million individuals with an HDHP, 38 percent (or 7.3 million) reported that they were eligible for a health savings account (HSA) but did not have such an account. Overall, 15.8 million adults ages 21-64 with private insurance, representing 13.1 percent of that market, were either in a CDHP or were in an HDHP that was eligible for an HSA but had not opened the account. When their children are counted, about 21 million individuals with private insurance, representing about 12 percent of the market, were either in a CDHP or an HSA-eligible plan. MORE COST-CONSCIOUS BEHAVIOR: Individuals in CDHPs were more likely than those with traditional coverage to exhibit a number of cost-conscious behaviors. They were more likely to say that they had checked whether their plan would cover care; asked for a generic drug instead of a brand name; talked to their doctor about treatment options and costs; talked to their doctor about prescription drug options and costs

  5. Studies on entrained DNPPA separation by charcoal adsorption from aqueous solutions generated during uranium recovery from strong phosphoric acid

    International Nuclear Information System (INIS)

    Singh, D.K.; Vijayalakshmi, R.; Singh, H.

    2010-01-01

    During the separation of metal ions by solvent extraction in hydrometallurgical operations, organic solvents either get entrained in or dissolve into various types of aqueous streams, and they need to be separated out to prevent environmental pollution and solvent loss. Generally, entrained solvents are separated on the plant scale by parallel plate separators or by froth flotation cells, while dissolved solvents are recovered either by an organic diluent wash or by charcoal adsorption. A novel process has been developed to recover uranium from merchant grade phosphoric acid (MGA) employing a synergistic mixture of DNPPA (di-nonyl phenyl phosphoric acid) and TOPO (tri-n-octyl phosphine oxide) dissolved in petrofin. After recovery of uranium, the MGA has to be returned to the host company for the production of fertilizer. This MGA has to be free of any contamination by DNPPA and TOPO. Separation of DNPPA and TOPO from MGA by the diluent wash method has been reported. There is no information available in the literature on the separation of DNPPA and TOPO from such aqueous streams by carbon adsorption. The present investigation describes a methodology based on charcoal adsorption (batch and continuous column operation) to separate DNPPA from MGA. Three different types of charcoal, namely coconut shell based, coal based and pelletized charcoal, were evaluated for DNPPA separation from MGA containing 100 mg/L DNPPA. It was found that the percentage DNPPA adsorption in a single contact (0.5 g C/50 ml) was 57, 34 and 10 for coconut shell based, coal based and pelletized charcoal, respectively. Based on these results, the coconut shell based charcoal was selected for further study. Adsorption of DNPPA by coconut shell based charcoal was investigated by carrying out experiments with 50 ml MGA containing 770 mg/L DNPPA, adding 1 to 7 g of charcoal in separate beakers

  6. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  7. Operation of a Hovercraft Scientific Platform Over Sea Ice in the Arctic Ocean Transpolar Drift (81 - 85N): The FRAM-2012 Experience

    Science.gov (United States)

    Hall, J. K.; Kristoffersen, Y.

    2013-12-01

    We have tested the feasibility of hovercraft travel through predominantly first year ice of the Transpolar Drift between 81°N - 85°N north of Svalbard. With 2-9 ridges per kilometer, our hovercraft (Griffon TD2000 Mark II), with an effective hover height of about 0.5 m, had to travel a distance 1.3 times the great circle distance between the point of origin and the final destination. Instantaneous speeds were mostly 5-7 knots. Two weeks later icebreaker Oden completed the same transit under conditions with no significant pressure in the ice at a speed mostly 1 knot higher than the hovercraft and travelled 1.2 times the great circle distance. The hovercraft spent 25 days monitoring micro-earthquake activity of the Arctic Mid-Ocean Ridge at a section of the spreading center where no seismicity has been recorded by the global seismograph network. More than ten small earthquake events per day were recorded. Visibility appears to be the most critical factor to hovercraft travel in polar pack ice. Improved control of hovercraft motion would substantially increase the potential usefulness of hovercraft in the sea ice environment. Figure caption: University of Bergen graduate student Gaute Hope emplacing one of the hydrophones in the triangular array used to locate small earthquakes over the Gakkel Ridge rift valley around 85N during FRAM-2012. The research hovercraft R/H SABVABAA is in the background.

  8. Sa Pusod ng Lungsod: Mga Alamat, Mga Kababalaghan Bilang Mitolohiyang Urban

    Directory of Open Access Journals (Sweden)

    Eugene Y. Evasco

    2000-06-01

    Full Text Available Local urban legends, such as narratives about supernatural occurrences and mythological characters (the manananggal of Tondo; the white lady at Loakan Road, Baguio, and Balete Drive, Quezon City; the kambal-ahas of Robinson's Galleria and a mall in Davao City; the pugot-ulo of Tagaytay City; the engkantada of the market; the tiyanak; the mandaragit (Haring Kuto); the salvation army of Calbayog City; the lost souls in buildings that have existed since the Philippine Revolution and World War II; and the anitos in mango and aratiles trees within the city), are intimately connected with the folk beliefs, philosophy, and mythology of the country. Interview results from various urbanized locales in the Philippines provide the context of the metaphors, motifs, and images in the stories, via a yearlong documentation and transcription. They also corroborate what sociologists and anthropologists have previously concluded: that the tensions due to urbanization, alienation, politics, technology, militarization, ecological degradation, and industrialization have been instrumental in the creation of modern folklore. The development of oral literature in the Philippines and the use of urban legends have effectively contributed to the production of the popular culture and literature of the Philippines.

  9. Correlation among the dysphonia severity index (DSI), the RBH voice perceptual evaluation, and minimum glottal area in female patients with vocal fold nodules.

    Science.gov (United States)

    Hussein Gaber, Ammar Gaber; Liang, Fa-Ya; Yang, Jin-Shan; Wang, Ya-Jing; Zheng, Yi-Qing

    2014-01-01

    To investigate the clinical significance and correlation of the dysphonia severity index (DSI), the RBH (roughness [R]; breathiness [B]; hoarseness [H]) perceptual voice quality evaluation, and the minimum glottal area (MGA) in patients with vocal fold nodules, and to further validate the practicality of the DSI. The DSI evaluation, the RBH voice perceptual evaluation, and the MGA measurement were performed on 30 female patients with vocal fold nodules (the patient group) and 30 female volunteers with normal voices (the control group). The DSI was calculated using the following formula: DSI = 0.13 × MPT + 0.0053 × F0-High - 0.26 × I-Low - 1.18 × Jitter(%) + 12.4. The RBH evaluation was graded on a four-point scale. The MGA was measured by KayPENTAX Kips (7105) software. The differences among the DSI, the RBH grade, and the MGA of the patients were compared. The median DSI values of the patient group and the control group were -0.81 and 3.79, respectively, and the difference was statistically significant. The DSI evaluation of dysphonia in female patients with vocal nodules has significant clinical application and good correlation with the MGA measurement. Copyright © 2014 The Voice Foundation. All rights reserved.
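
    Because the DSI is a simple linear combination of four acoustic measures, it can be computed directly once the maximum phonation time (MPT, in seconds), the highest fundamental frequency (F0-High, in Hz), the lowest intensity (I-Low, in dB) and the jitter (in percent) are known. The short Python sketch below only restates the formula quoted in the abstract; the function name and the example input values are illustrative assumptions, not data from the study.

        def dysphonia_severity_index(mpt_s: float, f0_high_hz: float,
                                     i_low_db: float, jitter_pct: float) -> float:
            """Dysphonia Severity Index, as given in the abstract:
            DSI = 0.13*MPT + 0.0053*F0-High - 0.26*I-Low - 1.18*Jitter(%) + 12.4
            """
            return (0.13 * mpt_s + 0.0053 * f0_high_hz
                    - 0.26 * i_low_db - 1.18 * jitter_pct + 12.4)

        # Hypothetical values for a healthy voice: long phonation time, high F0
        # ceiling, soft minimum intensity and low jitter give a clearly positive DSI.
        print(round(dysphonia_severity_index(25.0, 600.0, 50.0, 0.4), 2))  # -> 5.36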

  10. Improvement of the Response Time in an Open Source Audioconference Architecture Based on SIP Multicast Implemented with JainSIP, JainSDP and JGAP Libraries

    Directory of Open Access Journals (Sweden)

    Carlos M. Moreno

    2014-06-01

    Full Text Available Group services like audioconferencing require a minimum level of quality of service for multicast sessions. This work proposes a new overlay multicast architecture based on SIP extensions and a genetic algorithm. The architecture consists of a SIP Extender client (SE), a Multicast Gateway Agent (MGA) and a Multicast Manager (MM). The SE receives information about the most suitable MGA for it, as determined by a genetic algorithm inside the MM, then connects to the chosen MGA and maintains a connection with the MM itself. The genetic algorithm is implemented with the JGAP (Java Genetic Algorithm Package) libraries. The SE and MGA are programmed with the JainSIP and JainSDP libraries, which contain Java structures associated with the SIP protocol and session description. Some experiments over UTP wired and WiFi IEEE 802.11n networks were performed. Partial results with static and dynamic MGA selection show that, if we compare the joining and leaving times measured inside a station containing the SE client programmed with the JainSIP and JainSDP libraries against the proprietary SJphone client, the software engineering may have more influence than the medium access method on the response time for a potential group member. Moreover, the genetic algorithm at the MM minimizes the response time at large scale.
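
    The gateway selection described above is a combinatorial optimization: each SE client must be assigned to one MGA so that the expected joining/response time is minimized, and the MM searches the assignment space with a genetic algorithm (implemented in the paper with JGAP in Java). Purely as a language-neutral illustration of that idea, the Python sketch below evolves SE-to-MGA assignments over an invented delay matrix; the matrix, the fitness definition and all parameters are assumptions, not the authors' implementation.

        import random

        # Hypothetical estimated response times (ms) from each SE client (rows)
        # to each candidate MGA (columns).
        DELAYS = [
            [12.0, 35.0, 48.0],
            [30.0, 11.0, 44.0],
            [42.0, 38.0, 15.0],
            [25.0, 27.0, 29.0],
        ]
        N_SE, N_MGA = len(DELAYS), len(DELAYS[0])

        def fitness(assignment):
            # Lower total delay is better; the GA maximizes, so negate the sum.
            return -sum(DELAYS[se][mga] for se, mga in enumerate(assignment))

        def evolve(pop_size=30, generations=50, mutation_rate=0.1):
            pop = [[random.randrange(N_MGA) for _ in range(N_SE)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                survivors = pop[:pop_size // 2]
                children = []
                while len(survivors) + len(children) < pop_size:
                    a, b = random.sample(survivors, 2)
                    cut = random.randrange(1, N_SE)        # one-point crossover
                    child = a[:cut] + b[cut:]
                    if random.random() < mutation_rate:    # point mutation
                        child[random.randrange(N_SE)] = random.randrange(N_MGA)
                    children.append(child)
                pop = survivors + children
            return max(pop, key=fitness)

        best = evolve()
        print("SE -> MGA assignment:", best, "total delay:", -fitness(best))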

  11. Cubilin P1297L mutation associated with hereditary megaloblastic anemia 1 causes impaired recognition of intrinsic factor-vitamin B(12) by cubilin

    DEFF Research Database (Denmark)

    Kristiansen, M; Aminoff, M; Jacobsen, Christian

    2000-01-01

    Megaloblastic anemia 1 (MGA1) is an autosomal recessive disorder caused by the selective intestinal malabsorption of intrinsic factor (IF) and vitamin B(12)/cobalamin (Cbl) in complex. Most Finnish patients with MGA1 carry the disease-specific P1297L mutation (FM1) in the IF-B(12) receptor, cubilin......-IF-Cbl in cubilin-expressing epithelial cells. In conclusion, the data presented show a substantial loss in affinity of the FM1 mutant form of the IF-Cbl binding region of cubilin. This now explains the malabsorption of Cbl and Cbl-dependent anemia in MGA1 patients with the FM1 mutation. (Blood. 2000...

  12. Ecotoxicological impact of Zequanox®, a novel biocide, on selected non-target Irish aquatic species.

    Science.gov (United States)

    Meehan, Sara; Shannon, Adam; Gruber, Bridget; Rackl, Sarahann M; Lucy, Frances E

    2014-09-01

    Effective, species-specific zebra mussel control is needed urgently for Ireland's freshwater bodies, which became infested with non-native zebra mussels in the 1990s. Zequanox®, a newly commercialized product for zebra and quagga mussel control, is composed of dead Pseudomonas fluorescens CL 145A cells. This paper describes ecotoxicology tests on three representative native Irish freshwater species: Anodonta (duck mussel), Chironomus plumosus (non-biting midge), and Austropotamobius pallipes (white-clawed crayfish). The species were exposed to Zequanox in a 72-h static renewal toxicity test at concentrations of 100-750 mg active ingredient per liter (mg a.i./L). Water quality parameters were measured every 12-24 h before and after water and product renewal. After 72 h, endpoints were reported as LC10, LC50, and LC100. The LC50 values derived were (1) Anodonta: ≥500 mg a.i./L, (2) C. plumosus: 1075 mg a.i./L, and (3) A. pallipes: ≥750 mg a.i./L. These results demonstrate that Zequanox does not negatively affect these organisms at the concentration required for >80 percent zebra mussel mortality (150 mg a.i./L) and the maximum allowable treatment concentration in the United States (200 mg a.i./L). They also show the overall species-specificity of Zequanox, and support its use in commercial facilities and open waters. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  14. Water use and quality of fresh surface-water resources in the Barataria-Terrebonne Basins, Louisiana

    Science.gov (United States)

    Johnson-Thibaut, Penny M.; Demcheck, Dennis K.; Swarzenski, Christopher M.; Ensminger, Paul A.

    1998-01-01

    Approximately 170 Mgal/d (million gallons per day) of ground- and surface-water was withdrawn from the Barataria-Terrebonne Basins in 1995. Of this amount, surface water accounted for 64 percent (110 Mgal/d) of the total withdrawal rates in the basins. The largest surface-water withdrawal rates were from Bayou Lafourche (40 Mgal/d), Bayou Boeuf (14 Mgal/d), and the Gulf Intracoastal Waterway (4.2 Mgal/d). The largest ground-water withdrawal rates were from the Mississippi River alluvial aquifer (29 Mgal/d), the Gonzales-New Orleans aquifer (9.5 Mgal/d), and the Norco aquifer (3.6 Mgal/d). The amounts of water withdrawn in the basins in 1995 differed by category of use. Public water suppliers within the basins withdrew 41 Mgal/d of water. The five largest public water suppliers in the basins withdrew 30 Mgal/d of surface water: Terrebonne Waterworks District 1 withdrew the largest amount, almost 15 Mgal/d. Industrial facilities withdrew 88 Mgal/d, fossil-fuel plants withdrew 4.7 Mgal/d, and commercial facilities withdrew 0.67 Mgal/d. Aggregate water-withdrawal rates, compiled by parish for aquaculture (37 Mgal/d), livestock (0.56 Mgal/d), rural domestic (0.44 Mgal/d), and irrigation uses (0.54 Mgal/d), totaled about 38 Mgal/d in the basins. Ninety-five percent of aquaculture withdrawal rates, primarily for crawfish and alligator farming, were from surface-water sources. Total water-withdrawal rates increased 221 percent from 1960-95. Surface-water withdrawal rates have increased by 310 percent, and ground-water withdrawal rates have increased by 133 percent. The projection for the total water-withdrawal rates in 2020 is 220 Mgal/d, an increase of 30 percent from 1995. Surface-water withdrawal rates would account for 59 percent of the total, or 130 Mgal/d. Surface-water withdrawal rates are projected to increase by 20 percent from 1995 to 2020. Analysis of water-quality data from the Mississippi River indicates that the main threats to surface water resources are

  15. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes....

  16. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. The automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself; the upper- and lower-level codes of the selected entry were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of data fields. Because this program was written in 'User's Defined Function' form, decoding of a stored ACR code was achieved by the same program, and incorporation of this program into another data-processing program was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, this program can be used for automation of routine work in the department of radiology
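
    The two-stage lookup described in the abstract (an organ dictionary, then a pathology dictionary chosen by the first digit of the organ code) is easy to mimic. The toy Python sketch below only illustrates that flow; the dictionary entries are invented placeholders, not actual ACR code assignments, and the real program used eleven FoxBASE dictionary files.

        # Hypothetical miniature dictionaries standing in for the 11 FoxBASE files.
        ORGAN_CODES = {"example organ": "131"}
        PATHOLOGY_FILES = {
            # pathology dictionary selected by the first digit of the organ code
            "1": {"example finding": "3661"},
        }

        def acr_code(organ: str, finding: str) -> str:
            organ_code = ORGAN_CODES[organ]
            pathology_code = PATHOLOGY_FILES[organ_code[0]][finding]
            return f"{organ_code}.{pathology_code}"

        # Reproduces the shape of the abstract's example code '131.3661'.
        print(acr_code("example organ", "example finding"))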

  17. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  18. Risk assessment of insecticides used in rice on miridbug, Cyrtorhinus lividipennis Reuter, the important predator of brown planthopper, Nilaparvata lugens (Stal.).

    Science.gov (United States)

    Preetha, G; Stanley, J; Suresh, S; Samiyappan, R

    2010-07-01

    The green miridbug, Cyrtorhinus lividipennis, an important natural enemy of the rice brown planthopper (BPH), Nilaparvata lugens, plays a major role as a predator in suppressing the pest population. The study assessed the impact of certain potential insecticides used in the rice ecosystem on the miridbug predator and the brown planthopper through contact toxicity. Eleven insecticides, including neonicotinoids, diamides, azomethine pyridines, carbamates, pyrethroids, organophosphates and cyclodienes, were selected to test their toxicities against the nymphs of C. lividipennis and N. lugens. The median lethal concentration (LC(50)) was determined for each insecticide using an insecticide-coated vial (scintillation) residue bioassay, which revealed BPMC as the most toxic chemical with an LC(50) of 0.003 mg a.i. L(-1), followed by ethofenprox and clothianidin with LC(50) values of 0.006 mg a.i. L(-1) at 48 HAT against C. lividipennis, and ethofenprox as the most toxic chemical with an LC(50) of 0.009 mg a.i. L(-1), followed by clothianidin with an LC(50) of 0.211 mg a.i. L(-1), at 48 h after treatment (HAT) against N. lugens. Among the insecticides tested, the cyclodiene compound endosulfan had the lowest acute contact toxicity (LC(50) = 66.65 mg a.i. L(-1) at 48 HAT) to C. lividipennis. Among the insecticides tested, endosulfan, chlorpyriphos, acephate and methyl parathion are regarded as safer to C. lividipennis based on the selectivity ratio, hazard quotient and probit substitution methods of risk assessment. 2010 Elsevier Ltd. All rights reserved.

  19. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and water-moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flow-rate conditions, constant or variable in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user, and a schematic representation of safety-rod behavior is included. The code (Cactus) is a one-dimensional, multi-channel code and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  20. Evaluation Codes from an Affine Veriety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study...... includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  1. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach, the focus is on graph-theoretic and algebraic methods, and a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  2. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Patriarca, Riccardo; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-01-01

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
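
    FRAM itself is qualitative; the semi-quantitative extension summarized above scores the potential variability of each system function and propagates those scores with Monte Carlo sampling. The Python sketch below is only an illustration of that general idea under invented assumptions (three plant functions, triangular variability distributions, a simple multiplicative coupling); it is not the authors' model of the sinter plant.

        import random

        # Hypothetical FRAM functions with (low, mode, high) output-variability scores.
        FUNCTIONS = {
            "charge raw mix": (1.0, 2.0, 4.0),
            "ignite sinter bed": (1.0, 1.5, 3.0),
            "de-dust waste gas": (1.0, 3.0, 6.0),
        }
        COUPLING = 1.2   # assumed downstream amplification when a function is more variable than usual

        def sample_risk_index() -> float:
            # Sample each function's variability and aggregate along the chain.
            total, upstream = 0.0, 1.0
            for low, mode, high in FUNCTIONS.values():
                v = random.triangular(low, high, mode)
                total += v * upstream
                upstream *= COUPLING if v > mode else 1.0
            return total

        runs = sorted(sample_risk_index() for _ in range(10_000))
        print("median risk index:", round(runs[len(runs) // 2], 2))
        print("95th percentile:  ", round(runs[int(0.95 * len(runs))], 2))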

  3. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    Energy Technology Data Exchange (ETDEWEB)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco; Tronci, Massimo

    2017-03-15

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  4. FLUTAN input specifications

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Baumann, W.; Willerding, G.

    1991-05-01

    FLUTAN is a highly vectorized computer code for 3-D fluid-dynamic and thermal-hydraulic analyses in Cartesian and cylinder coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA. To a large extent, FLUTAN relies on basic concepts and structures imported from COMMIX-1B and COMMIX-2, which were made available to KfK in the frame of cooperation contracts in the fast reactor safety field. While on the one hand not all features of the original COMMIX versions have been implemented in FLUTAN, the code on the other hand includes some essential innovative options like the CRESOR solution algorithm, a general 3-dimensional rebalancing scheme for solving the pressure equation, and LECUSSO-QUICK-FRAM techniques suitable for reducing 'numerical diffusion' in both the enthalpy and momentum equations. This report provides users with detailed input instructions, presents formulations of the various model options, and explains, by means of a comprehensive sample input, how to use the code. (orig.)

  5. Motivation of hens to obtain feed during a molt induced by feed withdrawal, wheat middlings, or melengestrol acetate.

    Science.gov (United States)

    Koch, J M; Lay, D C; McMunn, K A; Moritz, J S; Wilson, M E

    2007-04-01

    Traditionally, molting was initiated by withdrawing feed. However, public criticism of feed deprivation, based on the perception that it inhumanely increases hunger, has led the poultry industry to ban the practice. Thus far, alternatives have not been demonstrated to ameliorate the increase in hunger that led to the ban on inducing molting by feed deprivation. Incorporating melengestrol acetate (MGA), an orally active progestin, into a balanced layer diet induces molting and increases postmolt egg quality. Hy-Line W-98 hens (n = 60) were randomly assigned to a balanced layer ration (control), a balanced layer ration containing MGA, or a 94% wheat middlings diet (wheat) for 20 d, or were feed deprived for 8 d. Hens were trained to peck a switch to receive a feed reward based on a progressive ratio reinforcement schedule. Motivation of hens to acquire feed was measured as the total number of pecks recorded in 15 min on d 0, 4, 8, 12, 16, and 20. On d 20, abdominal fat pads and digesta-free gizzards were weighed. The number of pecks in the feed-deprived group was greater than that of controls by d 4 and remained greater at d 8, when these hens were removed from the experiment. Hens in the wheat group that were rewarded with a layer diet pecked more than controls from d 8 to 20. Hens in the MGA group pecked for a reward at the same rate as control hens throughout the experiment. Hens fed the wheat diet had heavier gizzards compared with control and MGA-fed hens. Hens fed MGA had a greater abdominal fat pad compared with wheat and control hens. Hens molted using a diet containing MGA have a motivation to obtain feed similar to that of control hens; therefore, this alternative does not appear to increase hunger. However, hens molted with a wheat middlings diet appear to be as motivated to obtain feed as the feed-deprived hens.

  6. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...

  7. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
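
    In the graph formulation cited above, a trinucleotide code X is turned into a directed graph by splitting every codon b1b2b3 into the two edges b1 -> b2b3 and b1b2 -> b3; circularity of the code can then be tested as the absence of directed cycles in that graph. The Python sketch below implements only this basic construction and cycle test as an illustration; the toy example codes are invented, and the self-complementarity and longest-path analyses of the paper are not reproduced.

        def code_graph(code):
            """For each codon b1b2b3, add edges b1 -> b2b3 and b1b2 -> b3."""
            edges = set()
            for codon in code:
                edges.add((codon[0], codon[1:]))
                edges.add((codon[:2], codon[2]))
            return edges

        def has_cycle(edges):
            """Depth-first search for a directed cycle."""
            adj = {}
            for u, v in edges:
                adj.setdefault(u, []).append(v)
            WHITE, GREY, BLACK = 0, 1, 2
            colour = {v: WHITE for e in edges for v in e}

            def visit(u):
                colour[u] = GREY
                for v in adj.get(u, []):
                    if colour[v] == GREY or (colour[v] == WHITE and visit(v)):
                        return True
                colour[u] = BLACK
                return False

            return any(colour[v] == WHITE and visit(v) for v in list(colour))

        def is_circular(code):
            # A trinucleotide code is circular iff its associated graph is acyclic.
            return not has_cycle(code_graph(code))

        # Invented toy examples: the second set cannot be circular because a ring
        # of AAA codons loses its reading frame.
        print(is_circular({"ACG", "GTC", "CTA"}))   # True
        print(is_circular({"AAA", "ACG"}))          # False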

  8. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also to suppress the effect of phase induced intensity noise (PIIN). In this paper, we have proposed new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix, using simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans with high data rates.

  9. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list...... decoding algorithm for matrix-product codes with polynomial units, which are quasi-cyclic codes. Furthermore, it allows us to consider unique decoding for matrix-product codes with polynomial units....

  10. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.

  11. Combinatorial neural codes from a mathematical coding theory perspective.

    Science.gov (United States)

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.

  12. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM) codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
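
    A systematic LDGM code is simple to prototype: the codeword is the message followed by parity bits computed from a sparse generator part, and that sparsity is what keeps encoding cheap. The Python sketch below uses a tiny block length and a fixed column weight purely for illustration; it is not the concatenated construction evaluated in the paper.

        import random

        random.seed(0)
        K, M, COL_WEIGHT = 8, 4, 3   # message bits, parity bits, ones per parity column

        # Sparse generator part P (K x M): each parity bit checks COL_WEIGHT message bits.
        P = [[0] * M for _ in range(K)]
        for j in range(M):
            for i in random.sample(range(K), COL_WEIGHT):
                P[i][j] = 1

        def encode(message):
            """Systematic LDGM encoding: codeword = message || message * P (mod 2)."""
            parity = [sum(message[i] * P[i][j] for i in range(K)) % 2 for j in range(M)]
            return message + parity

        msg = [random.randint(0, 1) for _ in range(K)]
        print("message :", msg)
        print("codeword:", encode(msg))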

  13. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  14. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  15. Bathymetric patterns in standing stock and diversity of deep-sea nematodes at the long-term ecological research observatory HAUSGARTEN (Fram Strait)

    Science.gov (United States)

    Grzelak, Katarzyna; Kotwicki, Lech; Hasemann, Christiane; Soltwedel, Thomas

    2017-08-01

    Bathymetric patterns in standing stocks and diversity are a major topic of investigation in deep-sea biology. From the literature, the responses of metazoan meiofauna and nematodes to bathymetric gradients are well studied, with a general decrease in biomass and abundance with increasing water depth, while bathymetric diversity gradients often, although not as a rule, show a unimodal pattern. Spatial distribution patterns of nematode communities along bathymetric gradients are coupled with surface-water processes and interacting physical and biological factors within the benthic system. We studied the nematode communities at the Long-Term Ecological Research (LTER) observatory HAUSGARTEN, located in the Fram Strait at the Marginal Ice Zone, with respect to their standing stocks as well as structural and functional diversity. We evaluated whether nematode density, biomass and diversity indices, such as H0, Hinf, EG(50) and Θ-1, are linked with environmental conditions along a bathymetric transect spanning from 1200 m to 5500 m water depth. Nematode abundance, biomass and diversity, as well as food availability from phytodetritus sedimentation (indicated by chloroplastic pigments in the sediments), were higher at the stations located at upper bathyal depths (1200-2000 m) and tended to decrease with increasing water depth. A faunal shift was found below 3500 m water depth, where genus composition and trophic structure changed significantly and structural diversity indices markedly decreased. A strong dominance of very few genera and their high turnover, particularly at the abyssal stations (4000-5500 m), suggest that environmental conditions were rather unfavorable for most genera. Despite the high concentrations of sediment-bound chloroplastic pigments and elevated standing stocks found at the deepest station (5500 m), nematode genus diversity remained the lowest compared to all other stations. This study provides a further insight into the knowledge of deep-sea nematodes

  16. Remodeling of the Nuclear Envelope and Lamina during Bovine Preimplantation Development and Its Functional Implications.

    Directory of Open Access Journals (Sweden)

    Jens Popken

    Full Text Available The present study demonstrates a major remodeling of the nuclear envelope and its underlying lamina during bovine preimplantation development. Up to the onset of major embryonic genome activation (MGA) at the 8-cell stage, nuclei showed a non-uniform distribution of nuclear pore complexes (NPCs). NPCs were exclusively present at sites where DNA contacted the nuclear lamina. Extended regions of the lamina, which were not contacted by DNA, lacked NPCs. In post-MGA nuclei the whole lamina was contacted rather uniformly by DNA. Accordingly, NPCs became uniformly distributed throughout the entire nuclear envelope. These findings shed new light on the conditions which control the integration of NPCs into the nuclear envelope. The switch from maternal to embryonic production of mRNAs was accompanied by multiple invaginations covered with NPCs, which may serve the increased demands of mRNA export and protein import. Other invaginations, as well as interior nuclear segments and vesicles without contact to the nuclear envelope, were exclusively positive for lamin B. Since the abundance of these invaginations and vesicles increased in concert with a massive nuclear volume reduction, we suggest that they reflect a mechanism for fitting the nuclear envelope and its lamina to a shrinking nuclear size during bovine preimplantation development. In addition, a deposit of extranuclear clusters of NUP153 (a marker for NPCs) without associated lamin B was frequently observed from the zygote stage up to MGA. Corresponding RNA-Seq data revealed deposits of spliced, maternally provided NUP153 mRNA and little unspliced, newly synthesized RNA prior to MGA, which increased strongly at the initiation of embryonic expression of NUP153 at MGA.

  17. New quantum codes constructed from quaternary BCH codes

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, new families of QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes and their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  18. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism and can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  19. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  20. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ► We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ► We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ► We find and classify all 2D homological stabilizer codes. ► We find optimal codes among the homological stabilizer codes.

  1. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. The EDW code provides much better performance than existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that EDW performs much better than the Hadamard and MFH codes.
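
    The cross-correlation property at the heart of such spectral-amplitude codes is easy to state in code. The Python fragment below computes the in-phase cross-correlation of two 0/1 code sequences; the codewords are arbitrary illustrations, not the actual EDW construction.

        def cross_correlation(x, y):
            """In-phase cross-correlation of two 0/1 spectral-amplitude codewords:
            the number of spectral positions where both codes carry a '1'."""
            return sum(a & b for a, b in zip(x, y))

        c1 = [1, 1, 0, 0, 1, 0, 0]         # illustrative codewords only
        c2 = [0, 1, 1, 0, 0, 1, 0]
        print(cross_correlation(c1, c2))   # -> 1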

  2. A model study of the first ventilated regime of the Arctic Ocean during the early Miocene

    Directory of Open Access Journals (Sweden)

    Bijoy Thompson

    2012-07-01

    Full Text Available The tectonic opening of Fram Strait during the Neogene was a significant geological event that transformed the Arctic Ocean from a poorly ventilated enclosed basin, with weak exchange with the North Atlantic, to a fully ventilated “ocean stage”. Previous tectonic and physical oceanographic analyses suggest that the early Miocene Fram Strait was likely several times narrower and less than half as deep as the present-day 400 km wide and 2550 m deep strait. Here we use an ocean general circulation model with a passive age tracer included to further address the effect of the Fram Strait opening on the early Miocene Arctic Ocean circulation. The model tracer age exhibits strong spatial gradient between the two major Arctic Ocean deep basins: the Eurasian and Amerasian basins. There is a two-layer stratification and the exchange flow through Fram Strait shows a bi-layer structure with a low salinity outflow from the Arctic confined to a relatively thin upper layer and a saline inflow from the North Atlantic below. Our study suggests that although Fram Strait was significantly narrower and shallower during early Miocene, and the ventilation mechanism quite different in our model, the estimated ventilation rates are comparable to the chemical tracer estimates in the present-day Arctic Ocean. Since we achieved ventilation of the Arctic Ocean with a prescribed Fram Strait width of 100 km and sill depth of 1000 m, ventilation may have preceded the timing of a full ocean depth connection between the Arctic Ocean and North Atlantic established through seafloor spreading and the development of the Lena Trough.

  3. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
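
    The workflow the DLL implements — write an input file from a list of values, launch the external application, and read its output files back — can be sketched in a few lines. The following Python fragment is only an illustration of that pattern; the file names, formats and executable are hypothetical and are not GoldSim's or the DLL's actual interface.

        import subprocess
        from pathlib import Path

        def run_external_code(inputs, workdir="run", exe="external_model.exe"):
            """Write inputs, run an external code, and read its outputs back.
            All file names and the executable name are illustrative assumptions."""
            work = Path(workdir)
            work.mkdir(exist_ok=True)
            # 1. Create an input file from the list of input values
            (work / "model.inp").write_text("\n".join(str(v) for v in inputs))
            # 2. Run the external application and wait for it to finish
            subprocess.run([exe, "model.inp"], cwd=work, check=True)
            # 3. Read the outputs produced by the external code and return them
            out_lines = (work / "model.out").read_text().splitlines()
            return [float(line) for line in out_lines if line.strip()]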

  4. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  5. Image quality assessment based on multiscale geometric analysis.

    Science.gov (United States)

    Gao, Xinbo; Lu, Wen; Tao, Dacheng; Li, Xuelong

    2009-07-01

    Reduced-reference (RR) image quality assessment (IQA) has been recognized as an effective and efficient way to predict the visual quality of distorted images. The current standard is the wavelet-domain natural image statistics model (WNISM), which applies the Kullback-Leibler divergence between the marginal distributions of wavelet coefficients of the reference and distorted images to measure the image distortion. However, WNISM fails to consider the statistical correlations of wavelet coefficients in different subbands and the visual response characteristics of the mammalian cortical simple cells. In addition, wavelet transforms are optimal greedy approximations to extract singularity structures, so they fail to explicitly extract the image geometric information, e.g., lines and curves. Finally, wavelet coefficients are dense for smooth image edge contours. In this paper, to target the aforementioned problems in IQA, we develop a novel framework for IQA to mimic the human visual system (HVS) by incorporating the merits from multiscale geometric analysis (MGA), contrast sensitivity function (CSF), and the Weber's law of just noticeable difference (JND). In the proposed framework, MGA is utilized to decompose images and then extract features to mimic the multichannel structure of HVS. Additionally, MGA offers a series of transforms including wavelet, curvelet, bandelet, contourlet, wavelet-based contourlet transform (WBCT), and hybrid wavelets and directional filter banks (HWD), and different transforms capture different types of image geometric information. CSF is applied to weight coefficients obtained by MGA to simulate the appearance of images to observers by taking into account many of the nonlinearities inherent in HVS. JND is finally introduced to produce a noticeable variation in sensory experience. Thorough empirical studies are carried out upon the LIVE database against subjective mean opinion score (MOS) and demonstrate that 1) the proposed framework has

  6. Converter of a continuous code into the Grey code

    International Nuclear Information System (INIS)

    Gonchar, A.I.; TrUbnikov, V.R.

    1979-01-01

    Described is a converter of a continuous code into the Grey code, used in a 12-bit precision amplitude-to-digital converter to decrease the digital component of spectrometer differential nonlinearity to +0.7% in the 98% range of the measured band. The converter exploits the regular alternation of ones and zeroes in each bit of the Grey code as the number of pulses of the continuous code changes continuously. The converter is built from series-155 logic elements; the repetition rate of the continuous-code pulses at the converter input is 25 MHz
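
    The record describes a hardware converter, but the underlying binary-to-Grey mapping itself is compact; a minimal software sketch (in Python, purely as an illustration of the standard reflected-binary rule, not of the series-155 hardware) is:

        def binary_to_gray(n: int) -> int:
            """Standard reflected-binary (Grey) code: adjacent values differ in one bit."""
            return n ^ (n >> 1)

        # Successive counts of a continuous (binary) code map to Grey codewords
        # that change in exactly one bit position at a time.
        print([format(binary_to_gray(i), "04b") for i in range(8)])
        # ['0000', '0001', '0011', '0010', '0110', '0111', '0101', '0100']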

  7. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  8. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, it is not an easy task to determine the number of shared pairs required to construct entanglement-assisted quantum codes. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.
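
    For reference, the bound invoked here is, in its commonly cited form for an [[n, k, d; c]] EAQECC that consumes c maximally entangled pairs (stated as background, not as a result of the paper):

        \[
          n + c - k \;\ge\; 2(d - 1),
        \]

    so codes meeting it with equality, k = n + c - 2(d - 1), are the EAQMDS codes the abstract refers to.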

  9. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  10. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  11. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 code has been developed at KAERI for the initial and reload non-LOCA safety analysis for the operating PWRs as well as the PWRs under construction in Korea. TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea. This can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. TASS code has been programmed using FORTRAN77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). The malfunctions of the control systems, components, and operator actions, and the transients caused by these malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as a part of the TASS code manual, which includes the TASS code user's manual and TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis for the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  12. Benchmarking of gene prediction programs for metagenomic data.

    Science.gov (United States)

    Yok, Non; Rosen, Gail

    2010-01-01

    This manuscript presents the most rigorous benchmarking of gene annotation algorithms for metagenomic datasets to date. We compare three different programs: GeneMark, MetaGeneAnnotator (MGA) and Orphelia. The comparisons are based on their performance over simulated fragments from one hundred species of diverse lineages. We defined four different types of fragments: two types come from the inter- and intra-coding regions and the other two are from the gene edges. Hoff et al. used only 12 species in their comparison; therefore, their sample is too small to represent an environmental sample. Also, no previous study has separately examined fragments that contain gene edges as opposed to intra-coding regions. A general observation from our results is that the performance of all these programs improves as the fragment length increases. On the other hand, the intra-coding fragments of our data show low annotation error in all of the programs compared to the gene-edge fragments. Overall, we found an upper bound on performance by combining all the methods.

  13. Organic additives stabilize RNA aptamer binding of malachite green.

    Science.gov (United States)

    Zhou, Yubin; Chi, Hong; Wu, Yuanyuan; Marks, Robert S; Steele, Terry W J

    2016-11-01

    Aptamer-ligand binding has been utilized for biological applications due to its specific binding and synthetic nature. However, the applications will be limited if the binding or the ligand is unstable. The malachite green aptamer (MGA) and its labile ligand malachite green (MG) were found to have increasing apparent dissociation constants (Kd), as determined through the first-order rate loss of emission intensity of the MGA-MG fluorescent complex. The loss of fluorescence intensity was hypothesized to result from the hydrolysis of MG into malachite green carbinol base (MGOH). A random screen of organic additives identified several that reduced the loss of fluorescence emission and preserved the calculated apparent Kd of MGA-MG binding. The protective effect became more apparent as the percentage of organic additives increased up to 10% v/v. The mechanism behind the protective effect of the organic additives was primarily a ~5X increase in the first-order rate constant of MGOH→MG (kMGOH→MG), which significantly changed the equilibrium constant (Keq), favoring the generation of MG, versus MGOH without organic additives. A simple way has been developed to stabilize the apparent Kd of MGA-MG binding over 24 h, which may be beneficial in stabilizing other triphenylmethane or carbocation ligand-aptamer interactions that are susceptible to SN1 hydrolysis. Copyright © 2016 Elsevier B.V. All rights reserved.
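
    The kinetic argument can be made explicit with a short worked relation, assuming the simple reversible first-order scheme between MGOH and MG that the abstract describes:

        \[
          \mathrm{MGOH} \;\rightleftharpoons\; \mathrm{MG},
          \qquad
          K_{\mathrm{eq}}
            \;=\; \frac{[\mathrm{MG}]_{\mathrm{eq}}}{[\mathrm{MGOH}]_{\mathrm{eq}}}
            \;=\; \frac{k_{\mathrm{MGOH\to MG}}}{k_{\mathrm{MG\to MGOH}}},
        \]

    so a roughly five-fold increase in kMGOH→MG at an unchanged reverse rate shifts Keq about five-fold toward MG, consistent with the reported stabilization.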

  14. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  15. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  16. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  17. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.

  18. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  19. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  20. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they generally escape detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements.

  1. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
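
    As a reference point for the dynamic variant, the static Shannon code it builds on can be written in a few lines. The Python sketch below assigns each symbol a codeword of length ceil(log2(1/p)) taken from the binary expansion of the cumulative probability; the probabilities used are illustrative.

        import math

        def shannon_code(probs):
            """Static Shannon code: sort probabilities in decreasing order and take the
            first ceil(log2(1/p)) bits of the cumulative probability as the codeword."""
            probs = sorted(probs, reverse=True)
            codewords, cumulative = [], 0.0
            for p in probs:
                length = math.ceil(-math.log2(p))
                bits, frac = "", cumulative
                for _ in range(length):        # binary expansion of the cumulative prob.
                    frac *= 2
                    bit = int(frac)
                    bits += str(bit)
                    frac -= bit
                codewords.append(bits)
                cumulative += p
            return codewords

        print(shannon_code([0.5, 0.25, 0.125, 0.125]))   # ['0', '10', '110', '111']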

  2. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points in PG with multiplicity γ(w), where w is the weight of

  3. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with a regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant applications. An education-training seminar and technology transfer were carried out for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  4. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.

  5. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
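
    The node operation described here can be illustrated with a small sketch. The Python/numpy fragment below shows an intermediate node combining two incoming length-L packets with L x L coding matrices over GF(2); the matrices are arbitrary illustrative choices, not the optimized ones produced by the paper's algorithms.

        import numpy as np

        def combine_packets(packets, coding_matrices):
            """Vector-network-coding combine step over GF(2): multiply each incoming
            length-L packet by its L x L binary coding matrix and sum modulo 2."""
            out = np.zeros_like(packets[0])
            for pkt, M in zip(packets, coding_matrices):
                out = (out + M @ pkt) % 2
            return out

        L = 4
        p1, p2 = np.array([1, 0, 1, 1]), np.array([0, 1, 1, 0])
        M1 = np.eye(L, dtype=int)                      # illustrative coding matrices
        M2 = np.random.randint(0, 2, (L, L))
        coded = combine_packets([p1, p2], [M1, M2])    # outgoing length-L packet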

  6. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.

  7. New quantum codes derived from a family of antiprimitive BCH codes

    Science.gov (United States)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.

  8. Surface acoustic wave coding for orthogonal frequency coded devices

    Science.gov (United States)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  9. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available Abstract This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  10. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  11. Monte Carlo simulations of plutonium gamma-ray spectra

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Carlson, J.B.; Wang, Tzu-Fang; Ruhter, W.D.

    1993-01-01

    Monte Carlo calculations were investigated as a means of simulating the gamma-ray spectra of Pu. These simulated spectra will be used to develop and evaluate gamma-ray analysis techniques for various nondestructive measurements. Simulated spectra of calculational standards can be used for code intercomparisons, to understand systematic biases, and to estimate minimum detection levels of existing and proposed nondestructive analysis instruments. The capability to simulate gamma-ray spectra from HPGe detectors could significantly reduce the costs of preparing large numbers of real reference materials. MCNP was used for the Monte Carlo transport of the photons. Results from the MCNP calculations were folded in with a detector response function to obtain a realistic spectrum. Plutonium spectrum peaks were produced with Lorentzian shapes for the x-rays and Gaussian distributions for the gamma-ray peaks. The MGA code determined the Pu isotopes and specific power from this calculated spectrum, and the results were compared to a similar analysis on a measured spectrum
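
    The folding step — turning discrete gamma-ray lines into a spectrum with finite-resolution peak shapes — can be sketched briefly. The numpy fragment below folds a few lines into a pulse-height spectrum using Gaussian peaks only (the Lorentzian x-ray component and the MCNP transport itself are omitted); the energies, calibration and resolution are illustrative values, not those of any particular detector.

        import numpy as np

        def fold_gaussian(lines, channels=4096, kev_per_ch=0.075, fwhm_kev=0.6):
            """Fold discrete lines (energy_keV, intensity) into a spectrum using
            Gaussian peak shapes. All numerical values are illustrative only."""
            spectrum = np.zeros(channels)
            energy_axis = np.arange(channels) * kev_per_ch   # channel -> energy (keV)
            sigma = fwhm_kev / 2.355                         # FWHM to standard deviation
            for energy, intensity in lines:
                spectrum += intensity * np.exp(-0.5 * ((energy_axis - energy) / sigma) ** 2)
            return spectrum

        # Two lines in the 100-keV region of a plutonium spectrum (arbitrary intensities)
        spec = fold_gaussian([(129.3, 1.0), (148.6, 0.4)])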

  12. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and the turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  13. Quantum Codes From Cyclic Codes Over The Ring R 2

    International Nuclear Information System (INIS)

    Altinel, Alev; Güzeltepe, Murat

    2016-01-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ² = 0, υ² = 0, w² = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 to contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2. (paper)

  14. A New Prime Code for Synchronous Optical Code Division Multiple-Access Networks

    Directory of Open Access Journals (Sweden)

    Huda Saleh Abbas

    2018-01-01

    Full Text Available A new spreading code based on a prime code for synchronous optical code-division multiple-access networks that can be used in monitoring applications has been proposed. The new code is referred to as “extended grouped new modified prime code.” This new code has the ability to support more terminal devices than other prime codes. In addition, it patches subsequences with “0s” leading to lower power consumption. The proposed code has an improved cross-correlation resulting in enhanced BER performance. The code construction and parameters are provided. The operating performance, using incoherent on-off keying modulation and incoherent pulse position modulation systems, has been analyzed. The performance of the code was compared with other prime codes. The results demonstrate an improved performance, and a BER floor of 10−9 was achieved.
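
    As background to the construction, the classical prime code on which such schemes build can be generated in a few lines. The Python sketch below produces the basic prime code over GF(p) (p prime): p codewords of length p², each of weight p. It is only the standard construction, not the paper's extended grouped new modified prime code.

        def prime_code(p: int):
            """Basic prime code over GF(p): for each i in GF(p), place one pulse per
            group of p chips, at position (i*j mod p) within group j."""
            codewords = []
            for i in range(p):
                chips = [0] * (p * p)
                for j in range(p):
                    chips[j * p + (i * j) % p] = 1
                codewords.append(chips)
            return codewords

        codes = prime_code(5)    # 5 codewords, each 25 chips long with 5 ones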

  15. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    Science.gov (United States)

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  16. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem, developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  17. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  18. Some Families of Asymmetric Quantum MDS Codes Constructed from Constacyclic Codes

    Science.gov (United States)

    Huang, Yuanyuan; Chen, Jianzhang; Feng, Chunhui; Chen, Riqing

    2018-02-01

    Quantum maximal-distance-separable (MDS) codes that satisfy the quantum Singleton bound with different lengths have been constructed by some researchers. In this paper, seven families of asymmetric quantum MDS codes are constructed by using constacyclic codes. We weaken the case of Hermitian dual-containing codes that can be applied to construct asymmetric quantum MDS codes with parameters [[n,k,dz/dx

  19. Theoretical Atomic Physics code development II: ACE: Another collisional excitation code

    International Nuclear Information System (INIS)

    Clark, R.E.H.; Abdallah, J. Jr.; Csanak, G.; Mann, J.B.; Cowan, R.D.

    1988-12-01

    A new computer code for calculating collisional excitation data (collision strengths or cross sections) using a variety of models is described. The code uses data generated by the Cowan Atomic Structure code or CATS for the atomic structure. Collisional data are placed on a random access file and can be displayed in a variety of formats using the Theoretical Atomic Physics Code or TAPS. All of these codes are part of the Theoretical Atomic Physics code development effort at Los Alamos. 15 refs., 10 figs., 1 tab

  20. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  1. Proceedings of the Ground-Water Detection Workshop Held at Vicksburg, Mississippi on 12-14 January 1982.

    Science.gov (United States)

    1984-12-01

    [Figure residue: gravity station symbol and contour lines of relative Bouguer anomaly (interval 10 mgal).] ... for utilizing the geologic and hydrologic data in evaluating ground-water supplies. Examples from ground-water reports on the Arabian Peninsula area, as well as a few examples of ground-water availability maps for the Arabian Peninsula and Libya, will be presented in this paper. US Government agencies

  2. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  3. Error floor behavior study of LDPC codes for concatenated codes design

    Science.gov (United States)

    Chen, Weigang; Yin, Liuguo; Lu, Jianhua

    2007-11-01

    Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.

  4. Explorations of new selenites of the group IIIA and IVA metals

    International Nuclear Information System (INIS)

    Kong Fang; Li Peicin; Zhang Suyun; Mao Jianggao

    2012-01-01

    Systematic explorations of new phases in the GaIII/InIII/GeIV–SeIV–O systems by hydrothermal syntheses or solid-state reactions at high temperature led to six new ternary compounds, namely M2Se2O7 (M = Ga 1, In 2), M(OH)(SeO3) (M = Ga 3, In 4), α-Ge(SeO3)2 5 and β-Ge(SeO3)2 6. Ga2Se2O7 1 displays a 3D open framework composed of 2D gallium oxide layers that are further bridged and capped by SeO3 groups. In2Se2O7 2 features a 3D indium oxide framework formed by corner- and edge-sharing InO6 octahedra with SeO3 groups attached to the cavities and the 8-membered-ring tunnels of the structure. The isostructural M(OH)(SeO3) (M = Ga 3, In 4) exhibit a 2D metal selenite layer composed of 1D edge-sharing MO6 octahedral chains that are interconnected by SeO3 groups. α-Ge(SeO3)2 (P21/n) 5 displays a 3D open framework with 1D 8-membered-ring tunnels along the a-axis, while β-Ge(SeO3)2 (Pa-3) 6 exhibits a condensed 3D network. Highlights: ► Up to now, selenites of the group IIIA and IVA metals are still rare. ► Hydrothermal or solid-state reactions yielded six new compounds in this system. ► They are M2Se2O7 (M = Ga, In), M(OH)(SeO3) (M = Ga, In), α-Ge(SeO3)2 and β-Ge(SeO3)2. ► They exhibit four different 3D and one 2D structural types. ► α-Ge(SeO3)2 and β-Ge(SeO3)2 represent the first examples of germanium selenites.

  5. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated on the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily modified for any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, on the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  6. CodeArmor : Virtualizing the Code Space to Counter Disclosure Attacks

    NARCIS (Netherlands)

    Chen, Xi; Bos, Herbert; Giuffrida, Cristiano

    2017-01-01

    Code diversification is an effective strategy to prevent modern code-reuse exploits. Unfortunately, diversification techniques are inherently vulnerable to information disclosure. Recent diversification-aware ROP exploits have demonstrated that code disclosure attacks are a realistic threat, with an

  7. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  8. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  9. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    Science.gov (United States)

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that 54 patients (54%) had an entirely correct primary diagnosis assigned, and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2) and the correct procedure code (odds ratio 310.0), with the completed proforma passed directly to the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.

  10. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual containing classical constacyclic codes using Hermitian construction have paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper has achieved much larger distance than work done earlier. The remaining constructed parameters of quantum MDS codes have large minimum distance and were not explored yet.

  11. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  12. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the specific forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors that most strongly influence those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapter, the forms of code mixing and code switching in learning activities at Al Mawaddah Boarding School involve the use of Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include: identification of the role, the desire to explain and interpret, sourcing from the original language and its variations, and sourcing from a foreign language. The factors determining code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah boarding school, regarding the rules and characteristic variation in the language of teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and effective teaching and learning strategies in boarding schools.

  13. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  14. Low Complexity List Decoding for Polar Codes with Multiple CRC Codes

    Directory of Open Access Journals (Sweden)

    Jong-Hwan Kim

    2017-04-01

    Full Text Available Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, the decoder implementation is one of the big practical problems, and low complexity decoding has been studied. This paper addresses a low complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths.
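
    The CRC-based pruning that the scheme relies on can be sketched compactly. The Python fragment below shows only the final check-and-select step over a set of candidate paths, with a toy 4-bit CRC; a real CRC-aided successive cancellation list decoder interleaves several CRCs with the polar decoding stages, as the paper describes.

        def crc_bits(bits, poly=(1, 0, 0, 1, 1)):
            """Remainder of polynomial division over GF(2) for an illustrative CRC-4
            (generator x^4 + x + 1); returns len(poly)-1 check bits."""
            data = list(bits) + [0] * (len(poly) - 1)
            for i in range(len(bits)):
                if data[i]:
                    for j, p in enumerate(poly):
                        data[i + j] ^= p
            return data[-(len(poly) - 1):]

        def prune_paths(candidate_paths):
            """Keep only list-decoder paths whose appended CRC checks out."""
            return [path for path in candidate_paths
                    if crc_bits(path[:-4]) == list(path[-4:])]

        msg = [1, 0, 1, 1, 0]
        good = msg + crc_bits(msg)                   # a path carrying a valid CRC
        bad = msg + [0, 0, 0, 0]                     # a path whose CRC fails
        print(prune_paths([good, bad]))              # only the first path survives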

  15. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  16. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)
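
    DISP1's actual dispersion models are not reproduced here; as a rough illustration of the kind of simple straight-line Gaussian plume estimate that screening tools of this type are often built around (all parameter values below are made-up assumptions, not DISP1 input):

```python
import math

# Sketch: a generic straight-line Gaussian plume estimate of air concentration.
# This is NOT the DISP1 model; all values below are made up for illustration.

def gaussian_plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Q: release rate (Bq/s), u: wind speed (m/s), y: cross-wind offset (m),
    z: receptor height (m), H: effective release height (m),
    sigma_y, sigma_z: dispersion coefficients (m) at the downwind distance of interest."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2)) +
                math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))  # ground reflection term
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Receptor on the plume axis at ground level, 30 m effective release height,
# sigma values roughly typical of ~500 m downwind in neutral conditions.
print(gaussian_plume_concentration(Q=1e9, u=2.0, y=0.0, z=0.0,
                                   H=30.0, sigma_y=36.0, sigma_z=18.0))
```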

  17. Kakaibang Halimuyak: Iba't Ibang Imahe ng Pangangapital sa Balintawak

    Directory of Open Access Journals (Sweden)

    Mariah Amour C. Miranda

    2015-06-01

    Full Text Available Ginagalugad ng sanaysay na ito ang isang pangunahing lugar na nadaraanan mula Norte patungong Maynila—ang Balintawak. Nagsisilbing manibela ng pagsusuri ang kontemporaneong pananaw sa lunan bilang pugad ng mga anyong may magkakasalungat na kahulugan. Binibigyang-pansin ng sanaysay ang iba't ibang imahe ng pagbebenta na matatagpuan sa Balintawak: ang billboard ng kumpanyang Bench na siyang pinakamalaki, ang mismong palengke, at ang mga lakong paninda sa tabing-kalsada. Sa pamamagitan nito, maipapakita ng sanaysay kung paano madadalumat ang Balintawak bilang isang panlipunang texto na sumasalamin sa mga magkakaibang pangangapital sa naturang lunan. This essay explores the space of Balintawak—a focal point of urban travel from northern Luzon to Manila. Using the contemporary view on space as a site of structures with contradictory meanings, the study seeks to understand how various images of market activities in Balintawak create meaningful representations. These images include the gigantic billboard of Bench, the central market itself, and flea markets in the area. The essay shows how Balintawak as a social text serves as a trading center of different forms of capital.

  18. Effects of growth-promoting agents and season on yearling feedlot heifer performance.

    Science.gov (United States)

    Kreikemeier, W M; Mader, T L

    2004-08-01

    Angus x crossbred heifers (270 per trial) were used in an experiment conducted over one 105-d summer and one 104-d winter feeding period. Treatments were identical for each trial and included: 1) control, 2) estrogenic implant (E), 3) trenbolone acetate implant (TBA), 4) E + TBA (ET), 5) melengestrol acetate (MGA) in the feed, and 6) ET + MGA (ETM). Each treatment was replicated in five pens, with nine heifers per pen in each season. Initial weights (mean = 384 kg, SE = 57) were the same for each season. There were no treatment x season interactions for final BW, ADG, G:F, water intake, or carcass characteristics. Heifers receiving a growth-promoting agent were 11.6 kg (SE = 4.08) heavier and gained 0.108 kg/d (SE = 0.04) more (P coldest and hottest portions of the year. Heifers fed MGA and implanted with ET tended (P = 0.07) to have greater DMI in the summer but lesser DMI in the winter. In general, differences among growth-promotant programs were relatively similar over the entire summer and in winter.

  19. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field. * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
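
    As a hedged illustration of one of the decoding algorithms listed above, the following sketch implements textbook hard-decision Viterbi decoding of the classic rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal; it is not code from the book.

```python
# Sketch: hard-decision Viterbi decoding of the rate-1/2, constraint-length-3
# convolutional code with generators (7, 5) octal. Textbook illustration only.

G = [0b111, 0b101]          # generator polynomials (7, 5) in octal
K = 3                        # constraint length -> 4 trellis states

def conv_encode(bits):
    state = 0
    out = []
    for b in bits + [0] * (K - 1):               # append tail bits to flush the register
        reg = (b << (K - 1)) | state
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi_decode(received, n_info):
    n_states = 1 << (K - 1)
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)      # start in the all-zero state
    paths = [[] for _ in range(n_states)]
    for t in range(0, len(received), 2):
        r = received[t:t + 2]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for state in range(n_states):
            if metric[state] == INF:
                continue
            for b in (0, 1):
                reg = (b << (K - 1)) | state
                expected = [bin(reg & g).count("1") & 1 for g in G]
                branch = sum(x != y for x, y in zip(r, expected))
                nxt = reg >> 1
                cand = metric[state] + branch
                if cand < new_metric[nxt]:
                    new_metric[nxt] = cand
                    new_paths[nxt] = paths[state] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best][:n_info]

if __name__ == "__main__":
    info = [1, 0, 1, 1, 0, 0, 1]
    coded = conv_encode(info)
    coded[3] ^= 1                                 # inject one channel error
    print(viterbi_decode(coded, len(info)) == info)   # True: the error is corrected
```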

  20. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  1. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  2. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  3. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  4. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.

  5. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  6. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  7. Ang Social Network sa Facebook ng mga Taga-Batangas at ng mga Taga-Laguna: Isang Paghahambing

    Directory of Open Access Journals (Sweden)

    Jaderick P. Pabico

    2013-12-01

    Full Text Available Online social networking (OSN) has become of great influence to Filipinos, where Facebook, Twitter, LinkedIn, Google+, and Instagram are among the popular ones. Their popularity, coupled with their intuitive and interactive use, allows one's personal information such as gender, age, address, relationship status, and list of friends to become publicly available. The accessibility of information from these sites allows, with the aid of computers, the study of a wide population's characteristics even on a provincial scale. Aside from being neighbouring locales, the respective residents of Laguna and Batangas both derive their livelihoods from two lakes, Laguna de Bay and Taal Lake. Residents of both provinces experience similar problems, among many others fish kills. The goal of this research is to find out similarities in their respective online populations, particularly that of Facebook's. With the use of computational dynamic social network analysis (CDSNA), we found that the two communities are similar in the following respects, among others: both populations are dominated by single young females; homophily was observed when choosing a friend in terms of age (i.e., friendships were created more often between people whose ages differ by at most five years); and heterophily was observed when choosing friends in terms of gender (i.e., more friendships were created between a male and a female than between two people of the same gender). This paper also presents the differences in the structure of the two social networks, such as degrees of separation and preferential attachment.

  8. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to the code is time-consuming and error-prone.

  9. FLUTAN 2.0. Input specifications

    International Nuclear Information System (INIS)

    Willerding, G.; Baumann, W.

    1996-05-01

    FLUTAN is a highly vectorized computer code for 3D fluid-dynamic and thermal-hydraulic analyses in Cartesian or cylindrical coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA, and particularly to COMMIX-1A and COMMIX-1B, which were made available to FZK in the frame of cooperation contracts within the fast reactor safety field. FLUTAN 2.0 is an improved version of the FLUTAN code released in 1992. It offers some additional innovations, e.g. the QUICK-LECUSSO-FRAM techniques for reducing numerical diffusion in the k-ε turbulence model equations; a more sophisticated wall model for specifying a mass flow outside the surface walls together with its flow path and its associated inlet and outlet flow temperatures; and a revised and upgraded pressure boundary condition to fully include the outlet cells in the solution process of the conservation equations. Last but not least, a so-called visualization option based on VISART standards has been provided. This report contains detailed input instructions, presents formulations of the various model options, and explains how to use the code by means of comprehensive sample input. (orig.)

  10. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  11. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Austregesilo, H.; Velkov, K. [GRS, Garching (Germany)] [and others

    1997-07-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes.

  12. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    International Nuclear Information System (INIS)

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-01-01

    The present situation of thermalhydraulics codes and 3D neutronics codes is briefly described and general considerations for coupling of these codes are discussed. Two different basic approaches of coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes

  13. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  14. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  15. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  16. Performance Analysis of CRC Codes for Systematic and Nonsystematic Polar Codes with List Decoding

    Directory of Open Access Journals (Sweden)

    Takumi Murata

    2018-01-01

    Full Text Available Successive cancellation list (SCL decoding of polar codes is an effective approach that can significantly outperform the original successive cancellation (SC decoding, provided that proper cyclic redundancy-check (CRC codes are employed at the stage of candidate selection. Previous studies on CRC-assisted polar codes mostly focus on improvement of the decoding algorithms as well as their implementation, and little attention has been paid to the CRC code structure itself. For the CRC-concatenated polar codes with CRC code as their outer code, the use of longer CRC code leads to reduction of information rate, whereas the use of shorter CRC code may reduce the error detection probability, thus degrading the frame error rate (FER performance. Therefore, CRC codes of proper length should be employed in order to optimize the FER performance for a given signal-to-noise ratio (SNR per information bit. In this paper, we investigate the effect of CRC codes on the FER performance of polar codes with list decoding in terms of the CRC code length as well as its generator polynomials. Both the original nonsystematic and systematic polar codes are considered, and we also demonstrate that different behaviors of CRC codes should be observed depending on whether the inner polar code is systematic or not.

  17. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any cyclic shift of the coordinates of both subsets leaves invariant the code. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and its duals, and the relation...

  18. Strict optical orthogonal codes for purely asynchronous code-division multiple-access applications

    Science.gov (United States)

    Zhang, Jian-Guo

    1996-12-01

    Strict optical orthogonal codes are presented for purely asynchronous optical code-division multiple-access (CDMA) applications. The proposed code can strictly guarantee the peaks of its cross-correlation functions and the sidelobes of any of its autocorrelation functions to have a value of 1 in purely asynchronous data communications. The basic theory of the proposed codes is given. An experiment on optical CDMA systems is also demonstrated to verify the characteristics of the proposed code.
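
    A small sketch of how the stated correlation constraints can be checked numerically; the two length-13, weight-3 sequences below (pulses at {0, 1, 4} and {0, 2, 7}) form a standard (13, 3, 1) optical orthogonal code pair used purely for illustration, not a construction from the paper:

```python
import numpy as np

# Sketch: verifying the auto/cross-correlation constraints that strict optical
# orthogonal codes are defined by, for a small illustrative (13, 3, 1) OOC pair.

def periodic_correlation(a, b):
    """Periodic correlation between 0/1 sequences a and b for every cyclic shift."""
    a, b = np.asarray(a), np.asarray(b)
    return np.array([int(np.sum(a * np.roll(b, s))) for s in range(len(a))])

def is_strict_ooc_pair(x, y, lam=1):
    """Autocorrelation sidelobes (non-zero shifts) and all cross-correlation
    values must not exceed lam (here lam = 1, as in the proposed codes)."""
    auto_x = periodic_correlation(x, x)[1:]      # exclude the zero-shift peak
    auto_y = periodic_correlation(y, y)[1:]
    cross = periodic_correlation(x, y)
    return max(auto_x.max(), auto_y.max(), cross.max()) <= lam

x = [0] * 13; y = [0] * 13
for p in (0, 1, 4):
    x[p] = 1                 # pulses at positions 0, 1, 4
for p in (0, 2, 7):
    y[p] = 1                 # pulses at positions 0, 2, 7
print(is_strict_ooc_pair(x, y))   # True for this pair
```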

  19. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  20. Quantum Codes From Negacyclic Codes over Group Ring (Fq + υFq)G

    International Nuclear Information System (INIS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-01-01

    In this paper, we determine self-dual and self-orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq)G. By taking a suitable Gray image of these codes we obtain many quantum error-correcting codes over Fq with good parameters. (paper)

  1. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. Convolutional, turbo, and low density parity-check (LDPC) coding and polar codes in a unified framework. Advanced research-related developments such as spatial coupling. A focus on algorithmic and implementation aspects of error control coding.

  2. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
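
    A minimal sketch of the basic idea using a Hamming (7,4) code (the code choice and the biased i.i.d. source are assumptions for the example, not the paper's): each 7-bit source block is treated as an error pattern, its 3-bit syndrome is the compressed word, and the decompressor returns the minimum-weight (coset leader) pattern with that syndrome, which is exact whenever the block has weight at most one and otherwise introduces a small distortion:

```python
import numpy as np

# Sketch: syndrome-source-coding with a Hamming (7,4) code. A 7-bit source block
# is treated as an error pattern; its 3-bit syndrome is the compressed word
# (rate 3/7). Decompression maps the syndrome to the coset-leader pattern.

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)   # Hamming parity-check matrix

def syndrome(block):
    return tuple(H.dot(block) % 2)

# Coset-leader table: the all-zero word plus the seven weight-1 patterns.
coset_leader = {syndrome(np.zeros(7, dtype=int)): np.zeros(7, dtype=int)}
for i in range(7):
    e = np.zeros(7, dtype=int); e[i] = 1
    coset_leader[syndrome(e)] = e

def compress(block):                 # 7 source bits -> 3 compressed bits
    return syndrome(np.asarray(block))

def decompress(syn):                 # 3 bits -> reconstructed 7-bit block
    return coset_leader[tuple(syn)]

rng = np.random.default_rng(0)
source = (rng.random((10000, 7)) < 0.02).astype(int)   # heavily biased binary source
recon = np.array([decompress(compress(b)) for b in source])
print("bit error rate:", np.mean(recon != source))     # small but non-zero distortion
```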

  3. ComboCoding: Combined intra-/inter-flow network coding for TCP over disruptive MANETs

    Directory of Open Access Journals (Sweden)

    Chien-Chia Chen

    2011-07-01

    Full Text Available TCP over wireless networks is challenging due to random losses and ACK interference. Although network coding schemes have been proposed to improve TCP robustness against extreme random losses, a critical problem of DATA–ACK interference still remains. To address this issue, we use inter-flow coding between DATA and ACK to reduce the number of transmissions among nodes. In addition, we also utilize a “pipeline” random linear coding scheme with adaptive redundancy to overcome high packet loss over unreliable links. The resulting coding scheme, ComboCoding, combines intra-flow and inter-flow coding to provide robust TCP transmission in disruptive wireless networks. The main contributions of our scheme are twofold: the efficient combination of random linear coding and XOR coding on bi-directional streams (DATA and ACK), and the novel redundancy control scheme that adapts to time-varying and space-varying link loss. The adaptive ComboCoding was tested on a variable hop string topology with unstable links and on a multipath MANET with dynamic topology. Simulation results show that TCP with ComboCoding delivers higher throughput than with other coding options in high loss and mobile scenarios, while introducing minimal overhead in normal operation.
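
    A toy sketch of the inter-flow (DATA-ACK XOR) part of such a scheme; the packet contents and the zero-padding convention are made up for the example, and the intra-flow random linear coding and redundancy adaptation are not shown:

```python
# Sketch: the inter-flow XOR coding step at a relay that has overheard a DATA
# packet from one direction and an ACK from the other. Packet formats here are
# invented for illustration; ComboCoding's actual headers are not reproduced.

def xor_packets(a: bytes, b: bytes) -> bytes:
    """XOR two packets after zero-padding them to the same length."""
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

# Relay broadcasts a single coded packet instead of forwarding DATA and ACK separately.
data_pkt = b"DATA: segment 42 payload"
ack_pkt = b"ACK: up to segment 41"
coded = xor_packets(data_pkt, ack_pkt)

# Each endpoint already holds its own packet, so one XOR recovers the other's.
recovered_ack = xor_packets(coded, data_pkt)[:len(ack_pkt)]
recovered_data = xor_packets(coded, ack_pkt)[:len(data_pkt)]
print(recovered_ack == ack_pkt and recovered_data == data_pkt)   # True
```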

  4. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  5. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
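
    The paper's joint semi-supervised objective (codebook, sparse codes, labels, and classifier optimised together) is not reproduced here; the sketch below only illustrates the underlying sparse coding subproblem min_a 0.5*||x - Da||^2 + lambda*||a||_1, solved with plain ISTA on a made-up random dictionary:

```python
import numpy as np

# Sketch: sparse coding of one sample against a fixed dictionary via ISTA
# (iterative soft thresholding). The dictionary and signal are random examples.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(x, D, lam=0.1, n_iter=200):
    """Return sparse coefficients a minimising 0.5*||x - D a||^2 + lam*||a||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        a = soft_threshold(a - grad / L, lam / L)
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)             # unit-norm codewords
a_true = np.zeros(64); a_true[[3, 17, 40]] = [1.0, -0.7, 0.5]
x = D @ a_true + 0.01 * rng.standard_normal(32)
a = sparse_code(x, D, lam=0.05)
print("non-zeros:", np.flatnonzero(np.round(a, 2)))   # mostly the true support
```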

  6. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  7. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
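
    As a hedged illustration of the QC-LDPC structure referred to above, the sketch below expands a small, made-up exponent matrix into a parity-check matrix by replacing each entry with a cyclically shifted identity block (entry -1 meaning an all-zero block); the girth-4 cycle cancellation performed by the paper's joint design is not attempted here:

```python
import numpy as np

# Sketch: expanding an exponent matrix into a QC-LDPC parity-check matrix by
# replacing each entry with a cyclically shifted identity (or all-zero) block.
# The exponent matrix and lifting size are illustrative, not the paper's design.

def circulant(shift, z):
    """z x z identity cyclically shifted by 'shift' columns; -1 means all-zero."""
    if shift < 0:
        return np.zeros((z, z), dtype=int)
    return np.roll(np.eye(z, dtype=int), shift, axis=1)

def expand(exponent_matrix, z):
    return np.block([[circulant(s, z) for s in row] for row in exponent_matrix])

E = [[0, 1, -1, 2],
     [2, -1, 0, 1]]          # illustrative exponent matrix
H = expand(E, z=4)           # lifting size z = 4 -> an 8 x 16 parity-check matrix
print(H.shape, "column weights:", H.sum(axis=0))
```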

  8. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jin; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2004-07-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures.

  9. Coupling the severe accident code SCDAP with the system thermal hydraulic code MARS

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2004-01-01

    MARS is a best-estimate system thermal hydraulics code with multi-dimensional modeling capability. One of the aims in MARS code development is to make it a multi-functional code system with the analysis capability to cover the entire accident spectrum. For this purpose, MARS code has been coupled with a number of other specialized codes such as CONTEMPT for containment analysis, and MASTER for 3-dimensional kinetics. And in this study, the SCDAP code has been coupled with MARS to endow the MARS code system with severe accident analysis capability. With the SCDAP, MARS code system now has acquired the capability to simulate such severe accident related phenomena as cladding oxidation, melting and slumping of fuel and reactor structures

  10. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application ... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  11. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  12. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  14. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  15. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  16. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009), to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  17. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions which use repeaters to compensate for the loss in signal strength on transmission links also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and is often interchangeably used with speech coding is the term voice coding. This term is more generic in the sense that the

  18. Tri-code inductance control rod position indicator with several multi-coding-bars

    International Nuclear Information System (INIS)

    Shi Jibin; Jiang Yueyuan; Wang Wenran

    2004-01-01

    A control rod position indicator named the tri-code inductance control rod position indicator with multi-coding-bars, which has a simple structure, reliable operation and high precision, has been developed. The detector of the indicator is composed of K coils, a compensatory coil and K coding bars. Each coding bar consists of several sections of strong magnetic cores, several sections of weak magnetic cores and several sections of non-magnetic portions. As the control rod is withdrawn, the coding bars move in the centers of the coils respectively, while a constant alternating current passes through the coils and causes them to generate inductive alternating voltage signals. The outputs of the coils are picked up and processed, and the tri-codes indicating the rod position are obtained. Moreover, the coding principle of the detector and its related structure are introduced. The analysis shows that the indicator offers advantages over the coil-coding rod position indicator, so it can meet the demands of rod position indication in the nuclear heating reactor (NHR). (authors)

  19. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully certified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  20. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. by code, listed alphabetically, and 3. by definition, listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage memory requirements; and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring-binder system which can be updated by an organised (updating) service. (author)

  1. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  2. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  3. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  4. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  5. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function.
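
    The action-dependent extension treated in the paper is not implemented here; as a reference point, the sketch below runs the classical Blahut-Arimoto iteration for a single point on the ordinary rate-distortion curve of a biased binary source under Hamming distortion (the source distribution, distortion measure, and slope parameter beta are assumptions for the example):

```python
import numpy as np

# Sketch: classical Blahut-Arimoto iteration for one point on the ordinary
# rate-distortion curve R(D). Not the action-dependent algorithm of the paper.

def blahut_arimoto_rd(p_x, dist, beta, n_iter=500):
    """Return (D, R) for the slope parameter beta (larger beta -> smaller D)."""
    q = np.full(dist.shape[1], 1.0 / dist.shape[1])      # output marginal q(xhat)
    for _ in range(n_iter):
        w = q * np.exp(-beta * dist)                      # unnormalised p(xhat|x)
        cond = w / w.sum(axis=1, keepdims=True)           # conditional p(xhat|x)
        q = p_x @ cond                                    # update q(xhat)
    joint = p_x[:, None] * cond
    D = np.sum(joint * dist)                              # expected distortion
    R = np.sum(joint * np.log2(cond / q))                 # mutual information in bits
    return D, R

p_x = np.array([0.2, 0.8])                 # biased binary source
dist = np.array([[0.0, 1.0],
                 [1.0, 0.0]])              # Hamming distortion
D, R = blahut_arimoto_rd(p_x, dist, beta=3.0)
print(D, R)    # should lie close to the curve R(D) = h(0.2) - h(D) for this source
```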

  6. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding while enabling robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  7. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL), and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS) at the Institut for Skole og Læring, Professionshøjskolen Metropol, and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy at Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg Universitet in Copenhagen. We followed, evaluated and documented the Coding Class project from November 2016 to May 2017...

  8. Electromechanically cooled germanium radiation detector system

    International Nuclear Information System (INIS)

    Lavietes, Anthony D.; Joseph Mauger, G.; Anderson, Eric H.

    1999-01-01

    We have successfully developed and fielded an electromechanically cooled germanium radiation detector (EMC-HPGe) at Lawrence Livermore National Laboratory (LLNL). This detector system was designed to provide optimum energy resolution, long lifetime, and extremely reliable operation for unattended and portable applications. For most analytical applications, high purity germanium (HPGe) detectors are the standard detectors of choice, providing an unsurpassed combination of high energy resolution performance and exceptional detection efficiency. Logistical difficulties associated with providing the required liquid nitrogen (LN) for cooling is the primary reason that these systems are found mainly in laboratories. The EMC-HPGe detector system described in this paper successfully provides HPGe detector performance in a portable instrument that allows for isotopic analysis in the field. It incorporates a unique active vibration control system that allows the use of a Sunpower Stirling cycle cryocooler unit without significant spectral degradation from microphonics. All standard isotopic analysis codes, including MGA and MGA++, GAMANL, GRPANL and MGAU, typically used with HPGe detectors can be used with this system with excellent results. Several national and international Safeguards organisations including the International Atomic Energy Agency (IAEA) and U.S. Department of Energy (DOE) have expressed interest in this system. The detector was combined with custom software and demonstrated as a rapid Field Radiometric Identification System (FRIS) for the U.S. Customs Service. The European Communities' Safeguards Directorate (EURATOM) is field-testing the first Safeguards prototype in their applications. The EMC-HPGe detector system design, recent applications, and results will be highlighted

  9. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P 1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  10. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  11. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  12. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  13. Codes maintained by the LAACG [Los Alamos Accelerator Code Group] at the NMFECC

    International Nuclear Information System (INIS)

    Wallace, R.; Barts, T.

    1990-01-01

    The Los Alamos Accelerator Code Group (LAACG) maintains two groups of design codes at the National Magnetic Fusion Energy Computing Center (NMFECC). These codes, principally electromagnetic field solvers, are used for the analysis and design of electromagnetic components for accelerators, e.g., magnets, rf structures, pickups, etc. In this paper, the status and future of the installed codes will be discussed with emphasis on an experimental version of one set of codes, POISSON/SUPERFISH

  14. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    Science.gov (United States)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
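
    As a rough illustration of the concatenation idea (and not Lee's actual scheme), the sketch below feeds a byte-oriented outer code into a small rate-1/2 convolutional inner encoder. The single XOR parity byte merely stands in for the Reed-Solomon outer code, and the (7,5) octal generators are a common textbook choice rather than the unit-memory code analysed in the paper.

    ```python
    # Toy concatenated coding chain: byte-oriented outer code -> bit-oriented
    # inner convolutional encoder. Illustrative only; see the caveats above.

    def outer_encode(data: bytes) -> bytes:
        """Append one XOR parity byte over the block (stand-in for an RS outer code)."""
        parity = 0
        for b in data:
            parity ^= b
        return data + bytes([parity])

    def bytes_to_bits(data: bytes):
        return [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]

    def inner_encode(bits, g1=0b111, g2=0b101, k=3):
        """Rate-1/2 convolutional encoder, constraint length 3 (generators 7, 5 octal)."""
        state, out = 0, []
        for bit in bits + [0] * (k - 1):          # zero tail flushes the register
            state = ((state << 1) | bit) & ((1 << k) - 1)
            out.append(bin(state & g1).count("1") % 2)
            out.append(bin(state & g2).count("1") % 2)
        return out

    message = b"HELLO"
    channel_bits = inner_encode(bytes_to_bits(outer_encode(message)))
    print(len(channel_bits), "channel bits for", len(message), "message bytes")
    ```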

  15. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fuelling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a method to increase SMBI fuelling efficiency in H-mode plasma, especially for ITER, it is important to first verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully achieved for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of the BOUT++ code and the TPSMBI code agree very well with each other. Different upwind schemes have been compared for handling the sharp gradient front region during the inward propagation of SMBI, for the sake of code stability. The influence of the WENO3 (weighted essentially non-oscillatory) and third-order upwind schemes on the benchmark results is also discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and third-order upwind schemes on the benchmark results is also discussed.

  16. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
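
    As a small worked companion to the Hamming codes mentioned above, the sketch below encodes 4 data bits into the binary Hamming(7,4) code and corrects a single flipped bit from its syndrome. The systematic generator and parity-check matrices are one standard choice, used here purely for illustration.

    ```python
    # Hamming(7,4): encode 4 data bits, then locate and fix a single bit error.
    import numpy as np

    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])      # systematic generator matrix
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])      # parity-check matrix, G @ H.T = 0 (mod 2)

    def encode(data_bits):
        return (np.array(data_bits) @ G) % 2

    def correct(received):
        syndrome = (H @ received) % 2
        if syndrome.any():                      # non-zero syndrome points at the error
            for pos in range(7):
                if np.array_equal(H[:, pos], syndrome):
                    received = received.copy()
                    received[pos] ^= 1
                    break
        return received

    codeword = encode([1, 0, 1, 1])
    corrupted = codeword.copy()
    corrupted[2] ^= 1                           # flip one bit
    assert np.array_equal(correct(corrupted), codeword)
    ```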

  17. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct
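
    For readers unfamiliar with prime codes, the snippet below builds one commonly cited construction: for a prime p, user i's codeword has length p·p and weight p, with the single pulse of block j placed at chip (i·j) mod p. This is an illustrative sketch of the general idea, not the book's notation; the low cross-correlation between distinct users is what makes such code families attractive for optical CDMA.

    ```python
    # One common prime-code construction and a cross-correlation check (toy demo).

    def prime_code(p: int, i: int):
        """0/1 chip sequence of user i in a prime code over GF(p), p prime."""
        chips = [0] * (p * p)
        for j in range(p):                     # block index
            chips[j * p + (i * j) % p] = 1     # one pulse per block
        return chips

    def max_cross_correlation(a, b):
        """Largest periodic cross-correlation value over all cyclic shifts."""
        n = len(a)
        return max(sum(a[t] * b[(t + s) % n] for t in range(n)) for s in range(n))

    p = 5
    codes = [prime_code(p, i) for i in range(p)]
    # Cross-correlation between distinct users stays far below the
    # autocorrelation peak p, which is what allows code-division multiplexing.
    print(max_cross_correlation(codes[1], codes[3]), "vs peak", p)
    ```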

  18. Limitation and facilitation of one of the world's most invasive fish: an intercontinental comparison

    Science.gov (United States)

    Budy, Phaedra E.; Thiede, Gary P.; Lobón-Cerviá, Javier; Fernandez, Gustavo Gonzolez; McHugh, Peter; McIntosh, Angus; Vøllestad, Lief Asbjørn; Becares, Eloy; Jellyman, Phillip

    2013-01-01

    Purposeful species introductions offer opportunities to inform our understanding of both invasion success and conservation hurdles. We evaluated factors determining the energetic limitations of brown trout (Salmo trutta) in both their native and introduced ranges. Our focus was on brown trout because they are nearly globally distributed, considered one of the world's worst invaders, yet imperiled in much of their native habitat. We synthesized and compared data describing temperature regime, diet, growth, and maximum body size across multiple spatial and temporal scales, from country (both exotic and native habitats) and major geographic area (MGA) to rivers and years within MGA. Using these data as inputs, we next used bioenergetic efficiency (BioEff), a relative scalar representing a realized percentage of maximum possible consumption (0–100%) as our primary response variable and a multi-scale, nested, mixed statistical model (GLIMMIX) to evaluate variation among and within spatial scales and as a function of density and elevation. MGA and year (the residual) explained the greatest proportion of variance in BioEff. Temperature varied widely among MGA and was a strong driver of variation in BioEff. We observed surprisingly little variation in the diet of brown trout, except the overwhelming influence of the switch to piscivory observed only in exotic MGA. We observed only a weak signal of density-dependent effects on BioEff; however, BioEff remained 2.5 fish/m2. The trajectory of BioEff across the life span of the fish elucidated the substantial variation in performance among MGAs; the maximum body size attained by brown trout was consistently below 400 mm in native habitat but reached 600 mm outside their native range, where brown trout grew rapidly, feeding in part on naive prey fishes. The integrative, physiological approach, in combination with the intercontinental and comparative nature of our study, allowed us to overcome challenges associated with context

  19. The U-Pu inspector, a new instrument to determine the isotopic compositions of uranium and plutonium

    International Nuclear Information System (INIS)

    Verplancke, J.; Van Dyck, R.; Tench, O.; Sielaff, B.

    1994-01-01

    The U/Pu-InSpector is a new integrated, portable instrument that can measure the isotopic composition of samples containing uranium and/or plutonium without prior calibration and without the need for skilled operators. It consists of a Low Energy Germanium detector in a Multi-attitude Cryostat (MAC). A shield and collimator are built-in, directly around the detector element, reducing the weight of this detector and shield to approximately 8 kg with a full dewar. The dewar can quickly and easily be filled with a self-pressurizing funnel. The detector is connected to a small portable battery operated analyzer and a Notebook computer. The spectra are automatically stored and analyzed with the help of the MGA codes for plutonium and/or for uranium. 5 refs., 1 fig

  20. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from
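
    As a purely illustrative sketch of the hotspot idea (and not the authors' framework, which additionally weights sites by evolutionary conservation), one can scan fixed windows of a genome and score each window's mutation count against a uniform Poisson background:

    ```python
    # Sliding-window hotspot scan under a uniform Poisson background model.
    # The positions and window size below are toy values for the demo only.
    from scipy.stats import poisson

    genome_length = 1_000_000
    mutations = [12_345, 12_380, 12_401, 12_430, 500_000, 750_123]   # toy positions
    window = 100
    background = len(mutations) / genome_length * window   # expected hits per window

    def window_pvalues(positions, window_size):
        pvals = {}
        for start in range(0, genome_length, window_size):
            count = sum(start <= p < start + window_size for p in positions)
            if count:
                pvals[start] = poisson.sf(count - 1, background)   # P(X >= count)
        return pvals

    for start, p in sorted(window_pvalues(mutations, window).items(), key=lambda kv: kv[1]):
        print(start, p)   # windows overlapping the cluster near 12,400 come out first
    ```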

  1. On the Performance of a Multi-Edge Type LDPC Code for Coded Modulation

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    We present a method to combine error-correction coding and spectral-efficient modulation for transmission over the Additive White Gaussian Noise (AWGN) channel. The code employs signal shaping which can provide a so-called shaping gain. The code belongs to the family of sparse graph codes for which

  2. Transsphenoidal Surgery for Mixed Pituitary Gangliocytoma-Adenomas.

    Science.gov (United States)

    Shepard, Matthew J; Elzoghby, Mohamed A; Ghanim, Daffer; Lopes, M Beatriz S; Jane, John A

    2017-12-01

    Most sellar gangliocytomas are discovered with a concurrent pituitary adenoma, also known as a mixed gangliocytoma-adenoma (MGA). MGAs are rare, with fewer than 100 cases reported in the literature to date and only 1 previously documented surgical series. Because MGAs are radiologically indistinguishable from pituitary adenomas, they are often diagnosed after surgery. Combined with the paucity of clinical outcome data for these tumors, this makes their diagnosis and management challenging. Here we describe the clinical presentation and outcomes of 10 individuals who were diagnosed with a MGA at a single institution. This retrospective case series study included patients diagnosed with a combined sellar MGA between 1993 and 2016. This series comprised 10 patients, mean age of 44 years (range, 28-63 years) diagnosed with an MGA. The mean tumor size was 1.6 cm (range, 0.4-2.4 cm). Five patients presented with acromegaly, and 1 patient had recurrent Cushing disease. Transsphenoidal surgery was performed in all cases, and gross total resection was achieved in 7 patients (70%). Histologically, 9 of the 10 MGAs were identified as mixed somatotroph adenoma-gangliocytomas. The median duration of follow-up was 74 months (range, 2-180 months). Following adjuvant treatment (n = 3), all patients with acromegaly (n = 4) achieved biochemical remission, and no patient experienced recurrence of the pituitary tumor with a median radiographic follow-up of 48 months. MGAs are often associated with a hypersecretory adenoma. Transsphenoidal surgery is well tolerated by most patients, and when performed in combination with adjuvant therapy, a low rate of recurrence and reversal of preoperative endocrinopathy can be expected. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  4. The Regulatory Small RNA MarS Supports Virulence of Streptococcus pyogenes.

    Science.gov (United States)

    Pappesch, Roberto; Warnke, Philipp; Mikkat, Stefan; Normann, Jana; Wisniewska-Kucper, Aleksandra; Huschka, Franziska; Wittmann, Maja; Khani, Afsaneh; Schwengers, Oliver; Oehmcke-Hecht, Sonja; Hain, Torsten; Kreikemeyer, Bernd; Patenge, Nadja

    2017-09-25

    Small regulatory RNAs (sRNAs) play a role in the control of bacterial virulence gene expression. In this study, we investigated an sRNA that was identified in Streptococcus pyogenes (group A Streptococcus, GAS) but is conserved throughout various streptococci. In a deletion strain, expression of mga, the gene encoding the multiple virulence gene regulator, was reduced. Accordingly, transcript and proteome analyses revealed decreased expression of several Mga-activated genes. Therefore, and because the sRNA was shown to interact with the 5' UTR of the mga transcript in a gel-shift assay, we designated it MarS for mga-activating regulatory sRNA. Down-regulation of important virulence factors, including the antiphagocytic M-protein, led to increased susceptibility of the deletion strain to phagocytosis and reduced adherence to human keratinocytes. In a mouse infection model, the marS deletion mutant showed reduced dissemination to the liver, kidney, and spleen. Additionally, deletion of marS led to increased tolerance towards oxidative stress. Our in vitro and in vivo results indicate a modulating effect of MarS on virulence gene expression and on the pathogenic potential of GAS.

  5. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  6. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64 bit on Mac, Linux and Windows.

  7. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
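
    The excerpt stops short of writing the matrix down, so here is a toy example (not taken from the article): a small sparse parity-check matrix H, the membership test H·r = 0 (mod 2), and one round of the bit-flipping decoder whose low cost is exactly what the sparsity of H buys.

    ```python
    # Toy parity-check matrix, syndrome test, and one bit-flipping round.
    import numpy as np

    # 4 checks on 6 bits; every bit sits in exactly two checks. Real LDPC
    # matrices are far larger and sparser, but the mechanics are the same.
    H = np.array([[1, 1, 1, 0, 0, 0],
                  [1, 0, 0, 1, 1, 0],
                  [0, 1, 0, 1, 0, 1],
                  [0, 0, 1, 0, 1, 1]])

    def syndrome(r):
        return (H @ r) % 2                     # all-zero syndrome <=> r is a codeword

    def bit_flip_round(r):
        """Flip the bit that participates in the most unsatisfied checks."""
        s = syndrome(r)
        if not s.any():
            return r
        votes = H.T @ s                        # unsatisfied checks touching each bit
        r = r.copy()
        r[int(np.argmax(votes))] ^= 1
        return r

    received = np.zeros(6, dtype=int)          # the all-zero word is a codeword...
    received[4] ^= 1                           # ...hit by a single channel error
    print(syndrome(received), syndrome(bit_flip_round(received)))   # error cleared
    ```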

  8. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...
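
    As a toy illustration of what packet-level erasure correction buys (a single XOR repair packet per generation here; the shim in the paper builds on a more general network-coding construction that this sketch does not reproduce), any one lost packet can be rebuilt from the packets that did arrive:

    ```python
    # One XOR repair packet per generation recovers any single lost packet.

    def xor_packets(packets):
        size = max(len(p) for p in packets)
        out = bytearray(size)
        for packet in packets:
            for i, byte in enumerate(packet):
                out[i] ^= byte
        return bytes(out)

    generation = [b"seg-aaaa", b"seg-bbbb", b"seg-cccc"]   # equal-sized payloads
    repair = xor_packets(generation)                       # sent alongside the data

    # Receiver: packet 1 was lost; XOR of everything that arrived restores it.
    survivors = [generation[0], generation[2], repair]
    assert xor_packets(survivors) == generation[1]
    ```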

  9. The pressure-temperature phase diagram of the pressure-induced organic superconductors β-(BDA-TTP)2MCl4 (M = Ga, Fe)

    Science.gov (United States)

    Choi, E. S.; Graf, D.; Brooks, J. S.; Yamada, J.; Tokumoto, M.

    2004-04-01

    We investigate the pressure-temperature phase diagram of β-(BDA-TTP)2MCl4 (M = Ga, Fe), which shows a metal-insulator (MI) transition around 120 K at ambient pressure. By applying pressure, the insulating phase is suppressed. When the pressure is higher than 5.5 kbar, the superconducting phase appears in both salts, with Tc ~ 3 K for M = Ga and 2.2 K for M = Fe. We also observed Shubnikov-de Haas (SdH) oscillations at high magnetic field in both salts, where the SdH frequencies are found to be very similar to each other. Key words: organic superconductor, pressure, phase diagram.

  10. Utility experience in code updating of equipment built to 1974 code, Section 3, Subsection NF

    International Nuclear Information System (INIS)

    Rao, K.R.; Deshpande, N.

    1990-01-01

    This paper addresses changes to ASME Code Subsection NF and reconciles the differences between the updated codes and the as-built construction code, ASME Section III, 1974, to which several nuclear plants have been built. Since Section III is revised every three years and replacement parts complying with the construction code are invariably not available from the plant stock inventory, parts must be procured from vendors who comply with the requirements of the latest codes. Aspects of the ASME code which reflect Subsection NF are identified and compared with the later Code editions and addenda, especially up to and including the 1974 ASME code used as the basis for the plant qualification. The concern of the regulatory agencies is that if later code allowables and provisions are adopted it is possible to reduce the safety margins of the construction code. Areas of concern are highlighted, and the specific changes in later codes are identified whose adoption would not sacrifice the intended safety margins of the codes to which plants are licensed.

  11. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  12. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  13. ETR/ITER systems code

    International Nuclear Information System (INIS)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs

  14. Coding and decoding for code division multiple user communication systems

    Science.gov (United States)

    Healy, T. J.

    1985-01-01

    A new algorithm is introduced which decodes code division multiple user communication signals. The algorithm makes use of the distinctive form or pattern of each signal to separate it from the composite signal created by the multiple users. Although the algorithm is presented in terms of frequency-hopped signals, the actual transmitter modulator can use any of the existing digital modulation techniques. The algorithm is applicable to error-free codes or to codes where controlled interference is permitted. It can be used when block synchronization is assumed, and in some cases when it is not. The paper also discusses briefly some of the codes which can be used in connection with the algorithm, and relates the algorithm to past studies which use other approaches to the same problem.
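
    Because the abstract describes the decoder only in general terms, the following is a deliberately simplified, generic demonstration of code-division separation using synchronous direct-sequence spreading with orthogonal codes (rather than the frequency-hopped signals treated in the paper): each user's bits are recovered from the composite signal by correlating against that user's known pattern.

    ```python
    # Synchronous DS-CDMA with orthogonal length-4 spreading codes (toy demo).
    import numpy as np

    codes = {
        "user_a": np.array([+1, +1, +1, +1]),
        "user_b": np.array([+1, -1, +1, -1]),
    }
    bits = {"user_a": [1, 0, 1], "user_b": [0, 0, 1]}

    def spread(user):
        symbols = np.array([1 if b else -1 for b in bits[user]])
        return np.concatenate([s * codes[user] for s in symbols])

    composite = sum(spread(u) for u in codes)      # what the channel carries

    def despread(user):
        chips = composite.reshape(-1, len(codes[user]))
        correlations = chips @ codes[user]         # per-symbol correlation
        return [1 if c > 0 else 0 for c in correlations]

    assert despread("user_a") == bits["user_a"]
    assert despread("user_b") == bits["user_b"]
    ```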

  15. Hermitian self-dual quasi-abelian codes

    Directory of Open Access Journals (Sweden)

    Herbert S. Palines

    2017-12-01

    Quasi-abelian codes constitute an important class of linear codes containing theoretically and practically interesting codes such as quasi-cyclic codes, abelian codes, and cyclic codes. In particular, the sub-class consisting of 1-generator quasi-abelian codes contains large families of good codes. Based on the well-known decomposition of quasi-abelian codes, the characterization and enumeration of Hermitian self-dual quasi-abelian codes are given. In the case of 1-generator quasi-abelian codes, we offer necessary and sufficient conditions for such codes to be Hermitian self-dual and give a formula for the number of these codes. In the case where the underlying groups are some $p$-groups, the actual number of resulting Hermitian self-dual quasi-abelian codes is determined.

  16. Hominoid-specific de novo protein-coding genes originating from long non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Chen Xie

    2012-09-01

    Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA-Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level.

  17. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four...

  18. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
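
    As a concrete reminder of the notation (a classical example, not one of the 22 new codes of the paper), the ternary tetracode is a $[4,2,3]_3$ code; the snippet below generates all nine codewords from its generator matrix over $GF(3)$ and confirms the minimum distance by brute force.

    ```python
    # The [4,2,3]_3 tetracode: enumerate codewords and check the minimum distance.
    from itertools import product

    G = [(1, 0, 1, 1),
         (0, 1, 1, 2)]                       # generator matrix over GF(3)

    def codeword(msg):
        return tuple(sum(m * g for m, g in zip(msg, col)) % 3 for col in zip(*G))

    codewords = {codeword(m) for m in product(range(3), repeat=2)}
    min_dist = min(sum(c != 0 for c in w) for w in codewords if any(w))
    print(len(codewords), "codewords, minimum distance", min_dist)    # 9, 3
    ```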

  19. CITOPP, CITMOD, CITWI, Processing codes for CITATION Code

    International Nuclear Information System (INIS)

    Albarhoum, M.

    2008-01-01

    Description of program or function: CITOPP processes the output file of the CITATION 3-D diffusion code. The program can plot axial, radial and circumferential flux distributions (in cylindrical geometry) in addition to the multiplication factor convergence. The flux distributions can be drawn for each group specified by the program and visualized on the screen. CITMOD processes both the output and the input files of the CITATION 3-D diffusion code. CITMOD can visualize both axial and radial-angular models of the reactor described by CITATION input/output files. CITWI processes the input file (CIT.INP) of the CITATION 3-D diffusion code. CIT.INP is processed to deduce the dimensions of the cell whose cross sections can be representative of the correspondingly named reactor component in section 008 of CIT.INP.

  20. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.
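
    USEEM's coder is adaptive and its details are not reproduced here, but the principle it relies on, replacing recognised characters by short prefix-free bit strings instead of transmitting raster images, can be illustrated with a plain static Huffman code:

    ```python
    # Static Huffman coding of text (a generic lossless-coding demo, not USEEM).
    import heapq
    from collections import Counter

    def huffman_code(text):
        """Return a prefix-free bit-string code for each symbol in `text`."""
        heap = [[freq, idx, sym] for idx, (sym, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            heapq.heappush(heap, [lo[0] + hi[0], next_id, [lo, hi]])
            next_id += 1
        codes = {}
        def walk(node, prefix):
            payload = node[2]
            if isinstance(payload, list):          # internal node: recurse
                walk(payload[0], prefix + "0")
                walk(payload[1], prefix + "1")
            else:                                  # leaf: assign the bit string
                codes[payload] = prefix or "0"
        walk(heap[0], "")
        return codes

    text = "electronic mail needs noiseless coding"
    table = huffman_code(text)
    encoded = "".join(table[c] for c in text)
    print(len(encoded), "bits coded vs", 8 * len(text), "bits as plain bytes")
    ```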

  1. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  2. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code in its various versions is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  3. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of the QR Code and the benefits of using QR Codes for companies. The first objective is to study general information about QR Codes in order to help readers understand them in detail. The second objective is to explore and analyze the essential and feasible technologies of QR Codes in order to clarify how they work. Additionally, this research work through QR Code best practices t...

  4. Coding training for medical students: How good is diagnoses coding with ICD-10 by novices?

    Directory of Open Access Journals (Sweden)

    Stausberg, Jürgen

    2005-04-01

    Teaching knowledge and competence in documentation and coding is an essential part of medical education. Coding training was therefore placed within the course on epidemiology, medical biometry, and medical informatics. From this, we can draw conclusions about the quality of coding by novices. One hundred and eighteen students coded diagnoses from 15 nephrological cases as homework. In addition to interrater reliability, validity was calculated by comparison with a reference coding. On the level of terminal codes, 59.3% of the students' results were correct. The completeness was calculated as 58.0%. The results on the chapter level increased to 91.5% and 87.7%, respectively. For the calculation of reliability, a new, simple measure was developed that leads to values of 0.46 on the level of terminal codes and 0.87 on the chapter level for interrater reliability. The figures of concordance with the reference coding are quite similar. In contrast, routine data show considerably lower results, with 0.34 and 0.63 respectively. Interrater reliability and validity of coding by novices are as good as coding by experts. The lack of an advantage for experts could be explained by the workload of documentation and a negative attitude towards coding on the one hand. On the other hand, coding in a DRG system is handicapped by a large number of detailed coding rules, which do not lead to uniform results but rather to wrong and random codes. In any case, students left the course well prepared for coding.

  5. Turbo coding, turbo equalisation and space-time coding for transmission over fading channels

    CERN Document Server

    Hanzo, L; Yeap, B

    2002-01-01

    Against the backdrop of the emerging 3G wireless personal communications standards and broadband access network standard proposals, this volume covers a range of coding and transmission aspects for transmission over fading wireless channels. It presents the most important classic channel coding issues and also the exciting advances of the last decade, such as turbo coding, turbo equalisation and space-time coding. It endeavours to be the first book with explicit emphasis on channel coding for transmission over wireless channels. Divided into 4 parts: Part 1 - explains the necessary background for novices. It aims to be both an easy reading text book and a deep research monograph. Part 2 - provides detailed coverage of turbo conventional and turbo block coding considering the known decoding algorithms and their performance over Gaussian as well as narrowband and wideband fading channels. Part 3 - comprehensively discusses both space-time block and space-time trellis coding for the first time in literature. Par...

  6. Decoding Xing-Ling codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2002-01-01

    This paper describes an efficient decoding method for a recent construction of good linear codes as well as an extension to the construction. Furthermore, asymptotic properties and list decoding of the codes are discussed.

  7. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  8. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  9. Error-correction coding for digital communications

    Science.gov (United States)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
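
    Because the abstract singles out convolutional codes and Viterbi decoding, here is a compact hard-decision Viterbi decoder for a toy rate-1/2, constraint-length-3 code (generators 7 and 5 octal). It is an illustrative trellis search, not an excerpt from the book; the inline encoder exists only so the demo is self-contained.

    ```python
    # Hard-decision Viterbi decoding of a rate-1/2, K=3 convolutional code.
    G1, G2, K = 0b111, 0b101, 3               # generator polynomials, constraint length
    N_STATES = 1 << (K - 1)

    def step(state, bit):
        """One encoder step: returns the two output bits and the next state."""
        reg = ((state << 1) | bit) & ((1 << K) - 1)
        out = (bin(reg & G1).count("1") % 2, bin(reg & G2).count("1") % 2)
        return out, reg & (N_STATES - 1)

    def encode(bits):
        state, out = 0, []
        for bit in bits + [0] * (K - 1):       # zero tail flushes the register
            pair, state = step(state, bit)
            out.append(pair)
        return out

    def viterbi_decode(received_pairs):
        INF = float("inf")
        metrics = [0] + [INF] * (N_STATES - 1)         # start in the all-zero state
        paths = [[] for _ in range(N_STATES)]
        for r in received_pairs:
            new_metrics = [INF] * N_STATES
            new_paths = [None] * N_STATES
            for state in range(N_STATES):
                if metrics[state] == INF:
                    continue
                for bit in (0, 1):
                    out, nxt = step(state, bit)
                    cand = metrics[state] + (out[0] != r[0]) + (out[1] != r[1])
                    if cand < new_metrics[nxt]:
                        new_metrics[nxt] = cand
                        new_paths[nxt] = paths[state] + [bit]
            metrics, paths = new_metrics, new_paths
        best = min(range(N_STATES), key=metrics.__getitem__)
        return paths[best]

    message = [1, 0, 1, 1, 0]
    received = encode(message)
    received[2] = (received[2][0] ^ 1, received[2][1])    # inject one channel error
    print(viterbi_decode(received)[:len(message)] == message)   # True
    ```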

  10. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  11. MARS code manual volume I: code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Yoon, Churl

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF codes. The method of integration of the two codes is based on the dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for the light water was unified by replacing the EOS of COBRA-TF by that of the RELAP5. This theory manual provides a complete list of overall information of code structure and major function of MARS including code architecture, hydrodynamic model, heat structure, trip / control system and point reactor kinetics model. Therefore, this report would be very useful for the code users. The overall structure of the manual is modeled on the structure of the RELAP5 and as such the layout of the manual is very similar to that of the RELAP. This similitude to RELAP5 input is intentional as this input scheme will allow minimum modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible.

  12. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.

  13. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled by the Nuclear Code Committee to exchange information on nuclear code developments among members of the committee. Enlarging the collection, the present edition includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  14. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base consisting of thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously while others solve the equations separately from each other. The coupled codes require a large computer capacity and have thus as yet limited use. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)

  15. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  16. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  18. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  19. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
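
    The core operation of such a least-squares evaluation can be sketched in a few lines: a generic generalised least-squares combination of correlated measurements (not FERRET's actual input format or algorithms), where a linear model y = A·x and a measurement covariance V yield both the estimate and its uncertainty.

    ```python
    # Generalised least squares: combine correlated measurements of one quantity.
    import numpy as np

    A = np.array([[1.0], [1.0], [1.0]])        # three measurements of the same quantity
    y = np.array([10.2, 9.8, 10.5])            # measured values (toy numbers)
    V = np.array([[0.04, 0.01, 0.00],          # measurement covariance (correlated)
                  [0.01, 0.04, 0.00],
                  [0.00, 0.00, 0.09]])

    Vinv = np.linalg.inv(V)
    cov_x = np.linalg.inv(A.T @ Vinv @ A)      # covariance of the combined estimate
    x_hat = cov_x @ A.T @ Vinv @ y             # generalised least-squares estimate
    print(x_hat[0], np.sqrt(cov_x[0, 0]))
    ```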

  20. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, which is followed by comprehensive research on both the QR code itself and related issues. From the research a solution taking advantage of cloud and cryptography together with an implementation come af...

  1. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation from the computer's materiality. Cramer is thus the voice of a new 'code avant-garde'. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: "art-oriented programming needs to acknowledge the conditions of its own making – its poesis." By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  2. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.

  3. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes require to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)

  4. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    Science.gov (United States)

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  5. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2F, SMART and SQUALE and is used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction by the China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computational code of the SCIENCE code package, including the description of the general structure of the code package, the coupling relationship between the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  6. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  7. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  8. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  9. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, the Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visually pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  10. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.

  11. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  12. Beam-dynamics codes used at DARHT

    Energy Technology Data Exchange (ETDEWEB)

    Ekdahl, Jr., Carl August [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production – the Tricomp Trak orbit tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration – the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; for coasting-beam transport to target – the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  13. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the

  14. Nitrogen-doped multiple graphene aerogel/gold nanostar as the electrochemical sensing platform for ultrasensitive detection of circulating free DNA in human serum.

    Science.gov (United States)

    Ruiyi, Li; Ling, Liu; Hongxia, Bei; Zaijun, Li

    2016-05-15

    Graphene aerogel has attracted increasing attention due to its large specific surface area, high conductivity and electronic interaction. The paper reports a facile synthesis of nitrogen-doped multiple graphene aerogel/gold nanostar (termed N-doped MGA/GNS) and its use as an electrochemical sensing platform for the detection of double-stranded DNA (dsDNA). On the one hand, the N-doped MGA offers a much better electrochemical performance compared with classical graphene aerogel. Interestingly, the performance can be enhanced simply by increasing the cycle number of graphene oxide gelation. On the other hand, the hybridization with GNS further enhances the electrocatalytic activity towards Fe(CN)6^(3-/4-). In addition, the N-doped MGA/GNS provides a well-defined three-dimensional architecture. The unique structure makes it easy to combine with dsDNA to form the electroactive bioconjugate. The integration not only triggers ultrafast DNA electron and charge transfer, but also realizes a significant synergy between N-doped MGA, GNS and dsDNA. As a result, the electrochemical sensor based on the hybrid exhibits a highly sensitive differential pulse voltammetric (DPV) response towards dsDNA. The DPV signal increases linearly with dsDNA concentration in the range from 1.0×10^-21 g ml^-1 to 1.0×10^-16 g ml^-1, with a detection limit of 3.9×10^-22 g ml^-1 (S/N=3). The sensitivity is much higher than that of all reported DNA sensors. The analytical method was successfully applied in the electrochemical detection of circulating free DNA in human serum. The study also opens a window on the electrical properties of multiple graphene aerogel and DNA, as well as their hybrids, to meet the needs of further applications as special nanoelectronics in molecular diagnosis, bioanalysis and catalysis. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    Energy Technology Data Exchange (ETDEWEB)

    Camous, F.; Jacq, F.; Chatelard, P. [IPSN/DRS/SEMAR CE-Cadarache, St Paul Lez Durance (France)] [and others

    1997-07-01

    In order to describe with the same code the whole sequence of severe LWR accidents, up to the vessel failure, the Institute of Protection and Nuclear Safety has performed a coupling of the severe accident code ICARE2 to the thermalhydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all the phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  16. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
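
    The SEARCH/TREND/INVERT/AVERAGE programs themselves are not reproduced in this record. As a hedged sketch of the "stabilized linear inverse theory" the abstract refers to, the Python example below solves a damped (Tikhonov-regularized) least-squares problem for a synthetic linear gravity model; the forward matrix, noise level and damping are all assumptions, not the Nevada Test Site setup.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic linear forward model d = G @ m + noise:
      # m is a discretized density/topography model, d the Bouguer gravity data.
      n_data, n_model = 40, 60
      G = rng.normal(size=(n_data, n_model))
      m_true = np.zeros(n_model)
      m_true[25:35] = 1.0                                   # a simple anomaly
      d_obs = G @ m_true + 0.05 * rng.normal(size=n_data)

      # Stabilized (damped least-squares / Tikhonov) inversion:
      #   minimize ||G m - d||^2 + lam * ||m||^2
      lam = 1.0
      m_est = np.linalg.solve(G.T @ G + lam * np.eye(n_model), G.T @ d_obs)

      print("data misfit :", float(np.linalg.norm(G @ m_est - d_obs)))
      print("model error :", float(np.linalg.norm(m_est - m_true)))

    An AVERAGE-style covariance analysis would then propagate the data noise through the same regularized operator to attach uncertainties to the recovered model.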

  17. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  18. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code; at the same time, it allows easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear [8, 2] code was used as the basis for the modification. The proposed algorithm was implemented and its parameters were evaluated in practice on test images. Keywords: steganography, random linear codes, RLC, LSB
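
    The abstract does not spell out the embedding step, so the Python sketch below is only a hedged illustration of syndrome coding with a random systematic [8, 2] binary code (parameters and names are ours, not the paper's): 6 message bits are hidden in the LSBs of 8 cover samples while flipping as few LSBs as possible.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(1)
      n, k = 8, 2                                     # random linear [8, 2] code

      # Systematic generator G = [I_k | P] and parity-check H = [P^T | I_(n-k)] over GF(2).
      P = rng.integers(0, 2, size=(k, n - k))
      G = np.hstack([np.eye(k, dtype=int), P])
      H = np.hstack([P.T, np.eye(n - k, dtype=int)])
      codewords = [(np.array(u) @ G) % 2 for u in product([0, 1], repeat=k)]

      def embed(cover_lsb, message):
          """Return stego LSBs with H @ stego = message (mod 2), flipping few bits."""
          target = (message + H @ cover_lsb) % 2                   # syndrome difference to realize
          e0 = np.concatenate([np.zeros(k, dtype=int), target])    # particular solution: last n-k columns of H form I
          e = min(((e0 + c) % 2 for c in codewords), key=lambda v: int(v.sum()))
          return (cover_lsb + e) % 2

      def extract(stego_lsb):
          return (H @ stego_lsb) % 2

      cover = rng.integers(0, 2, size=n)              # LSB plane of 8 cover pixels
      msg = rng.integers(0, 2, size=n - k)            # 6 message bits to hide
      stego = embed(cover, msg)
      assert np.array_equal(extract(stego), msg)
      print("LSBs flipped:", int(((stego + cover) % 2).sum()), "of", n)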

  19. Status of reactor core design code system in COSINE code package

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.; Yu, H.; Liu, Z., E-mail: yuhui@snptc.com.cn [State Nuclear Power Software Development Center, SNPTC, National Energy Key Laboratory of Nuclear Power Software (NEKLS), Beijiing (China)

    2014-07-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  20. Status of reactor core design code system in COSINE code package

    International Nuclear Information System (INIS)

    Chen, Y.; Yu, H.; Liu, Z.

    2014-01-01

    For self-reliance, COre and System INtegrated Engine for design and analysis (COSINE) code package is under development in China. In this paper, recent development status of the reactor core design code system (including the lattice physics code and the core simulator) is presented. The well-established theoretical models have been implemented. The preliminary verification results are illustrated. And some special efforts, such as updated theory models and direct data access application, are also made to achieve better software product. (author)

  1. Further Generalisations of Twisted Gabidulin Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Rosenkilde, Johan Sebastian Heesemann; Sheekey, John

    2017-01-01

    We present a new family of maximum rank distance (MRD) codes. The new class contains codes that are neither equivalent to a generalised Gabidulin nor to a twisted Gabidulin code, the only two known general constructions of linear MRD codes.

  2. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, which is a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work is required to validate the applicability of the thermal hydraulic models within the code. Among the models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, results calculated to validate the gap conductance model are demonstrated by comparing with the results of the MARS code for the test case.

  3. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.
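
    The report's state-level methodology is more detailed than can be shown here. Purely as a schematic (all numbers below are invented, not PNNL's) of how delayed adoption and incomplete compliance erode the potential savings of a new model code, consider the following Python sketch.

      # Schematic first-year site-energy savings from one code cycle:
      # new floor area x (baseline EUI - code EUI) x adoption share x compliance rate.
      new_floor_area_sqft = 50e6     # new construction in a state-year (hypothetical)
      baseline_eui = 60.0            # kBtu per sqft-year under the old code (hypothetical)
      code_eui = 52.0                # kBtu per sqft-year under the new model code (hypothetical)
      adoption_share = 0.7           # fraction of construction in jurisdictions that adopted the code
      compliance_rate = 0.8          # fraction of adopted-code buildings actually meeting it

      savings_kbtu = new_floor_area_sqft * (baseline_eui - code_eui) * adoption_share * compliance_rate
      print(f"first-year savings ~ {savings_kbtu / 1e9:.2f} billion kBtu")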

  4. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  5. Application of Quantum Gauss-Jordan Elimination Code to Quantum Secret Sharing Code

    Science.gov (United States)

    Diep, Do Ngoc; Giang, Do Hoang; Phu, Phan Huy

    2018-03-01

    The QSS codes associated with an MSP code are based on finding an invertible matrix V, solving the system V·A^T_{M_B}(s, a) = s. We propose a quantum Gauss-Jordan Elimination Procedure to produce such a pivotal matrix V by using the Grover search code. The complexity of solving is of square-root order of the cardinal number of the unauthorized set, √(2^{|B|}).
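
    As context for the final complexity claim (a general property of Grover search, not a result specific to this paper): classical exhaustive search over the 2^{|B|} candidates tied to an unauthorized set B needs on the order of

      O(2^{|B|}) oracle queries, whereas Grover search needs only O(√(2^{|B|})) = O(2^{|B|/2}) queries,

    which is the square-root speed-up quoted in the abstract.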

  6. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  7. The Coding Causes of Death in HIV (CoDe) Project: initial results and evaluation of methodology

    DEFF Research Database (Denmark)

    Kowalska, Justyna D; Friis-Møller, Nina; Kirk, Ole

    2011-01-01

    The Coding Causes of Death in HIV (CoDe) Project aims to deliver a standardized method for coding the underlying cause of death in HIV-positive persons, suitable for clinical trials and epidemiologic studies.

  8. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
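
    A hedged numpy illustration of the two codings discussed above (the genotype matrix is made up): the 0/1/2 allele-count coding and the centered coding obtained by subtracting a per-marker value. The two codings give different genomic relationship matrices, although, as the abstract notes, breeding-value estimates are unaffected when the model includes a fixed general mean.

      import numpy as np

      # Hypothetical genotypes: rows = animals, columns = markers, entries = allele counts (0, 1, 2).
      Z = np.array([[0, 1, 2],
                    [1, 1, 0],
                    [2, 0, 1],
                    [1, 2, 1]], dtype=float)

      # Centered allele coding: subtract a per-marker value (here the column mean)
      # so the coded covariates average to zero within each marker.
      Z_centered = Z - Z.mean(axis=0)

      # Different codings lead to different genomic relationship matrices.
      print("raw      G:\n", Z @ Z.T)
      print("centered G:\n", Z_centered @ Z_centered.T)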

  9. Towers of generalized divisible quantum codes

    Science.gov (United States)

    Haah, Jeongwan

    2018-04-01

    A divisible binary classical code is one in which every code word has weight divisible by a fixed integer. If the divisor is 2^ν for a positive integer ν, then one can construct a Calderbank-Shor-Steane (CSS) code, where the X-stabilizer space is the divisible classical code, that admits a transversal gate in the ν-th level of the Clifford hierarchy. We consider a generalization of the divisibility by allowing a coefficient vector of odd integers with which every code word has zero dot product modulo the divisor. In this generalized sense, we construct a CSS code with divisor 2^{ν+1} and code distance d from any CSS code of code distance d and divisor 2^ν where the transversal X is a nontrivial logical operator. The encoding rate of the new code is approximately d times smaller than that of the old code. In particular, for large d and ν ≥ 2, our construction yields a CSS code of parameters [[O(d^{ν-1}), Ω(d), d]] admitting a transversal gate at the ν-th level of the Clifford hierarchy. For our construction we introduce a conversion from magic state distillation protocols based on Clifford measurements to those based on codes with transversal T gates. Our tower contains, as a subclass, generalized triply even CSS codes that have appeared in so-called gauge fixing or code switching methods.

  10. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. Also we use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.

  11. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation.

  12. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    International Nuclear Information System (INIS)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation

  13. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.
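
    The abstract lists the ingredients of the unified objective without giving its form. Purely as a reading aid, in our notation and not necessarily the paper's actual formulation, such a joint objective can be sketched as

      \min_{D, S, W, f}  ||X - D S||_F^2
                         + \alpha \sum_i ||s_i||_1
                         + \beta \sum_i \sum_{j \in N(i)} ( f_j - w_i^T s_j )^2
                         + \gamma \sum_{i \in Q} ( f_i - y_i )^2 ,

    where the s_i are the sparse codes, D the dictionary, f_i the ranking scores, w_i the local linear maps over neighbourhoods N(i), and Q the query items with user-supplied relevance y_i; alternating updates of D, S, W and f would then correspond to the iterative algorithm mentioned at the end of the abstract. All symbols here are illustrative assumptions.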

  14. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  15. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    Science.gov (United States)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture, just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that allows to deviate the direction of an X-ray beam, which can considerably increase the implementation costs. Hence, this paper describes a low cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded apertures approximation decreases at most 12.5% compared with the phase coded apertures. Moreover, the quality of the reconstructions using the boolean approximations is up to 2.5 dB of PSNR less with respect to the phase coded aperture reconstructions.
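
    A minimal numpy sketch of what a coded diffraction measurement with a block-unblock (binary) aperture looks like; the object, mask and sizes are assumptions, and neither the phase-recovery step nor the detour-phase approximation studied in the paper is implemented here.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical complex-valued object (e.g., a density slice), 64 x 64.
      x = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))

      # Block-unblock coded aperture: a binary mask placed just after the sample.
      mask = rng.integers(0, 2, size=(64, 64))

      # Coded diffraction pattern: intensity of the Fourier transform of the masked object.
      y = np.abs(np.fft.fft2(mask * x)) ** 2

      print("number of intensity measurements:", y.size)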

  16. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  17. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each state of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  18. Under-coding of secondary conditions in coded hospital health data: Impact of co-existing conditions, death status and number of codes in a record.

    Science.gov (United States)

    Peng, Mingkai; Southern, Danielle A; Williamson, Tyler; Quan, Hude

    2017-12-01

    This study examined the coding validity of hypertension, diabetes, obesity and depression in relation to the presence of their co-existing conditions, death status and the number of diagnosis codes in a hospital discharge abstract database. We randomly selected 4007 discharge abstract database records from four teaching hospitals in Alberta, Canada and reviewed their charts to extract 31 conditions listed in the Charlson and Elixhauser comorbidity indices. Conditions associated with the four study conditions were identified through multivariable logistic regression. Coding validity (i.e. sensitivity, positive predictive value) of the four conditions was related to the presence of their associated conditions. Sensitivity increased with an increasing number of diagnosis codes. The impact of death status on coding validity was minimal. Coding validity of conditions is closely related to their clinical importance and the complexity of patients' case mix. We recommend mandatory coding of certain secondary diagnoses to meet the needs of health research based on administrative health data.

  19. Governance codes: facts or fictions? A study of governance codes in Colombia

    Directory of Open Access Journals (Sweden)

    Julián Benavides Franco

    2010-10-01

    Full Text Available This article studies the effects on accounting performance and financing decisions of Colombian firms after issuing a corporate governance code. We assemble a database of Colombian issuers and test the hypotheses of improved performance and higher leverage after issuing a code. The results show that the firms' return on assets after the code introduction improves by more than 1%; the effect is amplified by the code quality. Additionally, the firms' leverage increased by more than 5% when the code quality was factored into the analysis. These results suggest that controlling parties' commitment to self-restraint, by reducing their private benefits and/or the expropriation of non-controlling parties through the code introduction, is indeed an effective measure, and that the financial markets agree, increasing the supply of funds to the firms.

  20. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like any other technique, needs standards. These standards are widely used and the methods for applying them are well established; radiographic testing is therefore only practical when it is based on regulations that are stated and documented. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiography work based on instructions given by a level-two or level-three radiographer. These instructions are produced from the guidelines set out in the documents, and the level-two radiographer must follow the specifications given in the standard when writing them. This makes clear that radiography is a type of work in which everything must follow the rules. As for codes, radiography follows the codes of the American Society of Mechanical Engineers (ASME); the only code currently in force in Malaysia is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography work automatically has to follow the regulated rules and standards.

  1. Purifying selection acts on coding and non-coding sequences of paralogous genes in Arabidopsis thaliana.

    Science.gov (United States)

    Hoffmann, Robert D; Palmgren, Michael

    2016-06-13

    Whole-genome duplications in the ancestors of many diverse species provided the genetic material for evolutionary novelty. Several models explain the retention of paralogous genes. However, how these models are reflected in the evolution of coding and non-coding sequences of paralogous genes is unknown. Here, we analyzed the coding and non-coding sequences of paralogous genes in Arabidopsis thaliana and compared these sequences with those of orthologous genes in Arabidopsis lyrata. Paralogs with lower expression than their duplicate had more nonsynonymous substitutions, were more likely to fractionate, and exhibited less similar expression patterns with their orthologs in the other species. Also, lower-expressed genes had greater tissue specificity. Orthologous conserved non-coding sequences in the promoters, introns, and 3' untranslated regions were less abundant at lower-expressed genes compared to their higher-expressed paralogs. A gene ontology (GO) term enrichment analysis showed that paralogs with similar expression levels were enriched in GO terms related to ribosomes, whereas paralogs with different expression levels were enriched in terms associated with stress responses. Loss of conserved non-coding sequences in one gene of a paralogous gene pair correlates with reduced expression levels that are more tissue specific. Together with increased mutation rates in the coding sequences, this suggests that similar forces of purifying selection act on coding and non-coding sequences. We propose that coding and non-coding sequences evolve concurrently following gene duplication.

  2. The potential impact of multidimesional geriatric assessment in the social security system.

    Science.gov (United States)

    Corbi, Graziamaria; Ambrosino, Immacolata; Massari, Marco; De Lucia, Onofrio; Simplicio, Sirio; Dragone, Michele; Paolisso, Giuseppe; Piccioni, Massimo; Ferrara, Nicola; Campobasso, Carlo Pietro

    2018-01-12

    To evaluate the efficacy of multidimensional geriatric assessment (MGA/CGA) in patients over 65 years old in predicting the release of the accompaniment allowance (AA) indemnity by a Local Medico-Legal Committee (MLC-NHS) and by the National Institute of Social Security Committee (MLC-INPS). In a longitudinal observational study, 200 elderly Italian citizens requesting AA were first evaluated by MLC-NHS and later by MLC-INPS. Only MLC-INPS performed an MGA/CGA (including SPMSQ, Barthel Index, GDS-SF, and CIRS). This report was written according to the STROBE guidelines. The data analysis was performed in January 2016. The evaluations by MLC-NHS and MLC-INPS were in agreement in 66% of cases. In 28% of cases, the AA benefit was recognized by MLC-NHS but not by MLC-INPS. In the multivariate analysis, the best predictors of AA release by MLC-NHS were gender and the Barthel Index score. The presence of carcinoma, the Barthel Index score, and the SPMSQ score were the best predictors of AA release by MLC-INPS. MGA/CGA could be useful in saving financial resources by reducing the risk of incorrect indemnity release, and it can improve the accuracy of impairment assessment in the social security system.

  3. Convolutional coding techniques for data protection

    Science.gov (United States)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  4. 21 CFR 106.90 - Coding.

    Science.gov (United States)

    2010-04-01

    21 CFR 106.90 — Coding (Food and Drugs; Infant Formulas). The manufacturer shall code all infant formulas in conformity with the coding requirements that are applicable to thermally processed low-acid foods packaged in...

  5. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the
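
    CiderF's own equations and parameters are Hanford/Mayak-specific and are not reproduced in this record. The generic form of the last step described above (matching media concentrations to lifestyle and consumption data) is an ingestion-dose sum, sketched below in Python with entirely hypothetical concentrations and intakes; the dose coefficient is only an order-of-magnitude adult value for 131I, not a CiderF parameter.

      # Generic ingestion dose: D = sum over foods of (concentration x intake) x time x dose coefficient.
      # All values below are placeholders, not Hanford or Mayak data.
      foods = {
          # food: (131I concentration in Bq/L or Bq/kg, daily intake in L or kg) -- hypothetical
          "milk":      (5.0, 0.5),
          "leafy_veg": (2.0, 0.1),
      }
      dose_coeff_sv_per_bq = 2.2e-8    # ingestion dose coefficient for 131I (approximate adult value)
      days = 30                        # exposure period

      dose_sv = sum(conc * intake for conc, intake in foods.values()) * days * dose_coeff_sv_per_bq
      print(f"committed dose ~ {dose_sv * 1e6:.1f} microsievert")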

  6. Coding In-depth Semistructured Interviews

    DEFF Research Database (Denmark)

    Campbell, John L.; Quincy, Charles; Osserman, Jordan

    2013-01-01

    Many social science studies are based on coded in-depth semistructured interview transcripts. But researchers rarely report or discuss coding reliability in this work. Nor is there much literature on the subject for this type of data. This article presents a procedure for developing coding schemes … useful for situations where a single knowledgeable coder will code all the transcripts once the coding scheme has been established. This approach can also be used with other types of qualitative data and in other circumstances.

  7. Polynomial weights and code constructions

    DEFF Research Database (Denmark)

    Massey, J; Costello, D; Justesen, Jørn

    1973-01-01

    For any nonzero element c of a general finite field GF(q), it is shown that the polynomials (x - c)^i, i = 0, 1, 2, ..., have the "weight-retaining" property that any linear combination of these polynomials with coefficients in GF(q) has Hamming weight at least as great as that of the minimum degree polynomial included. This fundamental property is then used as the key to a variety of code constructions including 1) a simplified derivation of the binary Reed-Muller codes and, for any prime p greater than 2, a new extensive class of p-ary "Reed-Muller codes," 2) a new class of "repeated-root" cyclic codes, … of long constraint length binary convolutional codes derived from 2^r-ary Reed-Solomon codes, and 6) a new class of q-ary "repeated-root" constacyclic codes with an algebraic decoding algorithm.
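
    A small worked check of the weight-retaining property (our example, not the paper's): over GF(2) with c = 1, the combination (x+1)^3 + (x+1) = (x^3 + x^2 + x + 1) + (x + 1) = x^3 + x^2 has Hamming weight 2, which is indeed no smaller than the weight of the minimum-degree polynomial included, (x+1), whose weight is also 2.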

  8. QUIC: a chemical kinetics code for use with the chemical equilibrium code QUIL

    International Nuclear Information System (INIS)

    Lunsford, J.L.

    1977-10-01

    A chemical rate kinetics code QUIC is described, along with a support code RATE. QUIC is designed to allow chemical kinetics calculations on a wide variety of chemical environments while operating in the overlay environment of the chemical equilibrium code QUIL. QUIC depends upon a rate-data library called LIBR. This library is maintained by RATE. RATE enters into the library all reactions in a standardized format. The code QUIC, operating in conjunction with QUIL, is interactive and written to be used from a remote terminal, with paging control provided. Plotted output is also available

  9. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  10. Evidence that multiple genetic variants of MC4R play a functional role in the regulation of energy expenditure and appetite in Hispanic children

    Science.gov (United States)

    Cole, Shelley A; Voruganti, V Saroja; Cai, Guowen; Haack, Karin; Kent, Jack W; Blangero, John; Comuzzie, Anthony G; McPherson, John D; Gibbs, Richard A

    2010-01-01

    Background: Melanocortin-4-receptor (MC4R) haploinsufficiency is the most common form of monogenic obesity; however, the frequency of MC4R variants and their functional effects in general populations remain uncertain. Objective: The aim was to identify and characterize the effects of MC4R variants in Hispanic children. Design: MC4R was resequenced in 376 parents, and the identified single nucleotide polymorphisms (SNPs) were genotyped in 613 parents and 1016 children from the Viva la Familia cohort. Measured genotype analysis (MGA) tested associations between SNPs and phenotypes. Bayesian quantitative trait nucleotide (BQTN) analysis was used to infer the most likely functional polymorphisms influencing obesity-related traits. Results: Seven rare SNPs in coding and 18 SNPs in flanking regions of MC4R were identified. MGA showed suggestive associations between MC4R variants and body size, adiposity, glucose, insulin, leptin, ghrelin, energy expenditure, physical activity, and food intake. BQTN analysis identified SNP 1704 in a predicted micro-RNA target sequence in the downstream flanking region of MC4R as a strong, probable functional variant influencing total, sedentary, and moderate activities with posterior probabilities of 1.0. SNP 2132 was identified as a variant with a high probability (1.0) of exerting a functional effect on total energy expenditure and sleeping metabolic rate. SNP rs34114122 was selected as having likely functional effects on the appetite hormone ghrelin, with a posterior probability of 0.81. Conclusion: This comprehensive investigation provides strong evidence that MC4R genetic variants are likely to play a functional role in the regulation of weight, not only through energy intake but through energy expenditure. PMID:19889825

  11. Direct-semidirect (DSD) codes

    International Nuclear Information System (INIS)

    Cvelbar, F.

    1999-01-01

    Recent codes for direct-semidirect (DSD) model calculations in the form of answers to a detailed questionnaire are reviewed. These codes include those embodying the classical DSD approach covering only the transitions to the bound states (RAF, HIKARI, and those of the Bologna group), as well as the code CUPIDO++ that also treats transitions to unbound states. (author)

  12. Polar Coding for the Large Hadron Collider: Challenges in Code Concatenation

    CERN Document Server

    AUTHOR|(CDS)2238544; Podzorny, Tomasz; Uythoven, Jan

    2018-01-01

    In this work, we present a concatenated repetition-polar coding scheme that is aimed at applications requiring highly unbalanced unequal bit-error protection, such as the Beam Interlock System of the Large Hadron Collider at CERN. Even though this concatenation scheme is simple, it reveals significant challenges that may be encountered when designing a concatenated scheme that uses a polar code as an inner code, such as error correlation and unusual decision log-likelihood ratio distributions. We explain and analyze these challenges and we propose two ways to overcome them.

  13. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

    Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. The analytic and empirical results indicate that in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform the convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a low number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with the blocklength for a fixed number of states in the trellis.

  14. Using fluorescent dissolved organic matter to trace and distinguish the origin of Arctic surface waters

    DEFF Research Database (Denmark)

    Goncalves-Araujo, Rafael; Granskog, Mats A.; Bracher, Astrid

    2016-01-01

    were performed in the Fram and Davis Straits, and on the east Greenland Shelf (EGS), in late summer 2012/2013. Meteoric (f(mw)), sea-ice melt, Atlantic and Pacific water fractions were determined and the fluorescence properties of dissolved organic matter (FDOM) were characterized. In Fram Strait and EGS, a robust correlation between visible wavelength fluorescence and f(mw) was apparent, suggesting it as a reliable tracer of polar waters. However, a pattern was observed which linked the organic matter characteristics to the origin of polar waters. At depth in Davis Strait, visible wavelength FDOM

  15. ”Varken E=mc2 eller Det förlorade paradiset rafsades ihop av en festprisse” : En kvalitativ studie om introvert beteende i skolan

    OpenAIRE

    Larsson, Sofie; Nordqvist, Micaela

    2015-01-01

    This study aims to bring the concept of introversion into the school context by examining how introverted behaviours can affect pupils' education. We also highlight teachers' attribution of introverted behaviour. We start from Jung's and H. J. Eysenck's definitions of the concept of introversion, and draw on various behavioural theories and perspectives on introverted behaviour in order to identify and analyse our results. The method of the study is empirical data collection in the form of observations and interview...

  16. Construction and decoding of matrix-product codes from nested codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Lally, Kristine; Ruano, Diego

    2009-01-01

    We consider matrix-product codes [C1 ... Cs] · A, where C1, ..., Cs  are nested linear codes and matrix A has full rank. We compute their minimum distance and provide a decoding algorithm when A is a non-singular by columns matrix. The decoding algorithm decodes up to half of the minimum distance....
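
    As a hedged, minimal illustration of the construction (not the paper's decoder), the Python sketch below builds [C1 C2]·A for s = 2 with A = [[1, 1], [0, 1]], using the nested binary codes C2 ⊂ C1 of length 4; each codeword is the pair of blocks (c1, c1 + c2), the classical (u, u+v) construction, and the result has parameters [8, 4, 4]. The generator matrices are our choices, not taken from the paper.

      import numpy as np
      from itertools import product

      def span(G):
          """All GF(2) codewords generated by the rows of G."""
          k = G.shape[0]
          return [(np.array(u) @ G) % 2 for u in product([0, 1], repeat=k)]

      # Nested binary codes of length 4: C2 (repetition [4,1,4]) inside C1 (even-weight [4,3,2]).
      G1 = np.array([[1, 1, 0, 0],
                     [0, 1, 1, 0],
                     [0, 0, 1, 1]])
      G2 = np.array([[1, 1, 1, 1]])
      C1, C2 = span(G1), span(G2)

      # Matrix-product code [C1 C2] . A with A = [[1, 1], [0, 1]]:
      # each codeword is the concatenation of the blocks (c1, c1 + c2).
      mp_code = [np.concatenate([c1, (c1 + c2) % 2]) for c1 in C1 for c2 in C2]

      nonzero_weights = sorted(int(w.sum()) for w in mp_code if w.any())
      print("length 8, dimension 4, minimum distance", nonzero_weights[0])   # expect min(2*2, 4) = 4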

  17. Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)

    Science.gov (United States)

    Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.

    2013-10-01

    Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.

  18. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how … of reliability based code calibration of LRFD based design codes.

  19. The reason for having a code of pharmaceutical ethics: Spanish Pharmacists Code of Ethics

    Directory of Open Access Journals (Sweden)

    Ana Mulet Alberola

    2017-05-01

    Full Text Available The pharmacist profession needs its own code of conduct set out in writing to serve as a stimulus to pharmacists in their day-to-day work in the different areas of pharmacy, in conjunction always with each individual pharmacist's personal commitment to their patients, to other healthcare professionals and to society. An overview is provided of the different codes of ethics for pharmacists on the national and international scale, the most up-to-date code for 2015 being presented as a set of principles which must guide pharmaceutical conduct from the standpoint of deliberative judgment. The difference between codes of ethics and codes of practice is discussed. In the era of massive-scale collaboration, this code is a project holding bright prospects for the future. Each individual pharmacist's attitude in practicing their profession must be identified with the pursuit of excellence in their own personal practice for the purpose of achieving the ethical and professional values above and beyond complying with regulations and codes of practice.

  20. Usage of burnt fuel isotopic compositions from engineering codes in Monte-Carlo code calculations

    International Nuclear Information System (INIS)

    Aleshin, Sergey S.; Gorodkov, Sergey S.; Shcherenko, Anna I.

    2015-01-01

    A burn-up calculation of VVER cores by a Monte-Carlo code is a complex process that requires large computational costs. This fact makes the use of Monte-Carlo codes complicated for design and operating calculations. It is proposed to use previously prepared isotopic compositions for Monte-Carlo code (MCU) calculations of different states of a VVER core with burnt fuel. The isotopic compositions are calculated by an approximation method based on the use of a spectral functionality and reference isotopic compositions that are calculated by engineering codes (TVS-M, PERMAK-A). In this work, the multiplication factors and power distributions of a FA and a VVER with infinite height are calculated by the Monte-Carlo code MCU using the previously prepared isotopic compositions. The MCU results were compared with the data obtained by the engineering codes.

  1. An analytical demonstration of coupling schemes between magnetohydrodynamic codes and eddy current codes

    International Nuclear Information System (INIS)

    Liu Yueqiang; Albanese, R.; Rubinacci, G.; Portone, A.; Villone, F.

    2008-01-01

    In order to model a magnetohydrodynamic (MHD) instability that strongly couples to external conducting structures (walls and/or coils) in a fusion device, it is often necessary to combine a MHD code solving for the plasma response, with an eddy current code computing the fields and currents of conductors. We present a rigorous proof of the coupling schemes between these two types of codes. One of the coupling schemes has been introduced and implemented in the CARMA code [R. Albanese, Y. Q. Liu, A. Portone, G. Rubinacci, and F. Villone, IEEE Trans. Magn. 44, 1654 (2008); A. Portone, F. Villone, Y. Q. Liu, R. Albanese, and G. Rubinacci, Plasma Phys. Controlled Fusion 50, 085004 (2008)] that couples the MHD code MARS-F[Y. Q. Liu, A. Bondeson, C. M. Fransson, B. Lennartson, and C. Breitholtz, Phys. Plasmas 7, 3681 (2000)] and the eddy current code CARIDDI[R. Albanese and G. Rubinacci, Adv. Imaging Electron Phys. 102, 1 (1998)]. While the coupling schemes are described for a general toroidal geometry, we give the analytical proof for a cylindrical plasma.

  2. Preliminary investigation study of code of developed country for developing Korean fuel cycle code

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Ko, Won Il; Lee, Ho Hee; Cho, Dong Keun; Park, Chang Je

    2012-01-01

    In order to develop a Korean fuel cycle code, analyses were performed with fuel cycle codes used in advanced countries, and recommendations were proposed for future development. The fuel cycle codes examined are as follows: VISTA, developed by the IAEA; DANESS, developed by ANL and LISTO; and VISION, developed by INL for the Advanced Fuel Cycle Initiative (AFCI) system analysis. Recommendations were proposed for the software, program scheme, material flow model, isotope decay model, environmental impact analysis model, and economics analysis model. The items described will be used for the development of a Korean nuclear fuel cycle code in the future.

  3. The Coding Process and Its Challenges

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2010-02-01

    Full Text Available Coding is the core process in classic grounded theory methodology. It is through coding that the conceptual abstraction of data and its reintegration as theory takes place. There are two types of coding in a classic grounded theory study: substantive coding, which includes both open and selective coding procedures, and theoretical coding. In substantive coding, the researcher works with the data directly, fracturing and analysing it, initially through open coding for the emergence of a core category and related concepts and then subsequently through theoretical sampling and selective coding of data to theoretically saturate the core and related concepts. Theoretical saturation is achieved through constant comparison of incidents (indicators in the data to elicit the properties and dimensions of each category (code. This constant comparing of incidents continues until the process yields the interchangeability of indicators, meaning that no new properties or dimensions are emerging from continued coding and comparison. At this point, the concepts have achieved theoretical saturation and the theorist shifts attention to exploring the emergent fit of potential theoretical codes that enable the conceptual integration of the core and related concepts to produce hypotheses that account for relationships between the concepts thereby explaining the latent pattern of social behaviour that forms the basis of the emergent theory. The coding of data in grounded theory occurs in conjunction with analysis through a process of conceptual memoing, capturing the theorist’s ideation of the emerging theory. Memoing occurs initially at the substantive coding level and proceeds to higher levels of conceptual abstraction as coding proceeds to theoretical saturation and the theorist begins to explore conceptual reintegration through theoretical coding.

  4. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  5. The Code of Ethics and Editorial Code of Practice of the Royal Astronomical Society

    Science.gov (United States)

    Murdin, Paul

    2013-01-01

    Whilst the Royal Astronomical Society has got by for more than 100 years without a written code of ethics, modern standards of governance suggested that such a code could be useful in the resolution of disputes. In 2005, the RAS adopted the Universal Code of Ethics for Science that had been formulated by the Royal Society of London. At the same time and for similar reasons the RAS adopted an Editorial Code of Practice.

  6. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  7. Generalized optical code construction for enhanced and Modified Double Weight like codes without mapping for SAC-OCDMA systems

    Science.gov (United States)

    Kumawat, Soma; Ravi Kumar, M.

    2016-07-01

    The Double Weight (DW) code family is one of the coding schemes proposed for Spectral Amplitude Coding-Optical Code Division Multiple Access (SAC-OCDMA) systems. Modified Double Weight (MDW) codes for even weights and Enhanced Double Weight (EDW) codes for odd weights are two algorithms extending the use of the DW code for SAC-OCDMA systems. The above-mentioned codes use a mapping technique to provide codes for a higher number of users. A new generalized algorithm to construct EDW- and MDW-like codes without mapping for any weight greater than 2 is proposed. A single code construction algorithm gives the same length increment, Bit Error Rate (BER) calculation and other properties for all weights greater than 2. The algorithm first constructs a generalized basic matrix which is repeated in a different way to produce the codes for all users (different from mapping). The generalized code is analysed for BER using balanced detection and direct detection techniques.

  8. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ...

  9. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    International Nuclear Information System (INIS)

    Mori, Takamasa; Nakagawa, Masayuki; Kaneko, Kunio.

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author)

  10. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design as well as ease of modifying the number of users are equally exceptional qualities presented by the code, in contrast to the Optical Orthogonal Codes (OOC) implemented earlier for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  11. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Takamasa; Nakagawa, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author).

  12. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  13. Radioactive action code

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    A new coding system, 'Hazrad', for buildings and transportation containers for alerting emergency services personnel to the presence of radioactive materials has been developed in the United Kingdom. The hazards of materials in the buildings or transport container, together with the recommended emergency action, are represented by a number of codes which are marked on the building or container and interpreted from a chart carried as a pocket-size guide. Buildings would be marked with the familiar yellow 'radioactive' trefoil, the written information 'Radioactive materials' and a list of isotopes. Under this the 'Hazrad' code would be written - three symbols to denote the relative radioactive risk (low, medium or high), the biological risk (also low, medium or high) and the third showing the type of radiation emitted, alpha, beta or gamma. The response cards indicate appropriate measures to take, eg for a high biological risk, Bio3, the wearing of a gas-tight protection suit is advised. The code and its uses are explained. (U.K.)

  14. Software information sorting code 'PLUTO-R'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Naraoka, Kenitsu; Adachi, Masao; Takeda, Tatsuoki

    1984-10-01

    A software information sorting code, PLUTO-R, has been developed as one of the supporting codes of the TRITON system for fusion plasma analysis. The objective of the PLUTO-R code is to sort the reference materials of the codes in the TRITON code system. Ease of registration of information is especially pursued. As experience and skill in data registration are not required, the code is also usable for the construction of general small-scale information systems. This report gives an overall description and the user's manual of the PLUTO-R code. (author)

  15. Quantum computing with Majorana fermion codes

    Science.gov (United States)

    Litinski, Daniel; von Oppen, Felix

    2018-05-01

    We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.

  16. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/
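    By way of illustration (a minimal sketch, not GCAT itself), one of the simplest transformations such a toolkit applies to a codon set is taking reverse complements, which in turn allows a self-complementarity test; the codon set used below is hypothetical.

```python
# Minimal sketch (not GCAT) of a codon-set transformation and test:
# the complementary code is obtained by reverse-complementing every codon,
# and a code is self-complementary if it equals its complementary code.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(codon):
    return "".join(COMPLEMENT[base] for base in reversed(codon))

def complementary_code(code):
    return {reverse_complement(codon) for codon in code}

def is_self_complementary(code):
    return complementary_code(code) == code

X = {"AAC", "GTT", "GAG", "CTC"}       # hypothetical codon set
print(complementary_code(X))           # {'GTT', 'AAC', 'CTC', 'GAG'}
print(is_self_complementary(X))        # True
```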

  17. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...

  18. Stability analysis by ERATO code

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Matsuura, Toshihiko; Azumi, Masafumi; Kurita, Gen-ichi

    1979-12-01

    Problems in MHD stability calculations with the ERATO code are described, concerning the convergence property of the results, the equilibrium codes, and machine optimization of the ERATO code. It is concluded that irregularity on a convergence curve is not due to a fault of the ERATO code itself but to an inappropriate choice of the equilibrium calculation meshes. Also described are a code to calculate an equilibrium as a quasi-inverse problem and a code to calculate an equilibrium as a result of a transport process. Optimization of the code with respect to I/O operations reduced both CPU time and I/O time considerably. With the FACOM230-75 APU/CPU multiprocessor system, the performance is about 6 times as high as with the FACOM230-75 CPU, showing the effectiveness of a vector processing computer for this kind of MHD computation. This report is a summary of the material presented at the ERATO workshop 1979 (ORNL), supplemented with some details. (author)

  19. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the Mac- Williams' identities based on the probability of undetected error, and two important tools for algebraic decoding-namely, the finite field Fourier transform and the Euclidean algorithm f...

  20. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown
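    Generating a basic code programmatically takes only a couple of lines; the sketch below is not from the book and assumes the third-party Python qrcode package (with Pillow) is installed.

```python
# Minimal sketch of QR code generation, assuming the third-party "qrcode"
# package with Pillow support is installed:  pip install qrcode[pil]
import qrcode

# Encode a (hypothetical) campaign URL and save the symbol as a PNG image.
img = qrcode.make("https://example.com/campaign")
img.save("campaign_qr.png")
```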

  1. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for the simulation of beam-based feedback in high-performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations for the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on a timescale corresponding to 5-100 Hz and slower feedbacks, operating in the 0.1-1 Hz range, in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format), and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  2. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

    The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: A discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  3. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

    Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist the interpretation of bioassay data, provide bioassay projections and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ dose and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system and consequently its operation requires a relatively long procedure involving a lot of manual typing, which can lead to human error. A new code has been developed at the NRCN, the CINEX (Code for Internal Exposure), which is an Excel application and leads to a significant reduction in calculation time (on the order of 5-10 times) and in the risk of human error. The code uses a database containing tables which were constructed with CINDY and contain the bioassay values predicted by the ICRP 30 model after the intake of a unit of activity of each isotope. Using the database, the code then calculates the appropriate intake and consequently the committed effective dose and organ dose. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and the CINDY codes (for class Y uranium). The CINEX code is now used at the NRCN to calculate occupational intakes and doses for workers working with radioactive materials

  4. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  5. Quick response codes in Orthodontics

    Directory of Open Access Journals (Sweden)

    Moidin Shakil

    2015-01-01

    Full Text Available Quick response (QR) codes are two-dimensional barcodes which encode a large amount of information. QR codes in Orthodontics are an innovative approach in which patient details, radiographic interpretation, and treatment plan can be encoded. Implementing QR codes in Orthodontics will save time, reduce paperwork, and minimize manual effort in the storage and retrieval of patient information during subsequent stages of treatment.

  6. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure- and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as a multiple-hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-squares sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
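    The basic mechanism of a quantized frame expansion with a pseudoinverse receiver can be sketched numerically as follows (a toy example with a random oversampled frame standing in for the OFB, and without the error-localization stage described in the paper).

```python
# Tiny numerical sketch of a quantized frame expansion with a pseudoinverse
# receiver (NumPy).  A random oversampled frame stands in for the OFB of the
# paper, and the hypothesis-testing error-localization stage is omitted.
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 8                       # signal length n, frame size m > n (oversampled)
F = rng.standard_normal((m, n))   # analysis frame (rows are frame vectors)

x = rng.standard_normal(n)        # source block
y = F @ x                         # frame expansion (redundant representation)
step = 0.25
y_q = step * np.round(y / step)   # uniform scalar quantization of the coefficients

x_hat = np.linalg.pinv(F) @ y_q   # pseudoinverse (least-squares) reconstruction
print("reconstruction error:", np.linalg.norm(x - x_hat))

# Redundancy also allows reconstruction when some coefficients are erased:
keep = np.r_[0:3, 4:8]            # suppose coefficient y[3] is lost in the channel
x_er = np.linalg.pinv(F[keep]) @ y_q[keep]
print("error with one erasure: ", np.linalg.norm(x - x_er))
```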

  7. Energy Code Enforcement Training Manual : Covering the Washington State Energy Code and the Ventilation and Indoor Air Quality Code.

    Energy Technology Data Exchange (ETDEWEB)

    Washington State Energy Code Program

    1992-05-01

    This manual is designed to provide building department personnel with specific inspection and plan review skills and information on provisions of the 1991 edition of the Washington State Energy Code (WSEC). It also provides information on provisions of the new stand-alone Ventilation and Indoor Air Quality (VIAQ) Code. The intent of the WSEC is to reduce the amount of energy used by requiring energy-efficient construction. Such conservation reduces energy requirements and, as a result, reduces the use of finite resources, such as gas or oil. Lowering energy demand helps everyone by keeping electricity costs down. (It is less expensive to use existing electrical capacity efficiently than it is to develop new and additional capacity needed to heat or cool inefficient buildings.) The new VIAQ Code (effective July 1991) is a natural companion to the energy code. Whether energy-efficient or not, all homes have potential indoor air quality problems. Studies have shown that indoor air is often more polluted than outdoor air. The VIAQ Code provides a means of exchanging stale air for fresh, without compromising energy savings, by setting standards for a controlled ventilation system. It also offers requirements meant to prevent indoor air pollution from building products or radon.

  8. Noncoherent Spectral Optical CDMA System Using 1D Active Weight Two-Code Keying Codes

    Directory of Open Access Journals (Sweden)

    Bih-Chyun Yeh

    2016-01-01

    Full Text Available We propose a new family of one-dimensional (1D) active weight two-code keying (TCK) codes for spectral amplitude coding (SAC) optical code division multiple access (OCDMA) networks. We use encoding and decoding transfer functions to operate the 1D active weight TCK. The proposed structure includes an optical line terminal (OLT) and optical network units (ONUs) to produce the encoding and decoding codes of the proposed OLT and ONUs, respectively. The proposed ONU uses the modified cross-correlation to remove interference from other simultaneous users, that is, the multiuser interference (MUI). When the phase-induced intensity noise (PIIN) is the dominant noise, the modified cross-correlation suppresses the PIIN. In the numerical results, we find that the bit error rate (BER) for the proposed system using the 1D active weight TCK codes outperforms that for two other systems using the 1D M-Seq codes and 1D balanced incomplete block design (BIBD) codes. The effective source power for the proposed system can reach −10 dBm, which is less power than that required by the other systems.

  9. The reason for having a code of pharmaceutical ethics: Spanish Pharmacists Code of Ethics.

    Science.gov (United States)

    Barreda Hernández, Dolores; Mulet Alberola, Ana; González Bermejo, Diana; Soler Company, Enrique

    2017-05-01

    The pharmacist profession needs its own code of conduct set out in writing to serve as a stimulus to pharmacists in their day-to-day work in the different areas of pharmacy, in conjunction always with each individual pharmacist's personal commitment to their patients, to other healthcare professionals and to society. An overview is provided of the different codes of ethics for pharmacists on the national and international scale, the most up-to-date code for 2015 being presented as a set of principles which must guide pharmaceutical conduct from the standpoint of deliberative judgment. The difference between codes of ethics and codes of practice is discussed. In the era of massive-scale collaboration, this code is a project holding bright prospects for the future. Each individual pharmacist's attitude in practicing their profession must be identified with the pursuit of excellence in their own personal practice for the purpose of achieving the ethical and professional values above and beyond complying with regulations and codes of practice. Copyright AULA MEDICA EDICIONES 2017. Published by AULA MEDICA. All rights reserved.

  10. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  11. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module-type plant system dynamics analysis code which includes a multi-dimensional thermal-hydraulics calculation module to analyze pool-type fast breeder reactors. Explanations of each module and the methods for preparing the input data are described in this code manual. (author)

  12. Explicit MDS Codes with Complementary Duals

    DEFF Research Database (Denmark)

    Beelen, Peter; Jin, Lingfei

    2018-01-01

    In 1964, Massey introduced a class of codes with complementary duals which are called Linear Complementary Dual (LCD for short) codes. He showed that LCD codes have applications in communication systems, side-channel attacks (SCA) and so on. LCD codes have been extensively studied in the literature....... On the other hand, MDS codes form an optimal family of classical codes which have wide applications in both theory and practice. The main purpose of this paper is to give an explicit construction of several classes of LCD MDS codes, using tools from algebraic function fields. We exemplify this construction...

  13. Airborne microbial emissions and immissions at aerobic mechanical-biological waste treatment plants; Luftgetragene mikrobielle Emissionen und Immissionen an aeroben mechanisch-biologischen Abfallbehandlungsanlagen

    Energy Technology Data Exchange (ETDEWEB)

    Luft, C.

    2002-07-01

    During biological waste treatment it is important to consider the hygienic situation. One has to take care that citizens in the neighborhood and especially the work force do not suffer impairments caused by microbial immissions. Therefore it is important to evaluate microbial emissions and immissions of composting plants. This dissertation addressed this topic. Microbial and endotoxin emissions of different biological waste treatment plants were measured with diverse sampling methods. The research was done on enclosed and open variants of plants. Measurements were taken from different composting techniques and also from a plant treating the rest fraction of household waste. Depending on the technique studied, different concentrations of airborne microbes were found. The size of the plant and degree of enclosure, as well as the material input, all affect the amount of airborne microbial emissions. At a small open composting plant (6 500 Mg/a) only low microbial concentrations could be found at the workplace, while at the totally enclosed plant (12 000 Mg/a) high concentrations of airborne microorganisms could be observed at the workplace. Seasonal differences in microbial concentrations could not be seen when considering the agitation of outdoor piles consisting of separated household waste. In contrast, measured concentrations of endotoxins at another composting plant showed seasonal differences. Using simulations based on the models of TA-Luft and VDI 3783 it could be calculated that emissions from enclosed plants with 12 000 Mg/a input and a biofilter have a minimal influence on the neighborhood of the composting plant. (orig.) [German original, translated; truncated] When handling biological wastes, the hygienic situation plays an important role. Particularly in the area of occupational safety, but also with regard to people living in the vicinity of waste treatment plants, care must be taken that no health impairments are caused by microbial immissions...

  14. Serial-data correlator/code translator

    Science.gov (United States)

    Morgan, L. E.

    1977-01-01

    System, consisting of sampling flip flop, memory (either RAM or ROM), and memory buffer, correlates sampled data with predetermined acceptance code patterns, translates acceptable code patterns to nonreturn-to-zero code, and identifies data dropouts.

  15. Circular codes revisited: a statistical approach.

    Science.gov (United States)

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and underwent a rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes with reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
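    The frame-synchronization property at stake can be illustrated with the simpler comma-free condition (a strictly stronger requirement than circularity): no codon of the set may be read across the boundary of two concatenated codons. The sketch below uses hypothetical codon sets; the full circularity test needed for the 216 self-complementary C(3) maximal codes is more involved.

```python
# Minimal sketch of the comma-free test for a trinucleotide code: no codon of
# the code may occur straddling the boundary of two concatenated codons.
from itertools import product

def is_comma_free(code):
    for x1, x2 in product(code, repeat=2):
        pair = x1 + x2
        # Shifted readings of the concatenation must not fall back into the code.
        if pair[1:4] in code or pair[2:5] in code:
            return False
    return True

# Toy examples (hypothetical codon sets, not codes from the literature)
print(is_comma_free({"AAC", "AGT"}))   # True: no shifted reading is in the set
print(is_comma_free({"AAC", "ACA"}))   # False: "AAC" + "AAC" reads "ACA" in a shifted frame
```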

  16. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  17. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
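    A stripped-down version of the syndrome approach (Hamming (7,4) binning with at most one bit of disagreement between source and side information, rather than the paper's LDPC codes, sum-product decoding, doping bits and hidden dependence variables) looks roughly like this.

```python
# Toy sketch of syndrome-based Slepian-Wolf coding with a Hamming (7,4) code.
# The decoder's side information differs from the source in at most one bit.
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the binary form of i+1.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(v):
    return H @ v % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])        # source block at the encoder
s = syndrome(x)                             # encoder sends only 3 syndrome bits, not 7

y = x.copy()
y[4] ^= 1                                   # side information: x with one bit flipped

# Decoder: the syndrome of the error pattern e = x XOR y equals H@y XOR s.
e_syn = (syndrome(y) + s) % 2
x_hat = y.copy()
if e_syn.any():
    pos = int("".join(map(str, e_syn)), 2) - 1   # syndrome value = 1-based error position
    x_hat[pos] ^= 1

print("recovered correctly:", np.array_equal(x_hat, x))   # True
```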

  18. Australasian code for reporting of mineral resources and ore reserves (the JORC code)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The latest revision of the Code first published in 1989 becomes effective in September 1999. It was prepared by the Joint Ores Reserves Committee of the Australasian Institute of Mining and Metallurgy, Australian Institute of Geoscientists and Minerals Council of Australia (JORC). It sets out minimum standards, recommendations and guidelines for public reporting of exploration results, mineral resources and ore reserves in Australasia. In this edition, the guidelines, which were previously separated from the Code, have been placed after the respective Code clauses. The Code is applicable to all solid minerals, including diamonds, other gemstones and coal for which public reporting is required by the Australian and New Zealand Stock Exchanges.

  19. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  20. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use with very little effort.
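    As a purely illustrative example (not taken from the talk), the sketch below shows one of the error-prone constructs that common Python linters such as flake8 or pylint flag, together with the cleaned-up form.

```python
# Hypothetical before/after of a construct most Python linters flag.

# Before: a mutable default argument, a classic error-prone pattern.
def add_item_bad(item, bucket=[]):   # the same list object is reused across calls
    bucket.append(item)
    return bucket

# After: use a None sentinel so each call gets a fresh list.
def add_item(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(add_item_bad(1), add_item_bad(2))   # [1, 2] [1, 2]  -- surprising shared state
print(add_item(1), add_item(2))           # [1] [2]
```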

  1. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M. Jr.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  2. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially-available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  3. Data exchange between zero dimensional code and physics platform in the CFETR integrated system code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Guoliang [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Shi, Nan [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Zhou, Yifu; Mao, Shifeng [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Jian, Xiang [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, School of Electrical and Electronics Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Jiale [Institute of Plasma Physics, Chinese Academy of Sciences, No. 350 Shushanhu Road, Hefei (China); Liu, Li; Chan, Vincent [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China); Ye, Minyou, E-mail: yemy@ustc.edu.cn [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei 230026 China (China)

    2016-11-01

    Highlights: • The workflow of the zero-dimensional code and the multi-dimensional physics platform of the CFETR integrated system code is introduced. • The iteration process among the codes in the physics platform is described. • The data transfer between the zero-dimensional code and the physics platform, including data iteration and validation, and justification of performance parameters, is presented. - Abstract: The China Fusion Engineering Test Reactor (CFETR) integrated system code contains three parts: a zero-dimensional code, a physics platform and an engineering platform. We use the zero-dimensional code to identify a set of preliminary physics and engineering parameters for CFETR, which is used as input to initiate multi-dimension studies using the physics and engineering platform for design, verification and validation. Effective data exchange between the zero-dimensional code and the physics platform is critical for the optimization of the CFETR design. For example, in evaluating the impact of impurity radiation on core performance, an open field line code is used to calculate the impurity transport from the first-wall boundary to the pedestal. The impurity particles in the pedestal are used as boundary conditions in a transport code for calculating impurity transport in the core plasma and the impact of core radiation on core performance. Comparison of the results from the multi-dimensional study to those from the zero-dimensional code is used to further refine the controlled radiation model. The data transfer between the zero-dimensional code and the physics platform, including data iteration and validation, and justification of performance parameters, will be presented in this paper.

  4. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it in a network context. The method consists of adapting the information rate for each receiving node according to its channel status and independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of a slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information-theoretic analysis of this approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.
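    The underlying network-coding gain can be seen in the classic two-way relay example, sketched below in a heavily simplified form (plain XOR of two equal-length packets; the rate-adaptive, nonbinary-LDPC machinery of the paper is not represented): the relay broadcasts one coded packet and each end node recovers the other's packet from its own.

```python
# Minimal sketch of the classic two-way relay network-coding example:
# the relay broadcasts one XOR-coded packet instead of forwarding two.

def xor_packets(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a = b"hello from node A!"
pkt_b = b"reply from node B "

# Both packets are assumed to have equal length (pad in practice).
assert len(pkt_a) == len(pkt_b)

coded = xor_packets(pkt_a, pkt_b)           # single broadcast transmission from the relay

# Each end node XORs the broadcast with the packet it already knows.
recovered_at_a = xor_packets(coded, pkt_a)  # node A recovers B's packet
recovered_at_b = xor_packets(coded, pkt_b)  # node B recovers A's packet
print(recovered_at_a == pkt_b, recovered_at_b == pkt_a)   # True True
```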

  5. Code of Practice containing definitions for Safety Codes of Practice for nuclear power plants

    International Nuclear Information System (INIS)

    1979-01-01

    This Code provides definitions of the technical terms used in the licensing applications to be submitted to the Turkish Atomic Energy Commission (TAEC), in accordance with national licensing regulations. The Code is based mainly on the International Atomic Energy Agency's Code of Practice on the subject. (NEA) [fr

  6. The Minimum Distance of Graph Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2011-01-01

    We study codes constructed from graphs where the code symbols are associated with the edges and the symbols connected to a given vertex are restricted to be codewords in a component code. In particular we treat such codes from bipartite expander graphs coming from Euclidean planes and other...... geometries. We give results on the minimum distances of the codes....

  7. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  8. Quasi-cyclic unit memory convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Paaske, Erik; Ballan, Mark

    1990-01-01

    Unit memory convolutional codes with generator matrices, which are composed of circulant submatrices, are introduced. This structure facilitates the analysis of efficient search for good codes. Equivalences among such codes and some of the basic structural properties are discussed. In particular......, catastrophic encoders and minimal encoders are characterized and dual codes treated. Further, various distance measures are discussed, and a number of good codes, some of which result from efficient computer search and some of which result from known block codes, are presented...
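    The circulant building block is simple to picture: each submatrix is determined by its first row, with every subsequent row a one-position cyclic shift of the one above. The sketch below assembles one block row of such a generator matrix from two hypothetical first rows.

```python
# Small sketch of the circulant submatrices that quasi-cyclic generator
# matrices are built from (binary entries, NumPy).
import numpy as np

def circulant(first_row):
    first_row = np.asarray(first_row)
    return np.array([np.roll(first_row, k) for k in range(len(first_row))])

# Hypothetical first rows; a quasi-cyclic generator matrix is assembled from
# such blocks placed side by side (and stacked, in the unit memory case).
A = circulant([1, 0, 1, 0, 0])
B = circulant([1, 1, 0, 0, 0])
G_block_row = np.hstack([A, B])   # one block row [A | B] of a quasi-cyclic generator matrix
print(G_block_row)
```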

  9. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  10. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  11. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    Science.gov (United States)

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  12. ESCADRE and ICARE code systems

    International Nuclear Information System (INIS)

    Reocreux, M.; Gauvain, J.

    1992-01-01

    The French severe accident code development program follows two parallel approaches: the first deals with "integral codes", which are designed to give immediate engineering answers; the second follows a more mechanistic approach in order to allow detailed analysis of experiments, a better understanding of the scaling problem, and greater confidence in plant calculations. In the first approach a complete system has been developed and is being used for practical cases: this is the ESCADRE system. In the second approach, a set of codes dealing first with the primary circuit is being developed: a mechanistic core degradation code, ICARE, has been issued and is being coupled with the advanced thermalhydraulic code CATHARE. Fission product codes have also been coupled to CATHARE. The "integral" ESCADRE system and the mechanistic ICARE and associated codes are described. Their main characteristics are reviewed and the status of their development and assessment given. Future studies are finally discussed. 36 refs, 4 figs, 1 tab

  13. Clean Code - Why you should care

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    - Martin Fowler Writing code is communication, not solely with the computer that executes it, but also with other developers and with oneself. A developer spends a lot of his working time reading and understanding code that was written by other developers or by himself in the past. The readability of the code plays an important role in the time it takes to find a bug or add new functionality, which in turn has a big impact on productivity. Code that is difficult to understand, hard to maintain and refactor, and that offers many spots for bugs to hide is not considered to be "clean code". But what can be considered "clean code", and what are the advantages of a strict application of its guidelines? In this presentation we will take a look at some typical "code smells" and proposed guidelines to improve your coding skills and write cleaner code that is less bug prone and easier to maintain.

  14. Burnup code for fuel assembly by Monte Carlo code. MKENO-BURN

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Suyama, Kenya; Masukawa, Fumihiro; Matsumoto, Kiyoshi; Kurosawa, Masayoshi; Kaneko, Toshiyuki.

    1996-12-01

    The evaluation of the neutron spectrum is particularly important for burnup calculations of heterogeneous geometries such as recent BWR fuel assemblies. MKENO-BURN is a multi-dimensional burnup code based on the three-dimensional Monte Carlo neutron transport code 'MULTI-KENO' and the burnup routine of the one-dimensional burnup code 'UNITBURN'. MKENO-BURN analyzes the burnup of arbitrary regions after evaluating the neutron spectrum and generating one-group cross sections in three-dimensional geometry with MULTI-KENO. It thus enables three-dimensional burnup calculations. This report consists of a general description of MKENO-BURN and its input data. (author)

  15. Burnup code for fuel assembly by Monte Carlo code. MKENO-BURN

    Energy Technology Data Exchange (ETDEWEB)

    Naito, Yoshitaka; Suyama, Kenya; Masukawa, Fumihiro; Matsumoto, Kiyoshi; Kurosawa, Masayoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Toshiyuki

    1996-12-01

    The evaluation of the neutron spectrum is particularly important for burnup calculations of heterogeneous geometries such as recent BWR fuel assemblies. MKENO-BURN is a multi-dimensional burnup code based on the three-dimensional Monte Carlo neutron transport code 'MULTI-KENO' and the burnup routine of the one-dimensional burnup code 'UNITBURN'. MKENO-BURN analyzes the burnup of arbitrary regions after evaluating the neutron spectrum and generating one-group cross sections in three-dimensional geometry with MULTI-KENO. It thus enables three-dimensional burnup calculations. This report consists of a general description of MKENO-BURN and its input data. (author)
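
    Purely as an illustration of the kind of one-group depletion step such a burnup routine performs once MULTI-KENO has produced region-wise one-group cross sections, the sketch below integrates a single-nuclide depletion equation over one step at constant flux. The cross section, flux, density and step length are hypothetical values of ours, and this is not the UNITBURN algorithm itself.

        import math

        # Hypothetical one-group data for one region; in MKENO-BURN these would come
        # from the MULTI-KENO spectrum / one-group cross-section step, not constants.
        sigma_a = 600.0e-24     # one-group absorption cross section (cm^2), assumed
        flux    = 3.0e13        # one-group flux (n/cm^2/s), assumed
        n0      = 5.0e20        # initial nuclide density (atoms/cm^3), assumed
        dt      = 30 * 86400    # one burnup step of 30 days (s)

        # Single-nuclide depletion dN/dt = -sigma_a * flux * N has a closed form
        # over a step with constant flux:
        n_end = n0 * math.exp(-sigma_a * flux * dt)
        print(f"density after step: {n_end:.3e} atoms/cm^3")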

  16. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languages ...

  17. An implicit Smooth Particle Hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Charles E. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-05-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian, meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix. It uses sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, which include a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. For the single jet of gas it has been demonstrated that the implicit code can do a problem in much less time than the explicit code. The problem was, however, very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
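
    The Jacobian-free Newton-Krylov pattern mentioned above can be sketched, for a much simpler problem, with SciPy's newton_krylov solver, which approximates the Jacobian action by numerical differencing. This is only a generic illustration under our own assumptions (a backward-Euler step for a toy stiff ODE), not the SPHINX implementation.

        import numpy as np
        from scipy.optimize import newton_krylov

        # Backward-Euler step for the toy stiff ODE du/dt = -u**3, solved with a
        # Jacobian-free Newton-Krylov method: the Jacobian action is approximated
        # by numerical differencing inside the Krylov solver.
        u_old = np.linspace(1.0, 2.0, 50)      # hypothetical previous-step solution
        dt = 0.1

        def residual(u_new):
            return u_new - u_old + dt * u_new**3

        u_new = newton_krylov(residual, u_old.copy(), f_tol=1e-10)
        print(np.max(np.abs(residual(u_new))))  # residual after the implicit solve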

  18. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  19. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  20. Polarization diversity scheme on spectral polarization coding optical code-division multiple-access network

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chang, Yao-Tang; Chen, Bo-Hau

    2010-12-01

    We present an experiment demonstrating a spectral-polarization coding optical code-division multiple-access system operated under nonideal state-of-polarization (SOP) matching conditions. In the proposed system, the encoding and double balanced-detection processes are implemented using a polarization-diversity scheme. Because of the quasiorthogonality of Hadamard codes combined with arrayed waveguide grating routers and a polarization beam splitter, the proposed codec pair can encode and decode multiple Hadamard code words while retaining the ability to cancel multiple-access interference. The experimental results demonstrate that when the system maintains an orthogonal SOP for each user, an effective reduction in the phase-induced intensity noise is obtained. The analytical SNR values are found to overstate the experimental results by around 2 dB when the received effective power is large; this is mainly limited by the insertion losses of components and a nonflattened optical light source. Furthermore, the matching conditions can be improved by decreasing nonideal influences.

  1. Inner surface modification of a tube by magnetic glow-arc plasma source ion implantation

    International Nuclear Information System (INIS)

    Zhang Guling; Chinese Academy of Sciences, Beijing; Wang Jiuli; Feng Wenran; Chen Guangliang; Gu Weichao; Niu Erwu; Fan Songhua; Liu Chizi; Yang Size; Wu Xingfang

    2006-01-01

    A new method named magnetic glow-arc plasma source ion implantation (MGA-PSII) is proposed for inner surface modification of tubes. In MGA-PSII, under the control of an axial magnetic field generated by an electric coil around the tube sample, glow-arc plasma moves spirally into the tube from its two ends. A negative voltage applied to the tube realizes the implantation of its inner surface. Titanium nitride (TiN) films are prepared on the inner surface of a stainless steel tube 90 mm in diameter and 600 mm in length. Hardness tests show that the hardness at the tube centre is up to 20 GPa. XRD, XPS and AES analyses demonstrate that good-quality TiN films can be achieved. (authors)

  2. Inner Surface Modification of a Tube by Magnetic Glow-Arc Plasma Source Ion Implantation

    Science.gov (United States)

    Zhang, Gu-Ling; Wang, Jiu-Li; Wu, Xing-Fang; Feng, Wen-Ran; Chen, Guang-Liang; Gu, Wei-Chao; Niu, Er-Wu; Fan, Song-Hua; Liu, Chi-Zi; Yang, Si-Ze

    2006-05-01

    A new method named magnetic glow-arc plasma source ion implantation (MGA-PSII) is proposed for inner surface modification of tubes. In MGA-PSII, under the control of an axial magnetic field generated by an electric coil around the tube sample, glow-arc plasma moves spirally into the tube from its two ends. A negative voltage applied to the tube realizes the implantation of its inner surface. Titanium nitride (TiN) films are prepared on the inner surface of a stainless steel tube 90 mm in diameter and 600 mm in length. Hardness tests show that the hardness at the tube centre is up to 20 GPa. XRD, XPS and AES analyses demonstrate that good-quality TiN films can be achieved.

  3. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that the authors will be a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  4. Tokamak plasma power balance calculation code (TPC code) outline and operation manual

    International Nuclear Information System (INIS)

    Fujieda, Hirobumi; Murakami, Yoshiki; Sugihara, Masayoshi.

    1992-11-01

    This report is a detailed description of the TPC code, which calculates the power balance of a tokamak plasma according to the ITER guidelines. The TPC code works on a personal computer (Macintosh or J-3100/IBM-PC). Using input data such as the plasma shape, toroidal magnetic field, plasma current, electron temperature, electron density, impurities and heating power, the TPC code can determine the operation point of the fusion reactor (the ion temperature is assumed to be equal to the electron temperature). The supplied flux (Volt · sec) and burn time are also estimated from coil design parameters. The calculated energy confinement time is compared with various L-mode scaling laws and the confinement enhancement factor (H-factor) is evaluated. The divertor heat load is predicted using simple scaling models (constant-χ, Bohm-type-χ and JT-60U empirical scaling models). Frequently used data can be stored in a 'device file' and used as default values. The TPC code can generate 2-D mesh data, and the POPCON plot is drawn by a contour line plotting program (CONPLT). The operation manual for the CONPLT code is also included. (author)
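
    As an illustration of the H-factor evaluation mentioned above, the sketch below computes the ITER89-P L-mode confinement time, one commonly quoted scaling, and divides a power-balance confinement time by it. The report does not specify which scaling laws TPC implements, so both the choice of scaling and all numerical values here are our own assumptions.

        def tau_iter89p(ip_ma, r_m, a_m, kappa, n20, bt_t, mass_amu, p_mw):
            # ITER89-P L-mode energy confinement time scaling, in seconds
            # (units: MA, m, m, -, 10^20 m^-3, T, amu, MW).
            return (0.048 * ip_ma**0.85 * r_m**1.2 * a_m**0.3 * kappa**0.5
                    * n20**0.1 * bt_t**0.2 * mass_amu**0.5 * p_mw**-0.5)

        # Hypothetical operating point and power-balance confinement time
        tau_l = tau_iter89p(ip_ma=15.0, r_m=6.2, a_m=2.0, kappa=1.7,
                            n20=1.0, bt_t=5.3, mass_amu=2.5, p_mw=100.0)
        tau_power_balance = 3.4                # seconds, assumed
        print("H-factor vs ITER89-P:", tau_power_balance / tau_l)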

  5. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has been developing and systematizing analysis codes for heat-transfer flow and control characteristics, with HTGR plants as the main object. SALE-3D, which can analyze a complex system, was developed to model the flow when shock waves propagate into heating tubes, and it is therefore reported in this paper. For the control-characteristics analysis code, a method of sensitivity analysis in a topological space is reported, including an example of its application. The flow analysis code SALE-3D analyzes the flow of a compressible viscous fluid in a three-dimensional system over the velocity range from the incompressible limit to supersonic velocity. The fundamental equations and fundamental algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings, and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis was added to the analysis code for control characteristics in a topological space, and blow-down phenomena were analyzed by its application. (Kako, I.)

  6. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
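
    The idea of benchmarks that are re-run automatically whenever the code is upgraded can be sketched as a small regression check like the one below. This is our own illustration of the pattern, not MAAP4's actual mechanism; the benchmark name, the reference values and the tolerance are hypothetical.

        # Stored key outputs for one benchmark case (hypothetical values).
        REFERENCE = {
            "plant_transient_A": {"peak_pressure_mpa": 0.41, "peak_temp_k": 1180.0},
        }

        def check_benchmark(name, run_case, rel_tol=0.02):
            # run_case() must return a dict with the same key outputs; any drift
            # beyond the tolerance fails the build, preserving the benchmark.
            result = run_case()
            for key, ref in REFERENCE[name].items():
                if abs(result[key] - ref) > rel_tol * abs(ref):
                    raise AssertionError(f"{name}: {key}={result[key]} drifted from {ref}")
            return True

        # Stand-in for the real simulation run:
        check_benchmark("plant_transient_A",
                        lambda: {"peak_pressure_mpa": 0.408, "peak_temp_k": 1175.0})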

  7. A new two dimensional spectral/spatial multi-diagonal code for noncoherent optical code division multiple access (OCDMA) systems

    Science.gov (United States)

    Kadhim, Rasim Azeez; Fadhil, Hilal Adnan; Aljunid, S. A.; Razalli, Mohamad Shahrazel

    2014-10-01

    A new family of two-dimensional codes, namely two-dimensional multi-diagonal (2D-MD) codes, is proposed for spectral/spatial non-coherent OCDMA systems based on the one-dimensional MD code. Since the MD code has the property of zero cross-correlation, the proposed 2D-MD code also has this property, so that the multiple-access interference (MAI) is fully eliminated and the phase-induced intensity noise (PIIN) is suppressed with the proposed code. Code performance is analyzed in terms of bit error rate (BER) while considering the effects of shot noise, PIIN, and thermal noise. The performance of the proposed code is compared with the related MD, modified quadratic congruence (MQC), two-dimensional perfect difference (2D-PD) and two-dimensional diluted perfect difference (2D-DPD) codes. The analytical and simulation results reveal that the proposed 2D-MD code outperforms the other codes. Moreover, a large number of simultaneous users can be accommodated at low BER and high data rate.

  8. RH-TRU Waste Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2007-07-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR
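
    The structure of a content code described above (a two-letter site abbreviation, a three-digit physical/chemical form code whose first digit is "3" for waste transported in the RH-TRU 72-B, and an optional alpha subcode trailer) can be illustrated with a small parser. The regular expression and field names below are our own sketch, not part of the RH-TRUCON document.

        import re

        def parse_content_code(code):
            # Split a content code such as 'ID 322A' into site, waste form and subcode.
            m = re.fullmatch(r"([A-Z]{2})\s*(\d)(\d{2})([A-Z]?)", code.strip())
            if not m:
                raise ValueError(f"not a recognizable content code: {code!r}")
            site, first_digit, form, trailer = m.groups()
            return {
                "site": site,                      # e.g. ID for Idaho National Laboratory
                "rh_72b": first_digit == "3",      # first digit '3' => RH-TRU 72-B waste
                "waste_form": first_digit + form,  # physical/chemical form, e.g. '322'
                "subcode": trailer or None,        # packaging variant, e.g. 'A'
            }

        print(parse_content_code("ID 322A"))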

  9. Remote-Handled Transuranic Content Codes

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions

    2006-12-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document describes the inventory of RH-TRU waste within the transportation parameters specified by the Remote-Handled Transuranic Waste Authorized Methods for Payload Control (RH-TRAMPAC).1 The RH-TRAMPAC defines the allowable payload for the RH-TRU 72-B. This document is a catalog of RH-TRU 72-B authorized contents by site. A content code is defined by the following components: • A two-letter site abbreviation that designates the physical location of the generated/stored waste (e.g., ID for Idaho National Laboratory [INL]). The site-specific letter designations for each of the sites are provided in Table 1. • A three-digit code that designates the physical and chemical form of the waste (e.g., content code 317 denotes TRU Metal Waste). For RH-TRU waste to be transported in the RH-TRU 72-B, the first number of this three-digit code is “3.” The second and third numbers of the three-digit code describe the physical and chemical form of the waste. Table 2 provides a brief description of each generic code. Content codes are further defined as subcodes by an alpha trailer after the three-digit code to allow segregation of wastes that differ in one or more parameter(s). For example, the alpha trailers of the subcodes ID 322A and ID 322B may be used to differentiate between waste packaging configurations. As detailed in the RH-TRAMPAC, compliance with flammable gas limits may be demonstrated through the evaluation of compliance with either a decay heat limit or flammable gas generation rate (FGGR) limit per container specified in approved content codes. As applicable, if a container meets the watt*year criteria specified by the RH-TRAMPAC, the decay heat limits based on the dose-dependent G value may be used as specified in an approved content code. If a site implements the administrative controls outlined in the RH-TRAMPAC and Appendix 2.4 of the RH-TRU Payload Appendices, the decay heat or FGGR

  10. Optical code division multiple access secure communications systems with rapid reconfigurable polarization shift key user code

    Science.gov (United States)

    Gao, Kaiqiang; Wu, Chongqing; Sheng, Xinzhi; Shang, Chao; Liu, Lanlan; Wang, Jian

    2015-09-01

    An optical code division multiple access (OCDMA) secure communications system scheme with a rapidly reconfigurable polarization shift keying (Pol-SK) bipolar user code is proposed and demonstrated. Compared to fixed-code OCDMA, constantly changing the user code greatly improves the resistance to eavesdropping. A Pol-SK OCDMA experiment with a 10 Gchip/s user code and 1.25 Gb/s of user payload data has been realized, which shows that this scheme has good tolerance and could easily be implemented.

  11. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code, FEMAXI-7, has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of subroutine modules, and the internal variables, in order to facilitate users in performing a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  12. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code, FEMAXI-7, has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, the methods of input/output, the methods of source code modification, the features of subroutine modules, and the internal variables, in order to facilitate users in performing a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  13. Developments of HTGR thermofluid dynamic analysis codes and HTGR plant dynamic simulation code

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1983-01-01

    In nuclear power plants, including high-temperature gas-cooled reactor plants, the design is mostly performed on the basis of results obtained after the plant characteristics have been grasped by numerical simulation with analysis codes. At Kawasaki Heavy Industries Ltd., on the basis of the system engineering experience accumulated with gas-cooled reactors over several years, the preparation and systematization of analysis codes have been advanced, aiming at a lineup of analysis codes for heat-transfer flow and control characteristics, with HTGR plants as the main object. In this report, a part of the results is described. An example of analysis applying the two-dimensional compressible flow analysis codes SOLA-VOF and SALE-2D, which were developed by Los Alamos National Laboratory in the USA and modified for use at Kawasaki, to an HTGR system is reported. In addition, Kawasaki has developed the control-characteristics analysis code DYSCO, in which the system composition can easily be changed and which offers high versatility. The outline, fundamental equations, fundamental algorithms and examples of application of SOLA-VOF and SALE-2D, the present status of system-characteristic simulation codes and the outline of DYSCO are described. (Kako, I.)

  14. Interrelations of codes in human semiotic systems.

    OpenAIRE

    Somov, Georgij

    2016-01-01

    Codes can be viewed as mechanisms that enable relations of signs and their components, i.e., through which semiosis is actualized. The combinations of these relations produce new relations as new codes are built over other codes. Structures appear in the mechanisms of codes. Hence, codes can be described as transformations of structures from some material systems into others. Structures belong to different carriers, but exist in codes in their "pure" form. Building of codes over other codes fosters t...

  15. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    Science.gov (United States)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, referring to the bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code, when using the SDD technique, provides high transmission capacity, reduces the receiver complexity, and gives better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
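
    The zero cross-correlation property that the MD-based codes above rely on can be checked numerically for any candidate code matrix, as sketched below. The trivial disjoint-chip assignment used here is only meant to exhibit the property (code weight on the diagonal, zeros elsewhere); it is not the MD or EMD construction, which is not reproduced in this record.

        import numpy as np

        def cross_correlations(code_matrix):
            # Pairwise cross-correlations (dot products) between user code rows.
            c = np.asarray(code_matrix)
            return c @ c.T

        # Trivial zero-cross-correlation assignment: each user owns disjoint chips.
        users, weight = 3, 2
        code = np.zeros((users, users * weight), dtype=int)
        for k in range(users):
            code[k, k * weight:(k + 1) * weight] = 1

        print(cross_correlations(code))   # weight on the diagonal, zeros elsewhere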

  16. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, and BWR and VVER reactors

    Energy Technology Data Exchange (ETDEWEB)

    Langenbuch, S.; Velkov, K. [GRS, Garching (Germany); Lizorkin, M. [Kurchatov-Institute, Moscow (Russian Federation)] [and others

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER- and LWR-reactors is presented. After describing the basic features of the 3D neutronic codes BIPR-8 from Kurchatov-Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of coupled codes for different transient and accident scenarios are presented. The need of further investigations is discussed.

  17. R-matrix analysis code (RAC)

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Qi Huiquan

    1990-01-01

    A comprehensive R-matrix analysis code has been developed. It is based on the multichannel and multilevel R-matrix theory and runs on a VAX computer with FORTRAN-77. With this code many kinds of experimental data for one nuclear system can be fitted simultaneously. Comparisons between the code RAC and the code EDA of LANL are made. The data show that both codes produced the same calculation results when one set of R-matrix parameters was used. The differential cross section of ¹⁰B(n,α)⁷Li for E_n = 0.4 MeV and the polarization of ¹⁶O(n,n)¹⁶O for E_n = 2.56 MeV are presented

  18. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  19. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  20. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.; Bensmail, H.; Yao, N.; Gao, Xin

    2013-01-01

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding) learn codebooks and codes in an unsupervised manner and neglect the class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutation identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.
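
    For readers unfamiliar with the basic operation, the sketch below shows a plain, unsupervised sparse coding step using scikit-learn's SparseCoder: each sample is represented as a sparse combination of dictionary atoms. The discriminative, multi-manifold extension described above additionally conditions the codebooks on class labels, which is not shown here; the dictionary, data and regularization value are hypothetical.

        import numpy as np
        from sklearn.decomposition import SparseCoder

        # Plain (unsupervised) sparse coding: each sample x is represented by a
        # sparse code c solving min ||x - c @ dictionary||^2 + alpha * ||c||_1.
        rng = np.random.default_rng(0)
        dictionary = rng.standard_normal((8, 20))   # 8 atoms x 20 features, hypothetical
        dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
        samples = rng.standard_normal((5, 20))      # 5 unlabeled samples

        coder = SparseCoder(dictionary=dictionary,
                            transform_algorithm="lasso_lars", transform_alpha=0.5)
        codes = coder.transform(samples)            # shape (5, 8), mostly zeros
        print(codes.shape, np.count_nonzero(codes))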

  1. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas

    2011-08-01

    Fixed-rate and rateless channel codes are generally treated separately in the related research literature and so, a novice in the field inevitably gets the impression that these channel codes are unrelated. By contrast, in this treatise, we endeavor to further develop a link between the traditional fixed-rate codes and the recently developed rateless codes by delving into their underlying attributes. This joint treatment is beneficial for two principal reasons. First, it facilitates the task of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well-understood code-design tools, originally contrived for fixed-rate codes, to the realm of rateless codes. Indeed, these versatile tools proved to be vital in the design of diverse fixed-rate-coded communications systems, and thus our hope is that they will further elucidate the associated performance ramifications of the rateless coded schemes. © 2011 IEEE.

  2. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding) learn codebooks and codes in an unsupervised manner and neglect the class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutation identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.

  3. A class of Sudan-decodable codes

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Refslund

    2000-01-01

    In this article, Sudan's algorithm is modified into an efficient method to list-decode a class of codes which can be seen as a generalization of Reed-Solomon codes. The algorithm is specialized into a very efficient method for unique decoding. The code construction can be generalized based on algebraic-geometry codes and the decoding algorithms are generalized accordingly. Comparisons with Reed-Solomon and Hermitian codes are made ...

  4. Light-water reactor safety analysis codes

    International Nuclear Information System (INIS)

    Jackson, J.F.; Ransom, V.H.; Ybarrondo, L.J.; Liles, D.R.

    1980-01-01

    A brief review of the evolution of light-water reactor safety analysis codes is presented. Included is a summary comparison of the technical capabilities of major system codes. Three recent codes are described in more detail to serve as examples of currently used techniques. Example comparisons between calculated results using these codes and experimental data are given. Finally, a brief evaluation of current code capability and future development trends is presented

  5. LFSC - Linac Feedback Simulation Code

    International Nuclear Information System (INIS)

    Ivanov, Valentin; Fermilab

    2008-01-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output
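
    The pulse-to-pulse feedback that such a code simulates can be reduced, in spirit, to the small loop below: measure an error on each pulse with monitor noise and remove a fraction of it through an actuator. The gain, noise level and plant model are hypothetical assumptions of ours and unrelated to LFSC's actual models.

        import random

        # Toy pulse-to-pulse feedback loop: measure an orbit/energy error on every
        # pulse (with monitor noise) and remove a fraction of it through an actuator.
        true_offset = 1.0          # initial uncorrected offset, arbitrary units
        gain = 0.3                 # feedback gain per pulse, assumed
        monitor_noise = 0.05       # rms monitor noise, assumed

        random.seed(1)
        for pulse in range(20):                    # e.g. 20 pulses at 5-100 Hz
            measured = true_offset + random.gauss(0.0, monitor_noise)
            true_offset -= gain * measured         # apply the correction
        print(f"residual offset after 20 pulses: {true_offset:.3f}")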

  6. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
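
    The second lifting stage described above, in which each protograph edge is replaced by a circulant (cyclic-shift) permutation block, can be sketched as follows. The base matrix, shift values and lifting size are hypothetical; real designs select shifts to avoid short cycles, which this sketch does not attempt.

        import numpy as np

        def lift_with_circulants(base, shifts, z):
            # Replace each nonzero protograph entry by a z-by-z circulant permutation
            # matrix (identity cyclically shifted), and each zero by a zero block.
            rows, cols = base.shape
            h = np.zeros((rows * z, cols * z), dtype=int)
            eye = np.eye(z, dtype=int)
            for i in range(rows):
                for j in range(cols):
                    if base[i, j]:
                        h[i*z:(i+1)*z, j*z:(j+1)*z] = np.roll(eye, shifts[i][j], axis=1)
            return h

        base   = np.array([[1, 1, 0],
                           [1, 0, 1]])             # hypothetical 2x3 protograph
        shifts = [[0, 1, 0],
                  [2, 0, 3]]                       # hypothetical shift values
        print(lift_with_circulants(base, shifts, z=4).shape)   # (8, 12)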

  7. The CORSYS neutronics code system

    International Nuclear Information System (INIS)

    Caner, M.; Krumbein, A.D.; Saphier, D.; Shapira, M.

    1994-01-01

    The purpose of this work is to assemble a code package for LWR core physics including coupled neutronics, burnup and thermal hydraulics. The CORSYS system is built around the cell code WIMS (for group microscopic cross section calculations) and the 3-dimensional diffusion code CITATION (for burnup and fuel management). We are implementing such a system on an IBM RS-6000 workstation. The code was tested with a simplified model of the Zion Unit 2 PWR. (authors). 6 refs., 8 figs., 1 tabs

  8. WWER reactor physics code applications

    International Nuclear Information System (INIS)

    Gado, J.; Kereszturi, A.; Gacs, A.; Telbisz, M.

    1994-01-01

    The coupled steady-state reactor physics and thermohydraulic code system KARATE has been developed and applied for WWER-1000 and WWER-440 operational calculations. The 3 D coupled kinetic code KIKO3D has been developed and validated for WWER-440 accident analysis applications. The coupled kinetic code SMARTA developed by VTT Helsinki has been applied for WWER-440 accident analysis. The paper gives a summary of the experience in code development and application. (authors). 10 refs., 2 tabs., 5 figs

  9. The LIONS code (version 1.0)

    International Nuclear Information System (INIS)

    Bertrand, P.

    1993-01-01

    The new LIONS code (Lancement d'IONS or Ion Launching), a dynamical code implemented in the SPIRaL project for the CIME cyclotron studies, is presented. The software package involves a 3D magnetostatic code, 2D or 3D electrostatic codes for the generation of realistic field maps, and several dynamical codes for studying the behaviour of the reference particle from the cyclotron centre up to ejection and for launching particle packets complying with given correlations. Its interactions with the other codes are described. The LIONS code, written in Fortran 90, is already used in studying the CIME cyclotron from the centre to ejection. It is designed to be used, with minor modifications, in other contexts such as the simulation of mass spectrometer facilities

  10. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code

    International Nuclear Information System (INIS)

    Ramsthaler, J.A.; Lime, J.F.; Sahota, M.S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A

  11. The code of ethics for nurses.

    Science.gov (United States)

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Nurses are increasingly confronted with complex concerns in their practice. Codes of ethics are fundamental guidance for nursing, as for many other professions. Although there are authentic international codes of ethics for nurses, a national code provides additional assistance for clinical nurses in their complex roles in the care of patients, education, research and management of parts of the health care system in the country. A national code can provide nurses with culturally adapted guidance and help them to make ethical decisions more closely aligned with the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended and approved. The main concepts of the code will also be presented here. No doubt, development of the codes should be considered as an ongoing process. This is an overall responsibility to keep the codes current, updated with new advances in science and emerging challenges, and pertinent to nursing practice.

  12. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  13. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  14. The Coding Question.

    Science.gov (United States)

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. BYOD and Cloud Computing: A guiding study on how BYOD should be conducted securely together with cloud computing

    OpenAIRE

    Abramian, Fredrik; Wedholm, Joel

    2013-01-01

    The aim of this thesis is to gain insight into how information security work is affected by Bring Your Own Device (BYOD) in combination with cloud computing. The authors reached their conclusions by conducting interviews at two different organizations, together with a survey, which were then analyzed against current theory. Based on this, a requirements specification was produced as a guiding contribution for organizations.

  16. Code package to analyse behavior of the WWER fuel rods in normal operation: TOPRA's code

    International Nuclear Information System (INIS)

    Scheglov, A.; Proselkov, V.

    2001-01-01

    This paper briefly describes a code package intended for the analysis of WWER fuel rod characteristics. The package includes the computer codes TOPRA-1 and TOPRA-2 for full-scale fuel rod analyses, and the MRZ and MKK codes for analyzing separate sections of fuel rods in r-z and r-φ geometry. The TOPRA codes are developed on the basis of the PIN-mod2 version and verified against experimental results obtained in the MR, MIR and Halden research reactors (in the framework of the SOFIT, FGR-2 and FUMEX experimental programs). Comparative analyses of calculation results and results from post-reactor examination of WWER-440 and WWER-1000 fuel rods are also made as additional verification of these codes. To avoid enlarging the uncertainties in fuel behavior prediction as a result of simplifying the fuel geometry, the MKK and MRZ codes are developed on the basis of the finite element method using three-node finite elements. Results obtained in the course of the code verification indicate that the method and the TOPRA codes can be applied to simplified engineering calculations of the thermal-physical parameters of WWER fuel rods. An analysis of the maximum relative errors in predicting the fuel rod characteristics in the range of the accepted parameter values is also presented in the paper

  17. Quantum computation with Turaev-Viro codes

    International Nuclear Information System (INIS)

    Koenig, Robert; Kuperberg, Greg; Reichardt, Ben W.

    2010-01-01

    For a 3-manifold with triangulated boundary, the Turaev-Viro topological invariant can be interpreted as a quantum error-correcting code. The code has local stabilizers, identified by Levin and Wen, on a qudit lattice. Kitaev's toric code arises as a special case. The toric code corresponds to an abelian anyon model, and therefore requires out-of-code operations to obtain universal quantum computation. In contrast, for many categories, such as the Fibonacci category, the Turaev-Viro code realizes a non-abelian anyon model. A universal set of fault-tolerant operations can be implemented by deforming the code with local gates, in order to implement anyon braiding. We identify the anyons in the code space, and present schemes for initialization, computation and measurement. This provides a family of constructions for fault-tolerant quantum computation that are closely related to topological quantum computation, but for which the fault tolerance is implemented in software rather than coming from a physical medium.

  18. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  19. Dual Coding, Reasoning and Fallacies.

    Science.gov (United States)

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  20. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  1. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  2. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN are also discussed. Two dimensional problems such as those treated by the DOT code are given. Finally, an overview of Monte Carlo methods and codes is presented

  3. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into sub-tasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  4. The 1996 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1996-01-01

    The codes are named 'the Pre-processing' codes, because they are designed to pre-process ENDF/B data, for later, further processing for use in applications. This is a modular set of computer codes, each of which reads and writes evaluated nuclear data in the ENDF/B format. Each code performs one or more independent operations on the data, as described below. These codes are designed to be computer independent, and are presently operational on every type of computer from large mainframe computer to small personal computers, such as IBM-PC and Power MAC. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)

  5. User manual of UNF code

    International Nuclear Information System (INIS)

    Zhang Jingshang

    2001-01-01

    The UNF code (2001 version), written in FORTRAN-90, is developed for calculating fast neutron reaction data of structural materials with incident energies from about 1 keV up to 20 MeV. The code consists of the spherical optical model and the unified Hauser-Feshbach and exciton model. The manual of the UNF code is available for users. The format of the input parameter files and the output files, as well as the functions of the flags used in the UNF code, are introduced in detail, and examples of the format of the input parameter files are given

  6. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  7. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, a mix of computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  8. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code deals with the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  9. Evaluation of Code Blue Implementation Outcomes

    Directory of Open Access Journals (Sweden)

    Bengü Özütürk

    2015-09-01

    Full Text Available Aim: In this study, we aimed to emphasize the importance of Code Blue implementation and to determine deficiencies in this regard. Methods: After obtaining the ethics committee approval, Code Blue call data for 225 patients between 2012 and January 2014 were retrospectively analyzed. Age and gender of the patients, date and time of the call, the clinics giving the Code Blue call, the time needed for the Code Blue team to arrive, the rates of false Code Blue calls, reasons for Code Blue calls and patient outcomes were investigated. Results: A total of 225 patients (149 male, 76 female) were evaluated in the study. The mean age of the patients was 54.1 years. 142 (67.2%) of the Code Blue calls occurred after hours and were given by the emergency unit. The mean time for the Code Blue team to arrive was 1.10 minutes. Spontaneous circulation was restored in 137 patients (60.8%); 88 (39.1%) died. The most commonly identified possible causes were of cardiac origin. Conclusion: This study showed that Code Blue implementation by a professional team within an efficient and targeted time increases the survival rate. Therefore, we conclude that the application of Code Blue carried out by a trained team is an essential standard in hospitals. (The Medical Bulletin of Haseki 2015; 53:204-8)

  10. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy machinery of algebraic geometry.

  11. New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2

    Science.gov (United States)

    Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin

    2017-03-01

    This paper concentrates on construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over finite field 𝔽q2 (q ≥ 3 is an odd prime power). By a careful analysis on properties of cyclotomic cosets in defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given according to Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in previous literature.

  12. Pump Component Model in SPACE Code

    International Nuclear Information System (INIS)

    Kim, Byoung Jae; Kim, Kyoung Doo

    2010-08-01

    This technical report describes the pump component model in the SPACE code. A literature survey was made of pump models in existing system codes. The models embedded in the SPACE code were examined to check for conflicts with intellectual property rights. Design specifications, computer coding implementation, and test results are included in this report

  13. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  14. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version from the former version FEMAXI-6, for the purpose of analyzing the fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the very counterpart of this description document, gives detailed explanations of files and operation method of FEMAXI-7 code and its related codes, methods of input/output, sample Input/Output, methods of source code modification, subroutine structure, and internal variables in a specific manner in order to facilitate users to perform fuel analysis by FEMAXI-7. (author)

  15. User's manual for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the User's Manual for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to make it easier to interpret moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, also can be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code then will interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data
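
    The interpolation-and-match idea described above is easy to sketch. The fragment below is a hypothetical illustration, not the TMAD code itself: benchmark detector responses are tabulated on a small (anomaly size, moisture content) grid with made-up numbers, bilinear interpolation supplies values between grid points, and a brute-force search returns the combination whose predicted responses best match the measured responses from two detectors.

    ```python
    import numpy as np

    # Hypothetical benchmark library: detector response tabulated on a grid of
    # anomaly size (cm) x moisture content (wt%) for two detector types.
    sizes = np.array([0.0, 5.0, 10.0, 20.0])
    moistures = np.array([0.0, 5.0, 10.0, 20.0])
    # resp[d, i, j] = response of detector d at sizes[i], moistures[j] (made-up numbers).
    resp = np.array([
        np.add.outer(0.5 * sizes, 2.0 * moistures) + 10.0,   # detector 0
        np.add.outer(1.2 * sizes, 0.8 * moistures) + 25.0,   # detector 1
    ])

    def bilinear(table, x, y, xs, ys):
        """Bilinear interpolation of a 2-D benchmark table at (x, y)."""
        i = np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2)
        j = np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2)
        tx = (x - xs[i]) / (xs[i + 1] - xs[i])
        ty = (y - ys[j]) / (ys[j + 1] - ys[j])
        return ((1 - tx) * (1 - ty) * table[i, j]     + tx * (1 - ty) * table[i + 1, j]
              + (1 - tx) * ty       * table[i, j + 1] + tx * ty       * table[i + 1, j + 1])

    def best_match(measured):
        """Grid-search the (size, moisture) pair whose interpolated responses best fit the measurements."""
        grid_s = np.linspace(sizes[0], sizes[-1], 201)
        grid_m = np.linspace(moistures[0], moistures[-1], 201)
        best, best_err = None, np.inf
        for s in grid_s:
            for m in grid_m:
                pred = [bilinear(resp[d], s, m, sizes, moistures) for d in range(len(measured))]
                err = np.sum((np.asarray(pred) - measured) ** 2)
                if err < best_err:
                    best, best_err = (s, m), err
        return best

    print(best_match(np.array([33.5, 39.0])))   # measured responses from two detectors
    ```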

  16. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  17. The Lawrence Livermore National Laboratory Intelligent Actinide Analysis System

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Koenig, Z.M.

    1993-01-01

    The authors have developed an Intelligent Actinide Analysis System (IAAS) for Materials Management to use in the Plutonium Facility at the Lawrence Livermore National Laboratory. The IAAS will measure isotopic ratios for plutonium and other actinides non-destructively by high-resolution gamma-ray spectrometry. This system will measure samples in a variety of matrices and containers. It will provide automated control of many aspects of the instrument that previously required manual intervention and/or control. The IAAS is a second-generation instrument, based on experience in fielding gamma isotopic systems, that is intended to advance non-destructive actinide analysis for nuclear safeguards in performance, automation, ease of use, adaptability, systems integration and extensibility to robotics. It uses a client-server distributed monitoring and control architecture. The IAAS uses MGA as the isotopic analysis code. The design of the IAAS reduces the need for operator intervention, operator training, and operator exposure

  18. Differentially Encoded LDPC Codes—Part II: General Case and Code Optimization

    Directory of Open Access Journals (Sweden)

    Jing Li (Tiffany

    2008-04-01

    Full Text Available This two-part series of papers studies the theory and practice of differentially encoded low-density parity-check (DE-LDPC) codes, especially in the context of noncoherent detection. Part I showed that a special class of DE-LDPC codes, product accumulate codes, perform very well with both coherent and noncoherent detections. The analysis here reveals that a conventional LDPC code, however, is not well suited to differential coding and does not, in general, deliver a desirable performance when detected noncoherently. Through extrinsic information transfer (EXIT) analysis and a modified “convergence-constraint” density evolution (DE) method developed here, we provide a characterization of the type of LDPC degree profiles that work in harmony with differential detection (or a recursive inner code in general), and demonstrate how to optimize these LDPC codes. The convergence-constraint method provides a useful extension to the conventional “threshold-constraint” method, and can match an outer LDPC code to any given inner code with the imperfectness of the inner decoder taken into consideration.

  19. GAMERA - The New Magnetospheric Code

    Science.gov (United States)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project GAMERA, Grid Agnostic MHD for Extended Research Applications, has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure - multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter and Saturn. We present example results for the Earth's magnetosphere including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  20. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, and each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  1. Optimal Codes for the Burst Erasure Channel

    Science.gov (United States)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
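
    The mechanism behind the block-interleaved MDS construction can be illustrated with the simplest MDS code, a single parity check. The sketch below is illustrative only and is not the referenced design: codewords are written as rows, transmitted column by column, and a contiguous burst of erasures no longer than the interleaving depth then touches each codeword at most once, so the lone missing symbol in each row is recovered from the parity.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    k, n_cw = 7, 16                                    # 7 data symbols + 1 parity, 16 codewords
    data = rng.integers(0, 256, size=(n_cw, k), dtype=np.uint8)
    parity = np.bitwise_xor.reduce(data, axis=1, keepdims=True)
    codewords = np.concatenate([data, parity], axis=1)  # shape (16, 8)

    tx = codewords.T.reshape(-1)                        # block interleaving: send column by column

    erased = np.zeros(tx.size, dtype=bool)              # a burst of up to n_cw consecutive erasures
    erased[37:37 + n_cw] = True
    rx = np.where(erased, 0, tx)                        # erased symbols arrive as unknown (here: 0)

    # De-interleave; each row now holds one codeword with at most one erased symbol,
    # which is recovered as the XOR of the surviving symbols in that row.
    rx_cw = rx.reshape(k + 1, n_cw).T
    hit = erased.reshape(k + 1, n_cw).T
    for i in range(n_cw):
        pos = np.flatnonzero(hit[i])
        if pos.size == 1:
            rx_cw[i, pos[0]] = np.bitwise_xor.reduce(np.delete(rx_cw[i], pos[0]))

    assert np.array_equal(rx_cw[:, :k], data)
    print("burst of", n_cw, "erased symbols fully recovered")
    ```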

  2. Applications guide to the RSIC-distributed version of the MCNP code (coupled Monte Carlo neutron-photon Code)

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-09-01

    An overview of the RSIC-distributed version of the MCNP code (a coupled Monte Carlo neutron-photon code) is presented. All general features of the code, from machine hardware requirements to theoretical details, are discussed. The current nuclide cross-section and other libraries available in the standard code package are specified, and a realistic example of the flexible geometry input is given. Standard and nonstandard source, estimator, and variance-reduction procedures are outlined. Examples of correct usage and possible misuse of certain code features are presented graphically and in standard output listings. Finally, itemized summaries of sample problems, various MCNP code documentation, and future work are given

  3. The RETRAN-03 computer code

    International Nuclear Information System (INIS)

    Paulsen, M.P.; McFadden, J.H.; Peterson, C.E.; McClure, J.A.; Gose, G.C.; Jensen, P.J.

    1991-01-01

    The RETRAN-03 code development effort is designed to overcome the major theoretical and practical limitations associated with the RETRAN-02 computer code. The major objectives of the development program are to extend the range of analyses that can be performed with RETRAN, to make the code more dependable and faster running, and to have a more transportable code. The first two objectives are accomplished by developing new models and adding other models to the RETRAN-02 base code. The major model additions for RETRAN-03 are as follows: implicit solution methods for the steady-state and transient forms of the field equations; additional options for the velocity difference equation; a new steady-state initialization option to compute low-power steam generator initial conditions; models for nonequilibrium thermodynamic conditions; and several special-purpose models. The source code and the environmental library for RETRAN-03 are written in standard FORTRAN 77, which allows the last objective to be fulfilled. Some models in RETRAN-02 have been deleted in RETRAN-03. In this paper the changes between RETRAN-02 and RETRAN-03 are reviewed

  4. Zebra: An advanced PWR lattice code

    Energy Technology Data Exchange (ETDEWEB)

    Cao, L.; Wu, H.; Zheng, Y. [School of Nuclear Science and Technology, Xi' an Jiaotong Univ., No. 28, Xianning West Road, Xi' an, ShannXi, 710049 (China)

    2012-07-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong Univ. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that this code has good precision and high efficiency. (authors)

  5. Code of Ethics for Electrical Engineers

    Science.gov (United States)

    Matsuki, Junya

    The Institute of Electrical Engineers of Japan (IEEJ) has established the rules of practice for its members recently, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the IEEJ 1998 ethical code are explained in detail compared to the other ethical codes for other fields of engineering. Secondly, the contents which shall be included in the modern code of ethics for electrical engineers are discussed. Thirdly, the newly-established rules of practice and the modified code of ethics are presented. Finally, results of questionnaires on the new ethical code and rules which were answered on May 23, 2007, by 51 electrical and electronic students of the University of Fukui are shown.

  6. Zebra: An advanced PWR lattice code

    International Nuclear Information System (INIS)

    Cao, L.; Wu, H.; Zheng, Y.

    2012-01-01

    This paper presents an overview of an advanced PWR lattice code, ZEBRA, developed at the NECP laboratory at Xi'an Jiaotong Univ. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is developed based on the sub-group method. The transport solver is the Auto-MOC code, a self-developed code based on the Method of Characteristics and the customization of AutoCAD software. The whole code is well organized in a modular software structure. Some numerical results obtained during the validation of the code demonstrate that this code has good precision and high efficiency. (authors)

  7. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)

  8. Lattice polytopes in coding theory

    Directory of Open Access Journals (Sweden)

    Ivan Soprunov

    2015-05-01

    Full Text Available In this paper we discuss combinatorial questions about lattice polytopes motivated by recent results on minimum distance estimation for toric codes. We also include a new inductive bound for the minimum distance of generalized toric codes. As an application, we give new formulas for the minimum distance of generalized toric codes for special lattice point configurations.

  9. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample given, followed by a discussion of the present status and future development plans

  10. New binary linear codes which are dual transforms of good codes

    NARCIS (Netherlands)

    Jaffe, D.B.; Simonis, J.

    1999-01-01

    If C is a binary linear code, one may choose a subset S of C, and form a new code CST which is the row space of the matrix having the elements of S as its columns. One way of picking S is to choose a subgroup H of Aut(C) and let S be some H-stable subset of C. Using (primarily) this method for

  11. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized variable-blocksized transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. Which coders are selected to code any given image region is made through a threshold driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
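
    A toy version of the threshold-driven coder selection can make the idea concrete. The sketch below is a simplified stand-in for MBC, not the authors' coder: each 8x8 block is first coded with a cheap DCT coder that keeps only a 2x2 corner of coefficients, and the block is re-coded with a finer 4x4 coder only if the resulting mean squared error exceeds a distortion threshold; the fraction of retained coefficients serves as a crude rate proxy.

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II matrix."""
        k = np.arange(n)
        mat = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
        mat[0] *= 1 / np.sqrt(n)
        mat[1:] *= np.sqrt(2 / n)
        return mat

    D8 = dct_matrix(8)

    def code_block(block, keep):
        """Keep only a square corner of `keep` low-frequency DCT coefficients, then invert."""
        coef = D8 @ block @ D8.T
        mask = np.zeros_like(coef)
        side = int(np.sqrt(keep))
        mask[:side, :side] = 1.0
        return D8.T @ (coef * mask) @ D8

    def mixture_block_code(image, threshold=25.0):
        """Per 8x8 block, try the cheap coder first; switch to the finer coder if MSE exceeds the threshold."""
        out = np.zeros_like(image, dtype=float)
        kept = []
        for r in range(0, image.shape[0], 8):
            for c in range(0, image.shape[1], 8):
                blk = image[r:r + 8, c:c + 8].astype(float)
                cheap = code_block(blk, keep=4)                    # 2x2 corner of coefficients
                if np.mean((blk - cheap) ** 2) <= threshold:
                    out[r:r + 8, c:c + 8] = cheap
                    kept.append(4)
                else:
                    out[r:r + 8, c:c + 8] = code_block(blk, keep=16)   # 4x4 corner
                    kept.append(16)
        return out, np.mean(kept) / 64       # average coefficients retained per pixel (rate proxy)

    rng = np.random.default_rng(1)
    img = np.clip(rng.normal(128, 20, size=(64, 64)), 0, 255)
    recon, rate_proxy = mixture_block_code(img)
    print("coefficients kept per pixel:", rate_proxy)
    ```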

  12. Code Package to Analyze Parameters of the WWER Fuel Rod. TOPRA-2 Code - Verification Data

    International Nuclear Information System (INIS)

    Scheglov, A.; Proselkov, V.; Passage, G.; Stefanova, S.

    2009-01-01

    Presented are the data for the computer codes used to analyze WWER fuel rods in the WWER department of RRC 'Kurchatov Institute'. A description is given of the TOPRA-2 code, intended for the engineering analysis of thermophysical and strength parameters of the WWER fuel rod - temperature distributions along the fuel radius, gas pressure under the cladding, stresses in the cladding, etc. - for reactor operation in normal conditions. Some results of the code verification against test problems and the data obtained in the experimental programs are presented, together with comparison results of calculations with the TOPRA-2 and TRANSURANUS (V1M1J06) codes. Results obtained in the course of the verification demonstrate the possibility of applying the methodology and the TOPRA-2 code to the engineering analysis of WWER fuel rods
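
    As a flavor of the kind of thermophysical quantity such a code evaluates, the sketch below computes the steady-state radial temperature profile in a cylindrical fuel pellet with uniform heat generation and checks it against the textbook analytic solution. It is a generic illustration with assumed property values and is in no way the TOPRA-2 model.

    ```python
    import numpy as np

    # Steady-state conduction in a pellet with uniform volumetric heat generation,
    # solved by a simple finite-difference scheme and checked against
    #   T(r) = T_surf + q''' * (R^2 - r^2) / (4 k).
    # All numbers below are assumed, generic values, not WWER design data.
    R = 3.8e-3           # pellet radius, m
    k = 3.0              # fuel thermal conductivity, W/(m K)
    lin_power = 200.0e2  # linear heat rate, W/m (200 W/cm)
    T_surf = 700.0       # pellet surface temperature, K
    qvol = lin_power / (np.pi * R**2)   # volumetric heat generation, W/m^3

    n = 200
    r = np.linspace(0.0, R, n)
    dr = r[1] - r[0]

    # Assemble the 1-D cylindrical conduction operator (1/r) d/dr(k r dT/dr) = -q'''.
    A = np.zeros((n, n))
    b = np.full(n, -qvol)
    for i in range(1, n - 1):
        rm, rp = 0.5 * (r[i - 1] + r[i]), 0.5 * (r[i] + r[i + 1])
        A[i, i - 1] = k * rm / (r[i] * dr**2)
        A[i, i + 1] = k * rp / (r[i] * dr**2)
        A[i, i] = -(A[i, i - 1] + A[i, i + 1])
    A[0, 0], A[0, 1] = -1.0, 1.0    # symmetry at the centerline: dT/dr = 0
    b[0] = 0.0
    A[-1, -1] = 1.0                 # fixed surface temperature
    b[-1] = T_surf

    T = np.linalg.solve(A, b)
    T_exact = T_surf + qvol * (R**2 - r**2) / (4 * k)
    print("centerline T (numerical, analytic):", T[0], T_exact[0])
    ```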

  13. Rate-Compatible Protograph LDPC Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.

  14. Why comply with a code of ethics?

    Science.gov (United States)

    Spielthenner, Georg

    2015-05-01

    A growing number of professional associations and occupational groups are creating codes of ethics with the goal of guiding their members, protecting service users, and safeguarding the reputation of the profession. There is a great deal of literature dealing with the question to what extent ethical codes can achieve their desired objectives. The present paper does not contribute to this debate. Its aim is rather to investigate how rational it is to comply with codes of conduct. It is natural and virtually inevitable for a reflective person to ask why one should pay any attention to ethical codes, in particular if following a code is not in one's own interest. In order to achieve the aim of this paper, I shall (in "Quasi-reasons for complying with an ethical code" section) discuss reasons that only appear to be reasons for complying with a code. In "Code-independent reasons" section, I shall present genuine practical reasons that, however, turn out to be reasons of the wrong kind. In "Code-dependent reasons" section finally presents the most important reasons for complying with ethical codes. The paper argues that while ethical codes do not necessarily yield reasons for action, professionals can have genuine reasons for complying with a code, which may, however, be rather weak and easily overridden by reasons for deviating from the code.

  15. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, offering to shift processing steps, conventionally performed at the video encoder side, to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  16. Verification of reactor safety codes

    International Nuclear Information System (INIS)

    Murley, T.E.

    1978-01-01

    The safety evaluation of nuclear power plants requires the investigation of a wide range of potential accidents that could be postulated to occur. Many of these accidents deal with phenomena that are outside the range of normal engineering experience. Because of the expense and difficulty of full-scale tests covering the complete range of accident conditions, it is necessary to rely on complex computer codes to assess these accidents. The central role that computer codes play in safety analyses requires that the codes be verified, or tested, by comparing the code predictions with a wide range of experimental data chosen to span the physical phenomena expected under potential accident conditions. This paper discusses the plans of the Nuclear Regulatory Commission for verifying the reactor safety codes being developed by NRC to assess the safety of light water reactors and fast breeder reactors. (author)

  17. Tristan code and its application

    Science.gov (United States)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners the code (isssrc2.f) with simpler boundary conditions is suitable for starting to run simulations. The future of global particle simulations for a global geospace general circulation (GGCM) model with predictive capability (for the Space Weather Program) is discussed.

  18. Monomial codes seen as invariant subspaces

    Directory of Open Access Journals (Sweden)

    García-Planas María Isabel

    2017-08-01

    Full Text Available It is well known that cyclic codes are very useful because of their applications, since they are not computationally expensive and encoding can be easily implemented. The relationship between cyclic codes and invariant subspaces is also well known. In this paper a generalization of this relationship is presented between monomial codes over a finite field and hyperinvariant subspaces of 𝔽n under an appropriate linear transformation. Using techniques of linear algebra it is possible to deduce certain properties for this particular type of code, generalizing known results on cyclic codes.
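
    The cyclic-code half of that relationship is easy to verify computationally: a cyclic code is exactly a subspace that the cyclic-shift operator maps into itself. The sketch below builds the binary [7,4] code generated by g(x) = x^3 + x + 1 and checks closure under shifts; the choice of code is an arbitrary example, not one taken from the paper.

    ```python
    import numpy as np
    from itertools import product

    # Binary [7,4] cyclic code generated by g(x) = x^3 + x + 1 (a Hamming code).
    n, g = 7, np.array([1, 1, 0, 1, 0, 0, 0])   # coefficients of g(x), lowest degree first

    def encode(msg):
        """Multiply the message polynomial by g(x) over GF(2) (cyclic encoding, not systematic)."""
        cw = np.zeros(n, dtype=int)
        for i, bit in enumerate(msg):
            if bit:
                cw = (cw + np.roll(g, i)) % 2
        return cw

    codewords = {tuple(encode(m)) for m in product([0, 1], repeat=4)}

    # Cyclic codes are exactly the subspaces invariant under the shift operator:
    # shifting any codeword by one position gives another codeword.
    assert all(tuple(np.roll(np.array(c), 1)) in codewords for c in codewords)
    print(len(codewords), "codewords; the set is closed under cyclic shifts")
    ```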

  19. TRAC code development status and plans

    International Nuclear Information System (INIS)

    Spore, J.W.; Liles, D.R.; Nelson, R.A.

    1986-01-01

    This report summarizes the characteristics and current status of the TRAC-PF1/MOD1 computer code. Recent error corrections and user-convenience features are described, and several user enhancements are identified. Current plans for the release of the TRAC-PF1/MOD2 computer code and some preliminary MOD2 results are presented. This new version of the TRAC code implements stability-enhancing two-step numerics into the 3-D vessel, using partial vectorization to obtain a code that has run 400% faster than the MOD1 code

  20. Computer code ANISN multiplying media and shielding calculation 2. Code description (input/output)

    International Nuclear Information System (INIS)

    Maiorino, J.R.

    1991-01-01

    The new code CCC-0514-ANISN/PC is described, as well as a "General Description of the ANISN/PC code". In addition to the ANISN/PC code, the transmittal package includes an interactive input generation programme called APE (ANISN Processor and Evaluator), which facilitates the work of the user in giving input. Also, a 21-group photon cross section master library FLUNGP.LIB in ISOTX format, which can be edited by an executable file LMOD.EXE, is included in the package. The input and output subroutines are reviewed. 6 refs, 1 fig., 1 tab

  1. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

    In 2006, the Korean nuclear industry started developing a thermal-hydraulic analysis code for safety analysis of PWRs (Pressurized Water Reactors). The new code is named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve two-fluid, three-field governing equations in one-dimensional or three-dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as the reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, for a new generation of engineers who are more comfortable with C/C++ than the old FORTRAN language. This paper describes the general characteristics of the SPACE code and the current status of SPACE code development

  2. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing, and many other areas. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging on our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as 1D electrostatic. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python, and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems that will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.

  3. Coded communications with nonideal interleaving

    Science.gov (United States)

    Laufer, Shaul

    1991-02-01

    Burst error channels - a type of block interference channels - feature increasing capacity but decreasing cutoff rate as the memory rate increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of an asymmetrical background noise and a reduced collision error rate were studied, as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft decision diversity combining technique is introduced.

  4. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 CFR § 162.1011 Valid code sets. Each code set is valid within the dates specified by the organization responsible for maintaining that code set.

  5. Computer codes in particle transport physics

    International Nuclear Information System (INIS)

    Pesic, M.

    2004-01-01

    Simulation of the transport and interaction of various particles in complex media over a wide energy range (from 1 MeV up to 1 TeV) is a very complicated problem that requires a valid model of a real process in nature and an appropriate solving tool - computer code and data library. A brief overview of computer codes based on Monte Carlo techniques for simulation of transport and interaction of hadrons and ions over a wide energy range in three-dimensional (3D) geometry is given. Firstly, attention is paid to underlining the approach to the solution of the problem - a process in nature - by selection of the appropriate 3D model and corresponding tools - computer codes and cross section data libraries. The process of data collection and evaluation from experimental measurements and theoretical approaches to establishing reliable libraries of evaluated cross section data is a long, difficult and not straightforward activity. For this reason, world reference data centers and specialized ones are acknowledged, together with the currently available, state-of-the-art evaluated nuclear data libraries, such as ENDF/B-VI, JEF, JENDL, CENDL, BROND, etc. Codes for experimental and theoretical data evaluation (e.g., SAMMY and GNASH) together with codes for data processing (e.g., NJOY, PREPRO and GRUCON) are briefly described. Examples of data evaluation and data processing to generate computer-usable data libraries are shown. Among the numerous and various computer codes developed for particle transport physics, only the most general ones are described: MCNPX, FLUKA and SHIELD. A short overview of the basic applications of these codes, the physical models implemented with their limitations, and the energy ranges of particles and types of interactions covered is given. General information about the codes also covers the programming language, operating system, calculation speed and code availability. An example of increasing the computation speed of the MCNPX code using an MPI cluster compared to the code's sequential option

  6. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a user's manual, describing simulation procedures, input data preparation, output and example test cases

  7. Tribal Green Building Administrative Code Example

    Science.gov (United States)

    This Tribal Green Building Administrative Code Example can be used as a template for technical code selection (i.e., building, electrical, plumbing, etc.) to be adopted as a comprehensive building code.

  8. NOAA Weather Radio - EAS Event Codes

    Science.gov (United States)


  9. Some new quasi-twisted ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2015-09-01

    Full Text Available Let an [n, k, d]_q code be a linear code of length n, dimension k and minimum Hamming distance d over GF(q). One of the basic and most important problems in coding theory is to construct codes with the best possible minimum distances. In this paper seven quasi-twisted ternary linear codes are constructed. These codes are new and improve the best known lower bounds on the minimum distance in [6].
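
    The quantity d being optimized can be computed directly for small codes by enumerating all nonzero codewords. The sketch below does this for a small ternary generator matrix (the [4,2,3] tetracode, chosen here only as an example; it is not one of the seven new codes).

    ```python
    import numpy as np
    from itertools import product

    # Minimum Hamming distance of a small ternary linear code, by exhaustive enumeration.
    G = np.array([[1, 0, 1, 1],
                  [0, 1, 1, 2]])      # generator matrix of the [4, 2, 3]_3 tetracode
    q, k, n = 3, *G.shape

    def min_distance(G, q):
        """For a linear code, d equals the smallest Hamming weight of a nonzero codeword."""
        d = G.shape[1]
        for msg in product(range(q), repeat=G.shape[0]):
            if any(msg):
                cw = np.mod(np.array(msg) @ G, q)
                d = min(d, int(np.count_nonzero(cw)))
        return d

    print("[n, k, d]_q =", f"[{n}, {k}, {min_distance(G, q)}]_{q}")
    ```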

  10. Vectorization of nuclear codes 90-1

    International Nuclear Information System (INIS)

    Nonomiya, Iwao; Nemoto, Toshiyuki; Ishiguro, Misako; Harada, Hiroo; Hori, Takeo.

    1990-09-01

    Vectorization has been carried out for four codes: SONATINA-2V HTTR version, TRIDOSE, VIENUS, and SCRYU. SONATINA-2V HTTR version is a code for analyzing the dynamic behavior of fuel blocks in the vertical slice of the HTGR (High Temperature Gas-cooled Reactor) core under seismic perturbation, TRIDOSE is a code for calculating environmental tritium concentration and dose, VIENUS is a code for analyzing the viscoelastic stress of the fuel block of the HTTR (High Temperature Gas-cooled Test Reactor), and SCRYU is a thermal-hydraulics code with a boundary-fitted coordinate system. The total speedup ratio of the vectorized versions to the original scalar ones is 5.2 for the SONATINA-2V HTTR version, 5.9 to 6.9 for TRIDOSE, 6.7 for VIENUS, and 7.6 for SCRYU. In this report, we describe the outline of the codes, the techniques used for vectorization, verification of the computed results, and the speedup effect of the vectorized codes. (author)

  11. Computer Security: is your code sane?

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2015-01-01

    How many of us write code? Software? Programs? Scripts? How many of us are properly trained in this and how well do we do it? Do we write functional, clean and correct code, without flaws, bugs and vulnerabilities? In other words: are our codes sane? Figuring out weaknesses is not that easy (see our quiz in an earlier Bulletin article). Therefore, in order to improve the sanity of your code, prevent common pitfalls, and avoid the bugs and vulnerabilities that can crash your code, or – worse – that can be misused and exploited by attackers, the CERN Computer Security team has reviewed its recommendations for checking the security compliance of your code. “Static Code Analysers” are stand-alone programs that can be run on top of your software stack, regardless of whether it uses Java, C/C++, Perl, PHP, Python, etc. These analysers identify weaknesses and inconsistencies including: employing undeclared variables; expressions resu...
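
    The defining property of such analysers is that they reason about the source text without executing it. The sketch below is a deliberately tiny, illustrative analyser built on Python's ast module, nothing like a production tool: it flags names that are read but never bound, one of the weaknesses listed above.

    ```python
    import ast
    import builtins

    # Toy static analysis: parse the source without running it and report names
    # that are read but never assigned, imported, or defined (scoping rules simplified).
    SOURCE = '''
    import os                      # imported but never used -- a fuller analyser would flag this too

    def average(values):
        total = sum(values)
        return total / count       # 'count' is read but never bound anywhere
    '''

    tree = ast.parse(SOURCE)
    bound = set(dir(builtins))
    loaded = {}

    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                bound.add(node.id)
            else:
                loaded.setdefault(node.id, node.lineno)
        elif isinstance(node, ast.FunctionDef):
            bound.add(node.name)
            bound.update(a.arg for a in node.args.args)
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            bound.update((alias.asname or alias.name).split(".")[0] for alias in node.names)

    for name, line in sorted(loaded.items(), key=lambda kv: kv[1]):
        if name not in bound:
            print(f"line {line}: undefined name '{name}'")   # -> line 6: undefined name 'count'
    ```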

  12. Short-Block Protograph-Based LDPC Codes

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design highspeed iterative decoders that utilize belief- propagation algorithms.

  13. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of that 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and had inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries, have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is, thus, posed: Has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes.' And if so, how should this be accomplished?

  14. Rulemaking efforts on codes and standards

    International Nuclear Information System (INIS)

    Millman, G.C.

    1992-01-01

    Section 50.55a of the NRC regulations provides a mechanism for incorporating national codes and standards into the regulatory process. It incorporates by reference ASME Boiler and Pressure Vessel Code (ASME B and PV Code) Section III rules for construction and Section XI rules for inservice inspection and inservice testing. The regulation is periodically amended to update these references. The rulemaking process, as applied to Section 50.55a amendments, is overviewed to familiarize users with associated internal activities of the NRC staff and the manner in which public comments are integrated into the process. The four ongoing rulemaking actions that would individually amend Section 50.55a are summarized. Two of the actions would directly impact requirements for inservice testing. Benefits accrued with NRC endorsement of the ASME B and PV Code, and possible future endorsement of the ASME Operations and Maintenance Code (ASME OM Code), are identified. Emphasis is placed on the need for code writing committees to be especially sensitive to user feedback on code rules incorporated into the regulatory process to ensure that the rules are complete, technically accurate, clear, practical, and enforceable

  15. Multiplexed coding in the human basal ganglia

    Science.gov (United States)

    Andres, D. S.; Cerquetti, D.; Merello, M.

    2016-04-01

    A classic controversy in neuroscience is whether information carried by spike trains is encoded by a time-averaged measure (e.g. a rate code) or by complex time patterns (i.e. a time code). Here we apply a tool to quantitatively analyze the neural code. We make use of an algorithm based on the calculation of the temporal structure function, which permits us to distinguish which scales of a signal are dominated by a complex temporal organization or by a randomly generated process. In terms of the neural code, this kind of analysis makes it possible to detect temporal scales at which a time-pattern coding scheme or, alternatively, a rate code is present. Additionally, by finding the temporal scale at which the correlation between interspike intervals fades, the length of the basic information unit of the code can be established, and hence the word length of the code can be found. We apply this algorithm to neuronal recordings obtained from the globus pallidus pars interna of a human patient with Parkinson’s disease, and show that a time-pattern coding scheme and a rate coding scheme co-exist at different temporal scales, offering a new example of multiplexed neuronal coding.
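
    One common definition of the temporal structure function is S(τ) = ⟨|x(t+τ) − x(t)|²⟩ computed over lags τ; the algorithm used in the paper may differ in detail. The sketch below applies this definition to a toy interspike-interval sequence and to a shuffled surrogate: at lags where the two curves separate, the signal is dominated by temporal organization, while at lags where they coincide it is indistinguishable from a random process.

    ```python
    import numpy as np

    def structure_function(x, max_lag, order=2):
        """S(tau) = <|x(t+tau) - x(t)|^order> for lags 1..max_lag."""
        x = np.asarray(x, dtype=float)
        return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** order)
                         for lag in range(1, max_lag + 1)])

    rng = np.random.default_rng(2)
    # Toy interspike-interval train: slow oscillatory modulation (structure) plus jitter (noise).
    t = np.arange(2000)
    isi = 10.0 + 2.0 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.5, t.size)

    s_data = structure_function(isi, max_lag=100)
    s_shuffled = structure_function(rng.permutation(isi), max_lag=100)

    # Lags where the data's structure function departs from the shuffled surrogate are
    # dominated by temporal organization; where they coincide, the signal looks random.
    print("lag 5 :", s_data[4], "vs shuffled", s_shuffled[4])
    print("lag 25:", s_data[24], "vs shuffled", s_shuffled[24])
    ```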

  16. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    Science.gov (United States)

    Deng, Xiaopeng

    2015-08-01

    A novel optical image encryption scheme based on real-valued coding and subtracting is proposed with the help of a quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, and then the QR code is encoded into two phase-only masks (POMs) by using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone-collected results show that the method is feasible and has strong tolerance to noise, phase difference and the ratio between intensities of the two decryption light beams.

  17. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    Science.gov (United States)

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  18. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms, along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding; address arithmetic, BCH, and Reed-Solomon codes; and explore more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
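
    As a minimal illustration of the source-coding fundamentals such a text covers, the sketch below computes the Shannon entropy of a discrete source, which lower-bounds the average length of any uniquely decodable code; the symbol probabilities and Huffman codeword lengths are made up for the example.

        import math

        def entropy(probs):
            """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # Hypothetical four-symbol source with dyadic probabilities
        probs = [0.5, 0.25, 0.125, 0.125]
        print(f"H(X) = {entropy(probs):.3f} bits/symbol")              # 1.750

        # A Huffman code for these probabilities has codeword lengths 1, 2, 3, 3
        # and meets the entropy bound exactly.
        avg_len = sum(p * l for p, l in zip(probs, [1, 2, 3, 3]))
        print(f"Average codeword length = {avg_len:.3f} bits/symbol")  # 1.750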

  19. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    Science.gov (United States)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in an underdeveloped and densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life. A revision of BNBC 1993 is underway to bring it up to date with other international building codes. This paper compares the seismic analysis provisions given in the building codes of different countries. The comparison gives an idea of where our country stands with respect to safety against earthquakes. Primarily, the seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Both the 1993 and 2010 editions of BNBC have then been compared graphically with building codes of other countries, namely the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers standard ASCE 7-05. The base shear/weight ratios have been plotted against the height of the building. The investigation reveals that BNBC 1993 has the lowest base shear among all the codes considered. Factored base shear values of BNBC 2010 are found to have increased significantly over those of BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still gives lower base shear values than the Indian and American codes. Therefore, the increase in the factor of safety against earthquakes that the proposed BNBC 2010 achieves by specifying higher base shear values is a notable improvement.
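
    The base shear/weight comparison described in the abstract can be reproduced in outline with a short script. The sketch below uses a generic equivalent-static formulation in the style of ASCE 7-05 (Cs = S_DS/(R/I_e), capped by S_D1/(T*(R/I_e))) together with an approximate period-height relation; all numeric values here are assumptions for illustration, and the actual coefficients of BNBC 1993, BNBC 2010, NBC-India 2005, and ASCE 7-05 must be taken from the codes themselves.

        def base_shear_ratio(height_m, s_ds=0.5, s_d1=0.2, r=5.0, i_e=1.0,
                             ct=0.0466, x=0.9):
            """Illustrative V/W (seismic response coefficient) versus height,
            loosely following an equivalent lateral force procedure.
            All default parameter values are assumed, not code-specific."""
            t = ct * height_m ** x                  # approximate fundamental period (s)
            cs = s_ds / (r / i_e)                   # short-period plateau
            cs_cap = s_d1 / (t * (r / i_e))         # long-period reduction
            return max(min(cs, cs_cap), 0.01)       # nominal lower bound

        for h in (10, 20, 40, 80):
            print(f"height {h:>3} m  ->  V/W = {base_shear_ratio(h):.3f}")

    Plotting such V/W curves with each code's own coefficients, as the paper does, makes the relative conservatism of the base shear provisions directly visible.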

  20. Comparison of design margin for core shroud in between design and construction code and fitness-for-service code

    International Nuclear Information System (INIS)

    Dozaki, Koji

    2007-01-01

    Structural design methods for the core shroud of a BWR are specified in the JSME Design and Construction Code, like ASME Boiler and Pressure Vessel Code Sec. III, as part of the core support structure rules. Design margins are defined according to the combination of the structural design method selected and the service limit considered. Basically, the margins in the JSME Code were determined after ASME Sec. III. Among those design methods, designers can select the so-called twice-slope method for core shroud design. On the other hand, flaw evaluation rules for the core shroud have been established in the JSME Fitness-for-Service Code. The twice-slope method is also adopted for fracture evaluation in that code, even when the core shroud contains a flaw. There, design margins were determined as structural factors, separately from the Design and Construction Code. As a natural consequence, the design margins differ between the two codes. In this paper, it is shown by experimental evidence that the design margin in the Fitness-for-Service Code is conservative, and a comparison of the design margins between the two codes is discussed. (author)
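
    For context, the twice-slope (twice-elastic-slope) construction referred to above is commonly described as follows: the initial elastic slope of the load-deflection curve is determined, a collapse limit line with half that slope (i.e. twice the elastic deflection at a given load) is drawn from the origin, and the collapse load is taken at its intersection with the measured curve; the design margin is then a structural factor applied to that collapse load. The Python sketch below implements this construction on synthetic data and is an illustration only, not the JSME or ASME code procedure verbatim; the elastic-slope estimate and the test curve are assumptions.

        import numpy as np

        def twice_slope_collapse_load(deflection, load, elastic_points=10):
            """Illustrative twice-elastic-slope construction (assumed form)."""
            d = np.asarray(deflection, dtype=float)
            p = np.asarray(load, dtype=float)
            k = np.polyfit(d[:elastic_points], p[:elastic_points], 1)[0]  # elastic slope
            g = p - 0.5 * k * d                      # curve minus collapse limit line
            idx = np.argmax(g < 0)                   # first point below the limit line
            if idx == 0:
                return None                          # no crossing within the data
            t = g[idx - 1] / (g[idx - 1] - g[idx])   # interpolate between bracketing points
            return p[idx - 1] + t * (p[idx] - p[idx - 1])

        # Synthetic load-deflection curve: initial slope ~1000, plateau near 100
        d = np.linspace(0.0, 0.3, 200)
        p = 100.0 * (1.0 - np.exp(-1000.0 * d / 100.0))
        print(f"twice-slope collapse load = {twice_slope_collapse_load(d, p):.1f}")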