WorldWideScience

Sample records for highly conserved coding

  1. Highly conserved non-coding sequences are associated with vertebrate development.

    Directory of Open Access Journals (Sweden)

    Adam Woolfe

    2005-01-01

    Full Text Available. In addition to protein-coding sequence, the human genome contains a significant amount of regulatory DNA, the identification of which is proving somewhat recalcitrant to both in silico and functional methods. An approach that has been used with some success is comparative sequence analysis, whereby equivalent genomic regions from different organisms are compared in order to identify both similarities and differences. In general, similarities in sequence between highly divergent organisms imply functional constraint. We have used a whole-genome comparison between humans and the pufferfish, Fugu rubripes, to identify nearly 1,400 highly conserved non-coding sequences. Given the evolutionary divergence between these species, it is likely that these sequences are found in, and furthermore are essential to, all vertebrates. Most, and possibly all, of these sequences are located in and around genes that act as developmental regulators. Some of these sequences are over 90% identical across more than 500 bases, being more highly conserved than coding sequence between these two species. Despite this, we cannot find any similar sequences in invertebrate genomes. In order to begin to functionally test this set of sequences, we have used a rapid in vivo assay system using zebrafish embryos that allows tissue-specific enhancer activity to be identified. Functional data are presented for highly conserved non-coding sequences associated with four unrelated developmental regulators (SOX21, PAX6, HLXB9 and SHH), in order to demonstrate the suitability of this screen to a wide range of genes and expression patterns. Of 25 sequence elements tested around these four genes, 23 show significant enhancer activity in one or more tissues. We have identified a set of non-coding sequences that are highly conserved throughout vertebrates. They are found in clusters across the human genome, principally around genes that are implicated in the regulation of development.

  2. Building Standards and Codes for Energy Conservation

    Science.gov (United States)

    Gross, James G.; Pierlert, James H.

    1977-01-01

    Current activity intended to lead to energy conservation measures in building codes and standards is reviewed by members of the Office of Building Standards and Codes Services of the National Bureau of Standards. For journal availability see HE 508 931. (LBH)

  3. Conservation of concrete structures according to fib Model Code 2010

    NARCIS (Netherlands)

    Matthews, S.; Bigaj-Van Vliet, A.; Ueda, T.

    2013-01-01

    Conservation of concrete structures forms an essential part of the fib Model Code for Concrete Structures 2010 (fib Model Code 2010). In particular, Chapter 9 of fib Model Code 2010 addresses issues concerning conservation strategies and tactics, conservation management, condition surveys, condition

  4. Highly conserved non-coding elements on either side of SOX9 associated with Pierre Robin sequence.

    Science.gov (United States)

    Benko, Sabina; Fantes, Judy A; Amiel, Jeanne; Kleinjan, Dirk-Jan; Thomas, Sophie; Ramsay, Jacqueline; Jamshidi, Negar; Essafi, Abdelkader; Heaney, Simon; Gordon, Christopher T; McBride, David; Golzio, Christelle; Fisher, Malcolm; Perry, Paul; Abadie, Véronique; Ayuso, Carmen; Holder-Espinasse, Muriel; Kilpatrick, Nicky; Lees, Melissa M; Picard, Arnaud; Temple, I Karen; Thomas, Paul; Vazquez, Marie-Paule; Vekemans, Michel; Roest Crollius, Hugues; Hastie, Nicholas D; Munnich, Arnold; Etchevers, Heather C; Pelet, Anna; Farlie, Peter G; Fitzpatrick, David R; Lyonnet, Stanislas

    2009-03-01

    Pierre Robin sequence (PRS) is an important subgroup of cleft palate. We report several lines of evidence for the existence of a 17q24 locus underlying PRS, including linkage analysis results, a clustering of translocation breakpoints 1.06-1.23 Mb upstream of SOX9, and microdeletions both approximately 1.5 Mb centromeric and approximately 1.5 Mb telomeric of SOX9. We have also identified a heterozygous point mutation in an evolutionarily conserved region of DNA with in vitro and in vivo features of a developmental enhancer. This enhancer is centromeric to the breakpoint cluster and maps within one of the microdeletion regions. The mutation abrogates the in vitro enhancer function and alters binding of the transcription factor MSX1 as compared to the wild-type sequence. In the developing mouse mandible, the 3-Mb region bounded by the microdeletions shows a regionally specific chromatin decompaction in cells expressing Sox9. Some cases of PRS may thus result from developmental misexpression of SOX9 due to disruption of very-long-range cis-regulatory elements.

  5. A Very Fast and Angular Momentum Conserving Tree Code

    International Nuclear Information System (INIS)

    Marcello, Dominic C.

    2017-01-01

    There are many methods used to compute the classical gravitational field in astrophysical simulation codes. With the exception of the typically impractical method of direct computation, none ensure conservation of angular momentum to machine precision. Under uniform time-stepping, the Cartesian fast multipole method of Dehnen (also known as the very fast tree code) conserves linear momentum to machine precision. We show that it is possible to modify this method in a way that conserves both angular and linear momenta.

  6. A Very Fast and Angular Momentum Conserving Tree Code

    Energy Technology Data Exchange (ETDEWEB)

    Marcello, Dominic C., E-mail: dmarce504@gmail.com [Department of Physics and Astronomy, and Center for Computation and Technology Louisiana State University, Baton Rouge, LA 70803 (United States)

    2017-09-01

    There are many methods used to compute the classical gravitational field in astrophysical simulation codes. With the exception of the typically impractical method of direct computation, none ensure conservation of angular momentum to machine precision. Under uniform time-stepping, the Cartesian fast multipole method of Dehnen (also known as the very fast tree code) conserves linear momentum to machine precision. We show that it is possible to modify this method in a way that conserves both angular and linear momenta.
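    The conservation property discussed in records 5 and 6 can be illustrated with a toy direct-sum integrator (this is not Dehnen's multipole method; all names and parameter values below are illustrative): because every pairwise force has an exactly antisymmetric counterpart, the per-step change in total linear momentum cancels to floating-point round-off.

```python
import numpy as np

# Illustrative sketch only: softened direct-sum gravity with G = 1.
# The force on particle i from j is exactly minus the force on j from i
# (Newton's third law), so the total linear momentum change per step
# cancels to machine precision.
rng = np.random.default_rng(0)
n = 64
pos = rng.normal(size=(n, 3))
vel = 0.1 * rng.normal(size=(n, 3))
mass = rng.uniform(0.5, 2.0, size=n)

def accelerations(pos, mass, eps=0.1):
    d = pos[None, :, :] - pos[:, None, :]          # d[i, j] = r_j - r_i
    r2 = (d ** 2).sum(axis=-1) + eps ** 2          # softened squared distance
    np.fill_diagonal(r2, np.inf)                   # no self-interaction
    return (mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

dt = 1e-3
p0 = (mass[:, None] * vel).sum(axis=0)             # total momentum before
vel = vel + dt * accelerations(pos, mass)
p1 = (mass[:, None] * vel).sum(axis=0)             # total momentum after
# |p1 - p0| sits at the level of floating-point round-off
```

Angular momentum, by contrast, is not automatically conserved by tree and multipole approximations, which is the gap the modified method in these records addresses.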

  7. High Energy Transport Code HETC

    International Nuclear Information System (INIS)

    Gabriel, T.A.

    1985-09-01

    The physics contained in the High Energy Transport Code (HETC), in particular the collision models, are discussed. An application using HETC as part of the CALOR code system is also given. 19 refs., 5 figs., 3 tabs

  8. Genome-wide identification of coding and non-coding conserved sequence tags in human and mouse genomes

    Directory of Open Access Journals (Sweden)

    Maggi Giorgio P

    2008-06-01

    Full Text Available. Abstract. Background: The accurate detection of genes and the identification of functional regions is still an open issue in the annotation of genomic sequences. This problem affects new genomes but also those of very well studied organisms such as human and mouse where, despite the great efforts, the inventory of genes and regulatory regions is far from complete. Comparative genomics is an effective approach to address this problem. Unfortunately it is limited by the computational requirements needed to perform genome-wide comparisons and by the problem of discriminating between conserved coding and non-coding sequences. This discrimination is often based on (and thus dependent on) the availability of annotated proteins. Results: In this paper we present the results of a comprehensive comparison of human and mouse genomes performed with a new high-throughput grid-based system which allows the rapid detection of conserved sequences and accurate assessment of their coding potential. By detecting clusters of coding conserved sequences, the system is also suitable for accurately identifying potential gene loci. Following this analysis we created a collection of human-mouse conserved sequence tags and carefully compared our results to reliable annotations in order to benchmark the reliability of our classifications. Strikingly, we were able to detect several potential gene loci supported by EST sequences but not corresponding to as yet annotated genes. Conclusion: Here we present a new system which allows comprehensive comparison of genomes to detect conserved coding and non-coding sequences and to identify potential gene loci. Our system does not require the availability of any annotated sequence and is thus suitable for the analysis of new or poorly annotated genomes.
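    As a toy illustration of what a "conserved sequence tag" means operationally (this is not the grid-based whole-genome system described above; window size and threshold are illustrative), the sketch below scans two pre-aligned sequences for windows whose identity exceeds a threshold and merges overlapping hits into tags:

```python
def conserved_tags(seq_a, seq_b, window=20, min_identity=0.9):
    """Toy sketch: report (start, end) segments of a pairwise alignment
    where every length-`window` window has identity >= `min_identity`,
    merging overlapping or adjacent windows into maximal tags."""
    assert len(seq_a) == len(seq_b)
    match = [a == b for a, b in zip(seq_a, seq_b)]
    hits = [i for i in range(len(match) - window + 1)
            if sum(match[i:i + window]) / window >= min_identity]
    tags, start = [], None
    for i in hits:
        if start is None:
            start, end = i, i + window
        elif i <= end:                      # overlaps or abuts current tag
            end = i + window
        else:
            tags.append((start, end))
            start, end = i, i + window
    if start is not None:
        tags.append((start, end))
    return tags

# identical first half, divergent second half -> one tag over the left block
tags = conserved_tags("A" * 50 + "C" * 50, "A" * 50 + "G" * 50)
```

Real pipelines additionally score coding potential (e.g. codon-bias statistics) to split such tags into coding and non-coding classes.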

  9. Properties of Sequence Conservation in Upstream Regulatory and Protein Coding Sequences among Paralogs in Arabidopsis thaliana

    Science.gov (United States)

    Richardson, Dale N.; Wiehe, Thomas

    Whole genome duplication (WGD) has catalyzed the formation of new species and of genes with novel functions, altered expression patterns, more complex signaling pathways, and a level of genetic robustness. We studied the long-term evolution and interrelationships of 5' upstream regulatory sequences (URSs), protein coding sequences (CDSs) and expression correlations (EC) of duplicated gene pairs in Arabidopsis. Three distinct methods revealed significant evolutionary conservation between paralogous URSs that was highly correlated with microarray-based expression correlation of the respective gene pairs. Positional information on exact matches between sequences unveiled the contribution of micro-chromosomal rearrangements to expression divergence. A three-way rank analysis of URS similarity, CDS divergence and EC uncovered specific gene functional biases. Transcription factor activity was associated with gene pairs exhibiting conserved URSs and divergent CDSs, whereas a broad array of metabolic enzymes was found to be associated with gene pairs showing diverged URSs but conserved CDSs.

  10. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
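    The second lifting stage described above, expanding a protograph via circulant matrices, can be sketched as follows (the base matrix and lift size Z below are toy values, not the inventors' actual designs): each base-matrix entry becomes a Z x Z block, either all-zero or a cyclically shifted identity (a circulant permutation matrix).

```python
import numpy as np

def lift_protograph(base, Z):
    """Hedged sketch of circulant lifting: entry -1 maps to the Z x Z
    zero block; entry s >= 0 maps to the identity cyclically shifted
    by s columns. The result is the lifted parity-check matrix H."""
    m, n = len(base), len(base[0])
    H = np.zeros((m * Z, n * Z), dtype=np.uint8)
    I = np.eye(Z, dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            s = base[i][j]
            if s >= 0:
                H[i * Z:(i + 1) * Z, j * Z:(j + 1) * Z] = np.roll(I, s, axis=1)
    return H

# a 2 x 4 toy base matrix lifted by Z = 4 yields an 8 x 16 parity-check matrix
H = lift_protograph([[0, 1, -1, 2], [3, -1, 0, 1]], Z=4)
```

In the two-stage scheme of this record, a small random-style lifting to an intermediate protograph precedes this circulant step, which sets the final codeword length.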

  11. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it also serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  12. High dynamic range coding imaging system

    Science.gov (United States)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme can help us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns, and then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images into an HDR image for display. We build an optical simulation model and generate simulation images to verify the novel system.
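    The multi-exposure fusion step can be illustrated with a minimal radiance-domain sketch (assuming a Debevec-style hat weighting; the abstract does not specify the authors' actual fusion algorithm, and the linear camera response below is an assumption): each LDR frame is mapped back to radiance by dividing by its exposure time, and under- or over-exposed pixels are discounted.

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Hedged sketch of HDR fusion from LDR frames with pixel values
    in [0, 1]: weighted average of per-frame radiance estimates, with a
    hat weight that peaks at mid-gray and vanishes at 0 and 1."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: 0 at clipping, 1 at 0.5
        num += w * img / t                  # radiance estimate from this frame
        den += w
    return num / np.maximum(den, 1e-8)

# two-pixel scene: the bright pixel saturates in the long exposure, so only
# the short exposure contributes there, recovering the true radiances
recovered = fuse_exposures([np.array([0.2, 1.0]), np.array([0.05, 0.5])],
                           [1.0, 0.25])
```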

  13. Non-coding RNAs enter mitosis: functions, conservation and implications

    OpenAIRE

    Pek, Jun Wei; Kai, Toshie

    2011-01-01

    Abstract Nuage (or commonly known as chromatoid body in mammals) is a conserved germline-specific organelle that has been linked to the Piwi-interacting RNA (piRNA) pathway. piRNAs are a class of gonadal-specific RNAs that are ~23-29 nucleotides in length and protect genome stability by repressing the expression of deleterious retrotransposons. More recent studies in Drosophila have implicated the piRNA pathway in other functions including canalization of embryonic development, regulation of ...

  14. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. The subject of regulation of new building construction to assure energy conservation is recognized as one in which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state-of-the-art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy efficient standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  15. Guide to the Changes between the 2009 and 2012 International Energy Conservation Code

    Energy Technology Data Exchange (ETDEWEB)

    Mapes, Terry S.; Conover, David R.

    2012-05-31

    The International Code Council (ICC) published the 2012 International Energy Conservation Code® (IECC) in early 2012. The 2012 IECC is based on revisions, additions, and deletions to the 2009 IECC that were considered during the ICC code development process conducted in 2011. Solid vertical lines, arrows, or asterisks printed in the 2012 IECC indicate where revisions, deletions, or relocations of text, respectively, were made relative to the 2009 IECC. Although these marginal markings indicate where changes have been made to the code, they do not provide any further guidance, leaving the reader to consult and compare the 2009 and 2012 IECC for more detail.

  16. Non-coding RNAs enter mitosis: functions, conservation and implications

    Directory of Open Access Journals (Sweden)

    Kai Toshie

    2011-02-01

    Full Text Available. Abstract: Nuage (commonly known as the chromatoid body in mammals) is a conserved germline-specific organelle that has been linked to the Piwi-interacting RNA (piRNA) pathway. piRNAs are a class of gonadal-specific RNAs that are ~23-29 nucleotides in length and protect genome stability by repressing the expression of deleterious retrotransposons. More recent studies in Drosophila have implicated the piRNA pathway in other functions including canalization of embryonic development, regulation of maternal gene expression and telomere protection. We have recently shown that Vasa (known as Mouse Vasa Homolog in mouse), a nuage component, plays a mitotic role in promoting chromosome condensation and segregation by facilitating robust chromosomal localization of condensin I in the Drosophila germline. Vasa functions together with Aubergine (a PIWI family protein) and Spindle-E/mouse TDRD-9, two other nuage components that are involved in the piRNA pathway, therefore providing a link between the piRNA pathway and mitotic chromosome condensation. Here, we propose and discuss possible models for the role of Vasa and the piRNA pathway during mitosis. We also highlight relevant studies implicating mitotic roles for RNAs and/or nuage in other model systems and their implications for cancer development.

  17. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  18. Computer code validation by high temperature chemistry

    International Nuclear Information System (INIS)

    Alexander, C.A.; Ogden, J.S.

    1988-01-01

    At least five of the computer codes utilized in the analysis of severe fuel damage-type events are directly dependent upon, or can be verified by, high temperature chemistry. These codes are ORIGEN, CORSOR, CORCON, VICTORIA, and VANESA. With the exception of CORCON and VANESA, it is necessary that verification experiments be performed on real irradiated fuel. For ORIGEN, the familiar Knudsen effusion cell is the best choice: a small piece of known mass and known burn-up is selected and volatilized completely into the mass spectrometer. The mass spectrometer is used in the integral mode to integrate the entire signal from preselected radionuclides, and from this integrated signal the total mass of the respective nuclides can be determined. For CORSOR and VICTORIA, flowing high-pressure hydrogen/steam must pass over the irradiated fuel and then enter the mass spectrometer. For these experiments, a high-pressure, high-temperature molecular beam inlet must be employed. Finally, in support of VANESA-CORCON, the very highest temperature and molten fuels must be contained and analyzed. Results from all types of experiments will be discussed and their applicability to present and future code development will also be covered.

  19. Comparison of energy conservation building codes of Iran, Turkey, Germany, China, ISO 9164 and EN 832

    International Nuclear Information System (INIS)

    Fayaz, Rima; Kari, Behrouz M.

    2009-01-01

    To improve the energy efficiency of buildings via compliance with regulation in Iran, Code No. 19 was devised in 1991. The code lacks high-level aims and objectives addressing the characteristics of Iranian buildings. As a consequence, although the code has been revised, it is not completely implemented in practice and remains inefficient. As with any energy code, it has to identify the right balance between the different energy variables for the Iranian climate and way of life. In order to assist improvements to the high-level objectives of Code 19, this code is compared with ISO 9164, EN 832, the German regulation, TS 825 of Turkey and China's GB 50189 to understand how these have adapted international standards to national features. In order to test the appropriateness of Code 19, five case study buildings in Iran are assessed against Code 19 as well as the Turkish standard TS 825 and the results are compared. The results demonstrate that Code 19 is efficient in calculations of the building envelope, but it needs improvements in the areas of ventilation and gains from internal and solar sources. The paper concludes by offering suggestions for improving the code.

  20. Effects of using coding potential, sequence conservation and mRNA structure conservation for predicting pyrrolysine-containing genes

    DEFF Research Database (Denmark)

    Have, Christian Theil; Zambach, Sine; Christiansen, Henning

    2013-01-01

    … for prediction of pyrrolysine-incorporating genes in genomes of bacteria and archaea, leading to insights about the factors driving pyrrolysine translation and identification of new gene candidates. The method predicts known conserved genes with high recall and predicts several other promising candidates … for experimental verification. The method is implemented as a computational pipeline which is available on request.

  1. Applicability evaluation on the conservative metal-water reaction(MWR) model implemented into the SPACE code

    International Nuclear Information System (INIS)

    Lee, Suk Ho; You, Sung Chang; Kim, Han Gon

    2011-01-01

    The SBLOCA (Small Break Loss-of-Coolant Accident) evaluation methodology for the APR1400 (Advanced Power Reactor 1400) is under development using the SPACE code. The goal of this development is to establish a conservative evaluation methodology in accordance with Appendix K of 10CFR50 by the end of 2012. In order to develop the Appendix K version of the SPACE code, the code is being modified to implement the required evaluation models. Among the conservative models required in the SPACE code, the metal-water reaction (MWR) model, the critical flow model, the Critical Heat Flux (CHF) model and the post-CHF model must be implemented. At present, the integration of these models to generate the Appendix K version of SPACE is in its preliminary stage. This paper introduces the conservative MWR model and its applicability to the code.

  2. High burnup models in computer code fair

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, B K; Swami Prasad, P; Kushwaha, H S; Mahajan, S C; Kakodar, A [Bhabha Atomic Research Centre, Bombay (India)

    1997-08-01

    An advanced fuel analysis code, FAIR, has been developed for analyzing the behavior of fuel rods of water-cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins with both collapsible clad, as in PHWRs, and free-standing clad, as in LWRs. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling fission gas release, three different models are implemented: a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled using the RADAR model. For modelling pellet-clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, necessary routines are provided in FAIR. The validation of the code is based on the analysis of fuel rods from the EPRI project ``Light water reactor fuel rod modelling code evaluation`` and on the analytical simulation of threshold power ramp criteria for fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. These case studies demonstrate the satisfactory performance of FAIR. (author). 12 refs, 5 figs.

  3. High burnup models in computer code fair

    International Nuclear Information System (INIS)

    Dutta, B.K.; Swami Prasad, P.; Kushwaha, H.S.; Mahajan, S.C.; Kakodar, A.

    1997-01-01

    An advanced fuel analysis code, FAIR, has been developed for analyzing the behavior of fuel rods of water-cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins with both collapsible clad, as in PHWRs, and free-standing clad, as in LWRs. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling fission gas release, three different models are implemented: a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled using the RADAR model. For modelling pellet-clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, necessary routines are provided in FAIR. The validation of the code is based on the analysis of fuel rods from the EPRI project ''Light water reactor fuel rod modelling code evaluation'' and on the analytical simulation of threshold power ramp criteria for fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. These case studies demonstrate the satisfactory performance of FAIR. (author). 12 refs, 5 figs.

  4. Redefining Secondary Forests in the Mexican Forest Code: Implications for Management, Restoration, and Conservation

    Directory of Open Access Journals (Sweden)

    Francisco J. Román-Dañobeytia

    2014-05-01

    Full Text Available. The Mexican Forest Code establishes structural reference values to differentiate between secondary and old-growth forests and requires a management plan when secondary forests become old-growth and potentially harvestable forests. The implications of this regulation for forest management, restoration, and conservation were assessed in the context of the Calakmul Biosphere Reserve, which is located in the Yucatan Peninsula. The basal area and stem density thresholds currently used by the legislation to differentiate old-growth from secondary forests are 4 m2/ha and 15 trees/ha (trees with a diameter at breast height of >25 cm); however, our research indicates that these values should be increased to 20 m2/ha and 100 trees/ha, respectively. Given that a management plan is required when secondary forests become old-growth forests, many landowners avoid forest-stand development by engaging in slash-and-burn agriculture or cattle grazing. We present evidence that deforestation and land degradation may prevent the natural regeneration of late-successional tree species of high ecological and economic importance. Moreover, we discuss the results of this study in the light of an ongoing debate in the Yucatan Peninsula between policy makers, non-governmental organizations (NGOs), landowners and researchers, regarding the modification of this regulation to redefine the concept of acahual (secondary forest) and to facilitate forest management and restoration with valuable timber tree species.
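    The threshold rule at issue reduces to a two-condition check. In the sketch below, the defaults are the values the study recommends, while passing the current legal values (4 m2/ha and 15 trees/ha, for trees with DBH > 25 cm) reproduces the existing regulation:

```python
def is_old_growth(basal_area_m2_ha, stems_per_ha,
                  min_basal_area=20.0, min_stems=100):
    """A stand counts as old-growth when it meets BOTH structural
    thresholds. Defaults are the values recommended by the study;
    the current legal thresholds are 4 m2/ha and 15 trees/ha."""
    return basal_area_m2_ha >= min_basal_area and stems_per_ha >= min_stems

# a stand with 12 m2/ha and 60 stems/ha: old-growth under current law,
# still secondary under the recommended thresholds
current_law = is_old_growth(12.0, 60, min_basal_area=4.0, min_stems=15)
recommended = is_old_growth(12.0, 60)
```

The gap between the two classifications is exactly the set of stands whose owners face a new management-plan obligation under current law but not under the proposed thresholds.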

  5. Conserved syntenic clusters of protein coding genes are missing in birds.

    Science.gov (United States)

    Lovell, Peter V; Wirthlin, Morgan; Wilhelm, Larry; Minx, Patrick; Lazar, Nathan H; Carbone, Lucia; Warren, Wesley C; Mello, Claudio V

    2014-01-01

    Birds are one of the most highly successful and diverse groups of vertebrates, having evolved a number of distinct characteristics, including feathers and wings, a sturdy lightweight skeleton and unique respiratory and urinary/excretion systems. However, the genetic basis of these traits is poorly understood. Using comparative genomics based on extensive searches of 60 avian genomes, we have found that birds lack approximately 274 protein-coding genes that are present in the genomes of most vertebrate lineages and are for the most part organized in conserved syntenic clusters in non-avian sauropsids and in humans. These genes are located in regions associated with chromosomal rearrangements, and are largely present in crocodiles, suggesting that their loss occurred subsequent to the split of dinosaurs/birds from crocodilians. Many of these genes are associated with lethality in rodents, human genetic disorders, or biological functions targeting various tissues. Functional enrichment analysis combined with orthogroup analysis and paralog searches revealed enrichments that were shared by non-avian species, present only in birds, or shared between all species. Together these results provide a clearer definition of the genetic background of extant birds, extend the findings of previous studies on missing avian genes, and provide clues about molecular events that shaped avian evolution. They also have implications for fields that largely benefit from avian studies, including development, immune system, oncogenesis, and brain function and cognition. With regard to the missing genes, birds can be considered ‘natural knockouts’ that may become invaluable model organisms for several human diseases.

  6. Ultra-high resolution coded wavefront sensor

    KAUST Repository

    Wang, Congli

    2017-06-08

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  7. Evolutionary growth process of highly conserved sequences in vertebrate genomes.

    Science.gov (United States)

    Ishibashi, Minaka; Noda, Akiko Ogura; Sakate, Ryuichi; Imanishi, Tadashi

    2012-08-01

    Genome sequence comparison between evolutionarily distant species revealed ultraconserved elements (UCEs) among mammals under strong purifying selection. Most of them were also conserved among vertebrates. Because they tend to be located in the flanking regions of developmental genes, they would have fundamental roles in creating vertebrate body plans. However, the evolutionary origin and selection mechanism of these UCEs remain unclear. Here we report that UCEs arose in primitive vertebrates and gradually grew during vertebrate evolution. We searched for UCEs in two teleost fishes, Tetraodon nigroviridis and Oryzias latipes, and found 554 UCEs with 100% identity over 100 bp. Comparison of teleost and mammalian UCEs revealed 43 pairs of common, jawed-vertebrate UCEs (jUCEs) with high sequence identities, ranging from 83.1% to 99.2%. Ten of them retain lower similarities to the Petromyzon marinus genome, and the substitution rates of four non-exonic jUCEs were reduced after the teleost-mammal divergence, suggesting that robust conservation had been acquired in the jawed vertebrate lineage. Our results indicate that prototypical UCEs originated before the divergence of jawed and jawless vertebrates and have been frozen as perfectly conserved sequences in the jawed vertebrate lineage. In addition, our comparative sequence analyses of UCEs and neighboring regions resulted in the discovery of lineage-specific conserved sequences. These were added progressively to prototypical UCEs, suggesting step-wise acquisition of novel regulatory roles. Our results indicate that conserved non-coding elements (CNEs) consist of blocks with distinct evolutionary histories, each having been frozen since a different evolutionary era along the vertebrate lineage. Copyright © 2012 Elsevier B.V. All rights reserved.
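    The UCE criterion used here, 100% identity over at least 100 bp, amounts to finding exact substrings shared by two genomes. A naive k-mer sketch is shown below (real genome-scale searches use indexed alignment tools; the window is shortened in the example only so it fits on one line):

```python
def shared_exact_blocks(seq_a, seq_b, k=100):
    """Toy sketch of the 100%-identity-over-k-bases criterion: return
    start positions in seq_b of k-mers that occur exactly in seq_a.
    A hash set of seq_a's k-mers stands in for a genome index."""
    kmers = {seq_a[i:i + k] for i in range(len(seq_a) - k + 1)}
    return [i for i in range(len(seq_b) - k + 1) if seq_b[i:i + k] in kmers]

# a 10-bp block embedded in different flanking sequence in each "genome"
hits = shared_exact_blocks("AAAA" + "ACGTACGTAC" + "TTTT",
                           "GGG" + "ACGTACGTAC" + "CCC", k=10)
```

Consecutive hit positions would then be merged into maximal perfectly conserved segments, as in the element definitions above.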

  8. Identification of evolutionarily conserved non-AUG-initiated N-terminal extensions in human coding sequences.

    LENUS (Irish Health Repository)

    Ivanov, Ivaylo P

    2011-05-01

    In eukaryotes, it is generally assumed that translation initiation occurs at the AUG codon closest to the messenger RNA 5′ cap. However, in certain cases, initiation can occur at codons differing from AUG by a single nucleotide, especially the codons CUG, UUG, GUG, ACG, AUA and AUU. While non-AUG initiation has been experimentally verified for a handful of human genes, the full extent to which this phenomenon is utilized--both for increased coding capacity and potentially also for novel regulatory mechanisms--remains unclear. To address this issue, and hence to improve the quality of existing coding sequence annotations, we developed a methodology based on phylogenetic analysis of predicted 5′ untranslated regions from orthologous genes. We use evolutionary signatures of protein-coding sequences as an indicator of translation initiation upstream of annotated coding sequences. Our search identified novel conserved potential non-AUG-initiated N-terminal extensions in 42 human genes including VANGL2, FGFR1, KCNN4, TRPV6, HDGF, CITED2, EIF4G3 and NTF3, and also affirmed the conservation of known non-AUG-initiated extensions in 17 other genes. In several instances, we have been able to obtain independent experimental evidence of the expression of non-AUG-initiated products from the previously published literature and ribosome profiling data.
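The core of such a search can be sketched as an in-frame walk upstream of the annotated AUG, flagging the near-cognate codons the abstract lists until the first in-frame stop. The published method's decisive signal, protein-coding conservation across orthologs, is omitted here, and the example sequence is illustrative.

```python
# Walk upstream of the annotated AUG, in frame, collecting near-cognate
# codons (NCCs) that could initiate an N-terminal extension; an in-frame
# stop codon bounds the possible extension.

NEAR_COGNATE = {"CUG", "UUG", "GUG", "ACG", "AUA", "AUU"}
STOPS = {"UAA", "UAG", "UGA"}

def upstream_ncc_starts(mrna, aug_pos):
    """Return (position, codon) for in-frame NCCs upstream of the
    annotated start codon, stopping at the first in-frame stop."""
    hits = []
    pos = aug_pos - 3
    while pos >= 0:
        codon = mrna[pos:pos + 3]
        if codon in STOPS:
            break
        if codon in NEAR_COGNATE:
            hits.append((pos, codon))
        pos -= 3
    return hits  # nearest-to-AUG first
```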

  9. Long non-coding RNA discovery across the genus Anopheles reveals conserved secondary structures within and beyond the Gambiae complex.

    Science.gov (United States)

    Jenkins, Adam M; Waterhouse, Robert M; Muskavitch, Marc A T

    2015-04-23

    Long non-coding RNAs (lncRNAs) have been defined as mRNA-like transcripts longer than 200 nucleotides that lack significant protein-coding potential, and many of them constitute scaffolds for ribonucleoprotein complexes with critical roles in epigenetic regulation. Various lncRNAs have been implicated in the modulation of chromatin structure, transcriptional and post-transcriptional gene regulation, and regulation of genomic stability in mammals, Caenorhabditis elegans, and Drosophila melanogaster. The purpose of this study is to identify the lncRNA landscape in the malaria vector An. gambiae and assess the evolutionary conservation of lncRNAs and their secondary structures across the Anopheles genus. Using deep RNA sequencing of multiple Anopheles gambiae life stages, we have identified 2,949 lncRNAs and more than 300 previously unannotated putative protein-coding genes. The lncRNAs exhibit differential expression profiles across life stages and adult genders. We find that across the genus Anopheles, lncRNAs display much lower sequence conservation than protein-coding genes. Additionally, we find that lncRNA secondary structure is highly conserved within the Gambiae complex, but diverges rapidly across the rest of the genus Anopheles. This study offers one of the first lncRNA secondary structure analyses in vector insects. Our description of lncRNAs in An. gambiae offers the most comprehensive genome-wide insights to date into lncRNAs in this vector mosquito, and defines a set of potential targets for the development of vector-based interventions that may further curb the human malaria burden in disease-endemic countries.
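The defining filter in the first sentence (mRNA-like transcripts longer than 200 nt lacking significant protein-coding potential) can be sketched as a toy ORF-length test. Real pipelines, presumably including this study's, also use coding-potential scores and homology evidence; the cutoff values here are illustrative.

```python
# Minimal lncRNA candidate filter: keep transcripts > 200 nt whose
# longest forward-strand ORF is shorter than a cutoff.

STOPS = {"TAA", "TAG", "TGA"}

def longest_orf(seq):
    """Length in nt of the longest ATG-to-stop ORF on the forward strand."""
    best = 0
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOPS and start is not None:
                best = max(best, i + 3 - start)
                start = None
    return best

def is_putative_lncrna(seq, min_len=200, max_orf=300):
    return len(seq) > min_len and longest_orf(seq) < max_orf
```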

  10. Conservation and losses of non-coding RNAs in avian genomes.

    Directory of Open Access Journals (Sweden)

    Paul P Gardner

    Here we present the results of a large-scale bioinformatics annotation of non-coding RNA loci in 48 avian genomes. Our approach uses probabilistic models of hand-curated families from the Rfam database to infer conserved RNA families within each avian genome. We supplement these annotations with predictions from the tRNA annotation tool tRNAscan-SE and microRNAs from miRBase. We identify 34 lncRNA-associated loci that are conserved between birds and mammals and validate 12 of these in chicken. We report several intriguing cases where a reported mammalian lncRNA, but not its function, is conserved. We also demonstrate extensive conservation of classical ncRNAs (e.g., tRNAs) and more recently discovered ncRNAs (e.g., snoRNAs and miRNAs) in birds. Furthermore, we describe numerous "losses" of several RNA families, and attribute these to either genuine loss, divergence or missing data. In particular, we show that many of these losses are due to the challenges associated with assembling avian microchromosomes. These combined results illustrate the utility of applying homology-based methods for annotating novel vertebrate genomes.

  11. An evolutionary model for protein-coding regions with conserved RNA structure

    DEFF Research Database (Denmark)

    Pedersen, Jakob Skou; Forsberg, Roald; Meyer, Irmtraud Margret

    2004-01-01

    Here we present a model of nucleotide substitution in protein-coding regions that also encode the formation of conserved RNA structures. In such regions, apparent evolutionary context dependencies exist, both between nucleotides occupying the same codon and between nucleotides forming a base pair in the RNA structure. The overlap of these fundamental dependencies is sufficient to cause "contagious" context dependencies which cascade across many nucleotide sites. Such large-scale dependencies challenge the use of traditional phylogenetic models in evolutionary inference because they explicitly assume ... components of traditional phylogenetic models. We applied this to a data set of full-genome sequences from the hepatitis C virus where five RNA structures are mapped within the coding region. This allowed us to partition the effects of selection on different structural elements and to test various hypotheses ...

  12. Energy conservation: policy issues and end-use scenarios of savings potential. Part V. Energy efficient buildings: the causes of litigation against energy conservation building codes

    Energy Technology Data Exchange (ETDEWEB)

    Benenson, P.; Codina, R.; Cornwall, B.

    1978-09-01

    The guidelines laid out for the five subjects investigated in this series are to take a holistic view of energy conservation policies by describing the overall system in which they are implemented; provide analytical tools and sufficiently disaggregated data bases that can be adapted to answer a variety of questions by the users; identify and discuss some of the important issues behind successful energy conservation policy; and develop an energy conservation policy in depth. Three specific cases reviewed are: the California nonresidential code (1976); the California residential code (1978); and the Farmers Home Administration code (1978). Although these three suits were brought by the building industry, this report also discusses considerations relevant to architects, bankers, and building inspectors. These cases are discussed from three perspectives: (1) objections to the codes explicitly stated in court, (2) industry conditions and practices behind objections stated in court, and (3) general beliefs not stated in court. This discussion focuses on suits intended to limit those building codes which the building industry sees as too strong. However, some energy conservation industries may sue to strengthen codes which they consider too weak. An example of such a case is Polarized Corporation's current suit against the Lighting section of ASHRAE 90-75 (Los Angeles Federal District Court, see Murnane, 1978). (MCW)

  13. Fuel analysis code FAIR and its high burnup modelling capabilities

    International Nuclear Information System (INIS)

    Prasad, P.S.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1995-01-01

    A computer code, FAIR, has been developed for analysing the performance of water-cooled reactor fuel pins. It is capable of analysing high burnup fuels. This code has recently been used for analysing ten high burnup fuel rods irradiated at the Halden reactor. In the present paper, the code FAIR and its various high burnup models are described. The performance of the code FAIR in analysing high burnup fuels and its other applications are highlighted. (author). 21 refs., 12 figs

  14. Complexity-aware high efficiency video coding

    CERN Document Server

    Correa, Guilherme; Agostini, Luciano; Cruz, Luis A da Silva

    2016-01-01

    This book discusses the computational complexity of High Efficiency Video Coding (HEVC) encoders, with coverage extending from the analysis of HEVC compression efficiency and computational complexity to the reduction and scaling of its encoding complexity. After an introduction to the topic and a review of the state-of-the-art research in the field, the authors provide a detailed analysis of the compression efficiency and computational complexity of the HEVC encoding tools. Readers will benefit from a set of algorithms for scaling the computational complexity of HEVC encoders, all of which take advantage of the flexibility of the frame partitioning structures allowed by the standard. The authors also provide a set of early termination methods based on data mining and machine learning techniques, which are able to reduce the computational complexity required to find the best frame partitioning structures. The applicability of the proposed methods is finally exemplified with an encoding time control system that emplo...

  15. High resolution tomography using analog coding

    International Nuclear Information System (INIS)

    Brownell, G.L.; Burnham, C.A.; Chesler, D.A.

    1985-01-01

    As part of a 30-year program in the development of positron instrumentation, the authors have developed a high resolution bismuth germanate (BGO) ring tomograph (PCR) employing 360 detectors and 90 photomultiplier tubes for one plane. The detectors are shaped as trapezoids and are 4 mm wide at the front end. When assembled, they form an essentially continuous cylindrical detector. Light from a scintillation in the detector is viewed through a cylindrical light pipe by the photomultiplier tubes. By use of an analog coding scheme, the detector emitting light is identified from the phototube signals. In effect, each phototube can identify four crystals. PCR is designed as a static device and does not use interpolative motion. This results in considerable advantage when performing dynamic studies. PCR is the positron tomography analog of the γ-camera widely used in nuclear medicine

  16. FY16 ASME High Temperature Code Activities

    Energy Technology Data Exchange (ETDEWEB)

    Swindeman, M. J. [Chromtech Inc., Oak Ridge, TN (United States); Jetter, R. I. [R. I Jetter Consulting, Pebble Beach, CA (United States); Sham, T. -L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-01

    One of the objectives of the ASME high temperature Code activities is to develop and validate both improvements and the basic features of Section III, Division 5, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to be used to assess whether or not a specific component under specified loading conditions will satisfy the elevated temperature design requirements for Class A components in HBB. There are many features and alternative paths of varying complexity in HBB. The initial focus of this task is a basic path through the various options for a single reference material, 316H stainless steel. However, the program will be structured for eventual incorporation of all the features and permitted materials of HBB. Since this task has recently been initiated, this report focuses on the description of the initial path forward and an overall description of the approach to computer program development.

  17. Readings in Wildlife and Fish Conservation, High School Conservation Curriculum Project.

    Science.gov (United States)

    Ensminger, Jack

    This publication is a tentative edition of readings on Wildlife and Fish Conservation in Louisiana, and as such it forms part of one of the four units of study designed for an experimental high school course, the "High School Conservation Curriculum Project." The other three units are concerned with Forest Conservation, Soil and Water…

  18. Conservative performance analysis of a PWR nuclear fuel rod using the FRAPCON code

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Fabio Branco Vaz de; Sabundjian, Gaiane, E-mail: fabio@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    In this paper, some of the preliminary results of the sensitivity and conservative analysis of a hypothetical pressurized water reactor fuel rod are presented, using the FRAPCON code as a basic preparation tool for the future transient analysis, which will be carried out with the FRAPTRAN code. Emphasis is given to the evaluation of the cladding behavior, since it is one of the critical containment barriers against the fission products generated during fuel irradiation. Sensitivity analyses were performed by varying the values of some parameters, mainly related to thermal cycle conditions, and taking into account an intermediate value between the realistic and conservative conditions for the linear heat generation rate parameter given in the literature. Time lengths were taken from a typical nuclear power plant operational cycle, adjusted to reach a chosen burnup. Curves of fuel and cladding temperature, as well as of their mechanical and oxidation behavior, as a function of reactor operation time, are presented for each of the nodes considered along the nuclear fuel rod. Analyzing the curves, it was possible to observe the influence of the thermal cycle on the fuel rod performance, in this preliminary step toward the accident/transient analysis. (author)

  19. CONDOR: a database resource of developmentally associated conserved non-coding elements

    Directory of Open Access Journals (Sweden)

    Smith Sarah

    2007-08-01

    Background: Comparative genomics is currently one of the most popular approaches to study the regulatory architecture of vertebrate genomes. Fish-mammal genomic comparisons have proved powerful in identifying conserved non-coding elements likely to be distal cis-regulatory modules such as enhancers, silencers or insulators that control the expression of genes involved in the regulation of early development. The scientific community is showing increasing interest in characterizing the function, evolution and language of these sequences. Despite this, there remains little in the way of user-friendly access to a large dataset of such elements in conjunction with the analysis and the visualization tools needed to study them. Description: Here we present CONDOR (COnserved Non-coDing Orthologous Regions), available at http://condor.fugu.biology.qmul.ac.uk. In an interactive and intuitive way the website displays data on > 6800 non-coding elements associated with over 120 early developmental genes and conserved across vertebrates. The database regularly incorporates results of ongoing in vivo zebrafish enhancer assays of the CNEs carried out in-house, which currently number ~100. Included and highlighted within this set are elements derived from duplication events both at the origin of vertebrates and more recently in the teleost lineage, thus providing valuable data for studying the divergence of regulatory roles between paralogs. CONDOR therefore provides a number of tools and facilities to allow scientists to progress in their own studies on the function and evolution of developmental cis-regulation. Conclusion: By providing access to data with an approachable graphics interface, the CONDOR database presents a rich resource for further studies into the regulation and evolution of genes involved in early development.

  20. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.
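As a toy illustration of the syndrome-based Slepian-Wolf coding this record builds on, the sketch below uses a Hamming(7,4) code (the simplest BCH code) in place of the paper's construction: the encoder sends only the 3-bit syndrome of each 7-bit block of X, and the decoder corrects the side information Y, assumed to differ from X in at most one position per block. The paper's feedback-driven rate adaptation (requesting more syndrome bits on decoding failure) is omitted.

```python
# Slepian-Wolf decoding with the Hamming(7,4) syndrome: 3 bits are sent
# per 7-bit block instead of 7, exploiting the X-Y correlation.

H = [[0, 0, 0, 1, 1, 1, 1],   # parity-check matrix of Hamming(7,4);
     [0, 1, 1, 0, 0, 1, 1],   # column j is the binary expansion of j+1
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def sw_decode(y_block, syn_x):
    """Recover x from side information y and the transmitted syndrome of x."""
    # By linearity, syndrome(y) XOR syndrome(x) = syndrome(x XOR y),
    # which for a single differing bit is its 1-indexed position.
    se = [a ^ b for a, b in zip(syndrome(y_block), syn_x)]
    err_pos = se[0] * 4 + se[1] * 2 + se[2]   # 0 means y == x
    x = list(y_block)
    if err_pos:
        x[err_pos - 1] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]            # side information: one bit differs
assert sw_decode(y, syndrome(x)) == x
```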

  1. Divergent evolutionary rates in vertebrate and mammalian specific conserved non-coding elements (CNEs) in echolocating mammals.

    Science.gov (United States)

    Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J

    2014-12-19

    The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically within one family of echolocating bats that utilise

  2. Building Energy Efficiency in India: Compliance Evaluation of Energy Conservation Building Code

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Sha; Evans, Meredydd; Delgado, Alison

    2014-03-26

    India is experiencing an unprecedented construction boom. The country doubled its floorspace between 2001 and 2005 and is expected to add 35 billion m2 of new buildings by 2050. Buildings account for 35% of total final energy consumption in India today, and building energy use is growing at 8% annually. Studies have shown that carbon policies will have little effect on reducing building energy demand. Chaturvedi et al. predicted that, if there are no specific sectoral policies to curb building energy use, final energy demand of the Indian building sector will grow over five times by the end of this century, driven by rapid income and population growth. The growing energy demand in buildings is accompanied by a transition from traditional biomass to commercial fuels, particularly an increase in electricity use. This also leads to a rapid increase in carbon emissions and aggravates power shortages in India. Growth in building energy use poses challenges to the Indian government. To curb energy consumption in buildings, the Indian government issued the Energy Conservation Building Code (ECBC) in 2007, which applies to commercial buildings with a connected load of 100 kW or 120 kVA. It is predicted that the implementation of ECBC can help save 25-40% of energy, compared to reference buildings without energy-efficiency measures. However, the impact of ECBC depends on the effectiveness of its enforcement and compliance. Currently, the majority of buildings in India are not ECBC-compliant. The United Nations Development Programme projected that code compliance in India would reach 35% by 2015 and 64% by 2017. Whether the projected targets can be achieved depends on how the code enforcement system is designed and implemented. Although the development of ECBC lies in the hands of the national government – the Bureau of Energy Efficiency under the Ministry of Power – the adoption and implementation of ECBC largely relies on state and local governments. Six years after ECBC

  3. Current status of high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi; Sasa, Toshinobu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    The current status of the accelerator design code NMTC/JAERI, an outline of its physical model, and an evaluation of its accuracy are reported. To evaluate the nuclear performance of an accelerator and of an intense spallation neutron source, the nuclear reactions between high-energy protons and the target nuclide, and the behavior of the various produced particles, must be modeled. The nuclear design of the spallation neutron system used a calculation code system coupling the high-energy nucleon-meson transport code with the neutron-photon transport code. NMTC/JAERI describes the particle evaporation process in competition with fission, following the intranuclear cascade. Particle transport calculations were carried out for protons, neutrons, π-mesons and μ-mesons. To verify and improve the accuracy of the high-energy nucleon-meson transport code, data on spallation products and spallation neutrons from integral experiments were collected. (S.Y.)

  4. High-Fidelity Coding with Correlated Neurons

    Science.gov (United States)

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  5. Translation Initiation from Conserved Non-AUG Codons Provides Additional Layers of Regulation and Coding Capacity

    Directory of Open Access Journals (Sweden)

    Ivaylo P. Ivanov

    2017-06-01

    Neurospora crassa cpc-1 and Saccharomyces cerevisiae GCN4 are homologs specifying transcription activators that drive the transcriptional response to amino acid limitation. The cpc-1 mRNA contains two upstream open reading frames (uORFs) in its >700-nucleotide (nt) 5′ leader, and its expression is controlled at the level of translation in response to amino acid starvation. We used N. crassa cell extracts and obtained data indicating that cpc-1 uORF1 and uORF2 are functionally analogous to GCN4 uORF1 and uORF4, respectively, in controlling translation. We also found that the 5′ region upstream of the main coding sequence of the cpc-1 mRNA extends for more than 700 nucleotides without any in-frame stop codon. For 100 cpc-1 homologs from Pezizomycotina and from selected Basidiomycota, 5′ conserved extensions of the CPC1 reading frame are also observed. Multiple non-AUG near-cognate codons (NCCs) in the CPC1 reading frame upstream of uORF2, some deeply conserved, could potentially initiate translation. At least four NCCs initiated translation in vitro. In vivo data were consistent with initiation at NCCs to produce N-terminally extended N. crassa CPC1 isoforms. The pivotal role played by CPC1, combined with its translational regulation by uORFs and NCC utilization, underscores the emerging significance of noncanonical initiation events in controlling gene expression.

  6. photon-plasma: A modern high-order particle-in-cell code

    International Nuclear Information System (INIS)

    Haugbølle, Troels; Frederiksen, Jacob Trier; Nordlund, Åke

    2013-01-01

    We present the photon-plasma code, a modern high-order charge-conserving particle-in-cell code for simulating relativistic plasmas. The code uses a high-order implicit field solver and a novel high-order charge-conserving interpolation scheme for particle-to-cell interpolation and charge deposition. It includes powerful diagnostics tools with on-the-fly particle tracking, synthetic spectra integration, 2D volume slicing, and a new method to correctly account for radiative cooling in the simulations. A robust technique for imposing (time-dependent) particle and field fluxes on the boundaries is also presented. Using a hybrid OpenMP and MPI approach, the code scales efficiently from 8 to more than 250,000 cores with almost linear weak scaling on a range of architectures. The code is tested with the classical benchmarks: particle heating, cold-beam instability, and two-stream instability. We also present particle-in-cell simulations of the Kelvin-Helmholtz instability, and new results on radiative collisionless shocks

  7. Applications of the Los Alamos High Energy Transport code

    International Nuclear Information System (INIS)

    Waters, L.; Gavron, A.; Prael, R.E.

    1992-01-01

    Simulation codes reliable through a large range of energies are essential to analyze the environment of vehicles and habitats proposed for space exploration. The LAHET Monte Carlo code has recently been expanded to track high-energy hadrons with FLUKA, while retaining the original Los Alamos version of HETC at lower energies. Electrons and photons are transported with EGS4, and an interface to the MCNP Monte Carlo code is provided to analyze neutrons with kinetic energies less than 20 MeV. These codes are benchmarked by comparison of LAHET/MCNP calculations to data from the Brookhaven experiment E814 participant calorimeter

  8. High energy particle transport code NMTC/JAM

    International Nuclear Information System (INIS)

    Niita, K.; Takada, H.; Meigo, S.; Ikeda, Y.

    2001-01-01

    We have developed a high energy particle transport code NMTC/JAM, which is an upgraded version of NMTC/JAERI97. The available energy range of NMTC/JAM is, in principle, extended to 200 GeV for nucleons and mesons by including the high energy nuclear reaction code JAM for the intra-nuclear cascade part. We compare the calculations by the NMTC/JAM code with the experimental data of thin and thick targets for proton induced reactions up to several 10 GeV. The results of the NMTC/JAM code show excellent agreement with the experimental data. From this code validation, it is concluded that NMTC/JAM is reliable for neutronics optimization studies of high-intensity spallation neutron utilization facilities. (author)

  9. Tilapia and human CLIC2 structures are highly conserved.

    Science.gov (United States)

    Zeng, Jiao; Li, Zhengjun; Lui, Eei Yin; Lam, Siew Hong; Swaminathan, Kunchithapadam

    2018-01-08

    Chloride intracellular channels (CLICs) exist in soluble and membrane-bound forms. We have determined the crystal structure of soluble Clic2 from the euryhaline teleost fish Oreochromis mossambicus. Structural comparison of tilapia and human CLIC2 with other CLICs shows that these proteins are highly conserved. We have also compared the expression levels of clic2 in selected osmoregulatory organs of tilapia acclimated to freshwater, seawater and hypersaline water. Structural conservation of vertebrate CLICs implies that they might play conserved roles. Also, the tissue-specific responsiveness of clic2 suggests that it might be involved in iono-osmoregulation under extreme conditions in tilapia. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Comprehensive analysis of coding-lncRNA gene co-expression network uncovers conserved functional lncRNAs in zebrafish.

    Science.gov (United States)

    Chen, Wen; Zhang, Xuan; Li, Jing; Huang, Shulan; Xiang, Shuanglin; Hu, Xiang; Liu, Changning

    2018-05-09

    Zebrafish is a well-established model system for studying developmental processes and human disease. Recent deep-sequencing studies have discovered a large number of long non-coding RNAs (lncRNAs) in zebrafish. However, only a few of them have been functionally characterized. How to take advantage of the mature zebrafish system to investigate lncRNA function and conservation in depth is therefore an intriguing question. We systematically collected and analyzed a series of zebrafish RNA-seq data sets and combined them with resources from known databases and the literature. As a result, we obtained by far the most complete dataset of zebrafish lncRNAs, containing 13,604 lncRNA genes (21,128 transcripts) in total. Based on that, a co-expression network of zebrafish coding and lncRNA genes was constructed, analyzed, and used to predict the Gene Ontology (GO) and KEGG annotations of lncRNAs. Meanwhile, we performed a conservation analysis of zebrafish lncRNAs, identifying 1,828 conserved zebrafish lncRNA genes (1,890 transcripts) that have putative mammalian orthologs. We also found that zebrafish lncRNAs play important roles in regulating the development and function of the nervous system; these conserved lncRNAs show significant sequence and functional conservation with their mammalian counterparts. By integrative data analysis and construction of a coding-lncRNA gene co-expression network, we obtained the most comprehensive dataset of zebrafish lncRNAs to date, together with systematic annotations and comprehensive analyses of their function and conservation. Our study provides a reliable zebrafish-based platform for deep exploration of lncRNA function and mechanism, as well as lncRNA commonality between zebrafish and human.
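The guilt-by-association annotation step described above, predicting GO terms for a lncRNA from the coding genes it is co-expressed with, can be sketched as follows. The correlation cutoff, gene names, and GO terms are illustrative, and the study's actual network construction is more elaborate.

```python
# Annotate a lncRNA with the GO terms of coding genes whose expression
# profiles correlate strongly with it across samples.

from statistics import fmean

def pearson(xs, ys):
    mx, my = fmean(xs), fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def annotate_lncrna(lnc_profile, coding_profiles, coding_go, r_cut=0.9):
    """Union of GO terms of coding genes strongly co-expressed with the lncRNA."""
    terms = set()
    for gene, profile in coding_profiles.items():
        if pearson(lnc_profile, profile) >= r_cut:
            terms |= coding_go.get(gene, set())
    return terms
```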

  11. High-fidelity plasma codes for burn physics

    Energy Technology Data Exchange (ETDEWEB)

    Cooley, James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Graziani, Frank [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Marinak, Marty [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Murillo, Michael [Michigan State Univ., East Lansing, MI (United States)

    2016-10-19

    Accurate predictions of the equation of state (EOS) and of ionic and electronic transport properties are of critical importance for high-energy-density (HED) plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPCs), are a relatively recent computational tool that augments both experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes, their future development, and the potential impact they may have in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  12. High data rate coding for the space station telemetry links.

    Science.gov (United States)

    Lumb, D. R.; Viterbi, A. J.

    1971-01-01

    Coding systems for high data rates were examined from the standpoint of potential application in space-station telemetry links. Approaches considered included convolutional codes with sequential, Viterbi, and cascaded-Viterbi decoding. It was concluded that a high-speed (40 Mbps) sequential decoding system best satisfies the requirements for the assumed growth potential and specified constraints. Trade-off studies leading to this conclusion are reviewed, and some sequential (Fano) algorithm improvements are discussed, together with real-time simulation results.
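
    For contrast with the sequential (Fano) approach chosen above, the Viterbi alternative can be illustrated on a textbook rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 octal). This is a toy sketch of the algorithm, not the telemetry design studied in the paper:

```python
def _parity(x):
    return bin(x).count("1") % 2

def conv_encode(bits, g=(0b111, 0b101)):
    """Rate-1/2, constraint-length-3 convolutional encoder
    (generator polynomials 7 and 5 in octal)."""
    state, out = 0, []
    for b in bits:
        reg = ((state << 1) | b) & 0b111     # 3-bit shift register
        out += [_parity(reg & gen) for gen in g]
        state = reg & 0b011                  # keep last two input bits
    return out

def viterbi_decode(received, nbits, g=(0b111, 0b101)):
    """Hard-decision Viterbi decoding: keep, for each of the four
    trellis states, the survivor path of minimum Hamming distance."""
    metrics = {0: (0, [])}                   # state -> (metric, path)
    for t in range(nbits):
        r = received[2 * t: 2 * t + 2]
        new = {}
        for state, (m, path) in metrics.items():
            for b in (0, 1):
                reg = ((state << 1) | b) & 0b111
                expect = [_parity(reg & gen) for gen in g]
                cost = m + sum(e != x for e, x in zip(expect, r))
                nxt = reg & 0b011
                if nxt not in new or cost < new[nxt][0]:
                    new[nxt] = (cost, path + [b])
        metrics = new
    return min(metrics.values(), key=lambda v: v[0])[1]
```

Because the free distance of this code is 5, a single channel-bit error is corrected; sequential decoding trades this fixed per-step work for variable effort that grows near capacity, which is the trade-off the study weighs.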

  13. Analysis and application of ratcheting evaluation procedure of Japanese high temperature design code DDS

    International Nuclear Information System (INIS)

    Lee, H. Y.; Kim, J. B.; Lee, J. H.

    2002-01-01

    In this study, the evaluation procedure of the Japanese DDS code, which was recently developed to assess the progressive inelastic deformation occurring under repeated secondary stresses, was analyzed, and the evaluation results according to DDS were compared with those of the thermal ratchet structural test carried out by KAERI in order to assess the conservativeness of the code. The existing high temperature codes, US ASME-NH and French RCC-MR, provide ratcheting procedures only for load cases of cyclic secondary stresses under primary stresses, so they are not applicable to the actual ratcheting problem that can occur under cyclic secondary membrane stresses due to the movement of the hot free surface in a pool-type LMR. DDS explicitly provides an analysis procedure for ratcheting due to moving thermal gradients near the hot free surface. A comparison study between the results obtained with the DDS design code and those of the structural test showed that the DDS evaluation results were in good agreement with the test results.

  14. Conservation

    NARCIS (Netherlands)

    Noteboom, H.P.

    1985-01-01

    The IUCN/WWF Plants Conservation Programme 1984 — 1985. The World Wildlife Fund chose plants as the subject of its fund-raising campaign for the period 1984 — 1985. The objectives were to: 1. Use information techniques to achieve the conservation objectives of the Plants Programme – to save plants;

  15. Conservation.

    Science.gov (United States)

    National Audubon Society, New York, NY.

    This set of teaching aids consists of seven Audubon Nature Bulletins, providing the teacher and student with informational reading on various topics in conservation. The bulletins have these titles: Plants as Makers of Soil, Water Pollution Control, The Ground Water Table, Conservation--To Keep This Earth Habitable, Our Threatened Air Supply,…

  16. Highly parallel line-based image coding for many cores.

    Science.gov (United States)

    Peng, Xiulian; Xu, Jizheng; Zhou, You; Wu, Feng

    2012-01-01

    Computers are developing along a new trend, from dual-core and quad-core processors to ones with tens or even hundreds of cores. Multimedia, as one of the most important computer applications, has an urgent need for parallel coding algorithms for compression. Taking intraframe/image coding as a starting point, this paper proposes a pure line-by-line coding scheme (LBLC) to meet this need. In LBLC, an input image is processed line by line sequentially, and each line is divided into small fixed-length segments. The compression of all segments, from prediction to entropy coding, is completely independent and runs concurrently on many cores. Results on a general-purpose computer show that our scheme achieves a 13.9-times speedup with 15 cores at the encoder and a 10.3-times speedup at the decoder. Ideally, this near-linear relation between speedup and the number of cores can be maintained for more than 100 cores. In addition to its high parallelism, the proposed scheme performs comparably to, or even better than, the H.264 high profile above middle bit rates. At near-lossless coding, it outperforms H.264 by more than 10 dB. At lossless coding, up to 14% bit-rate reduction is observed compared with H.264 lossless coding in the High 4:4:4 profile.
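
    The segment-level independence that makes LBLC parallel can be mimicked with a toy sketch: each line is cut into fixed-length segments, and each segment is predicted from its own left neighbors and entropy-coded with no reference to any other segment, so a thread pool can process them concurrently. The segment length, the left-neighbor predictor, and the use of zlib as a stand-in for the paper's entropy coder are all illustrative assumptions:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

SEG = 64  # fixed segment length in pixels (illustrative choice)

def encode_segment(seg):
    """Predict each pixel from its left neighbor inside the segment,
    then compress the residuals (zlib stands in for entropy coding)."""
    residuals = bytes((seg[i] - (seg[i - 1] if i else 0)) & 0xFF
                      for i in range(len(seg)))
    return zlib.compress(residuals)

def decode_segment(blob):
    residuals = zlib.decompress(blob)
    seg, prev = [], 0
    for r in residuals:
        prev = (prev + r) & 0xFF
        seg.append(prev)
    return bytes(seg)

def encode_line(line, pool):
    segments = [line[i:i + SEG] for i in range(0, len(line), SEG)]
    return list(pool.map(encode_segment, segments))   # fully independent

def decode_line(blobs, pool):
    return b"".join(pool.map(decode_segment, blobs))
```

Since no segment reads another segment's state, the speedup is bounded only by the number of segments per line, which mirrors the near-linear scaling reported above.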

  17. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
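
    The sequential design criterion can be sketched with a Monte Carlo estimator of expected information gain under a Gaussian noise model. The predictor, candidate designs, and noise level below are illustrative assumptions, not the paper's Hydra-TH setup:

```python
import math
import random

def expected_information_gain(design, theta_samples, simulate, noise_sd):
    """Monte Carlo estimate of the mutual information between the model
    parameter and a hypothetical high-fidelity observation at `design`,
    assuming Gaussian observation noise. `simulate(theta, design)` is a
    stand-in low-fidelity predictor."""
    def gauss_pdf(x, mu):
        z = (x - mu) / noise_sd
        return math.exp(-0.5 * z * z) / (noise_sd * math.sqrt(2.0 * math.pi))

    preds = [simulate(th, design) for th in theta_samples]
    n = len(preds)
    eig = 0.0
    for mu in preds:
        y = random.gauss(mu, noise_sd)               # synthetic observation
        mix = sum(gauss_pdf(y, m) for m in preds) / n
        eig += math.log(gauss_pdf(y, mu) / mix)      # log-likelihood ratio
    return eig / n

def next_design(candidates, theta_samples, simulate, noise_sd):
    """Greedily pick the candidate design with the largest estimated gain."""
    return max(candidates, key=lambda d: expected_information_gain(
        d, theta_samples, simulate, noise_sd))
```

A design at which all parameter values predict the same observation contributes no information, while one that separates the predictions well is selected first, which is the intuition behind minimizing the number of high-fidelity evaluations.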

  18. High-throughput sequencing, characterization and detection of new and conserved cucumber miRNAs.

    Directory of Open Access Journals (Sweden)

    Germán Martínez

    Full Text Available Micro RNAS (miRNAs are a class of endogenous small non coding RNAs involved in the post-transcriptional regulation of gene expression. In plants, a great number of conserved and specific miRNAs, mainly arising from model species, have been identified to date. However less is known about the diversity of these regulatory RNAs in vegetal species with agricultural and/or horticultural importance. Here we report a combined approach of bioinformatics prediction, high-throughput sequencing data and molecular methods to analyze miRNAs populations in cucumber (Cucumis sativus plants. A set of 19 conserved and 6 known but non-conserved miRNA families were found in our cucumber small RNA dataset. We also identified 7 (3 with their miRNA* strand not previously described miRNAs, candidates to be cucumber-specific. To validate their description these new C. sativus miRNAs were detected by northern blot hybridization. Additionally, potential targets for most conserved and new miRNAs were identified in cucumber genome.In summary, in this study we have identified, by first time, conserved, known non-conserved and new miRNAs arising from an agronomically important species such as C. sativus. The detection of this complex population of regulatory small RNAs suggests that similarly to that observe in other plant species, cucumber miRNAs may possibly play an important role in diverse biological and metabolic processes.

  19. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest High Efficiency Video Coding (HEVC) standard significantly increases encoding complexity in exchange for improved coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, computational complexity is measured in terms of encoding time. First, the complexity budget is mapped to a target expressed in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, an optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10% of full complexity) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR, respectively, are observed over 18 sequences when the target complexity is around 40%.
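
    A minimal sketch of the budget-to-modes mapping idea follows. The mode sets and their relative costs are invented for illustration (the paper derives its combinations from offline statistics and measures real encoding time):

```python
# Illustrative mode sets: richer combinations cost more encode time.
MODE_SETS = [
    (("MERGE/SKIP",), 10),                         # relative cost units
    (("MERGE/SKIP", "2Nx2N"), 40),
    (("MERGE/SKIP", "2Nx2N", "NxN", "AMP"), 100),
]

def select_mode_set(remaining_budget, remaining_cus):
    """Pick the richest mode combination whose cost fits the average
    budget left per remaining coding unit."""
    per_cu = remaining_budget / max(remaining_cus, 1)
    best = MODE_SETS[0]
    for modes, cost in MODE_SETS:
        if cost <= per_cu:
            best = (modes, cost)
    return best

def encode_frame(n_cus, budget):
    """Greedy complexity control: spend the budget CU by CU, re-deriving
    the per-CU allowance from what is actually left."""
    used, plan = 0, []
    for i in range(n_cus):
        modes, cost = select_mode_set(budget - used, n_cus - i)
        used += cost          # a real encoder would add measured time here
        plan.append(modes)
    return plan, used
```

Re-deriving the allowance from the remaining budget lets early savings be reinvested in richer mode sets later, which is the same feedback idea the adaptive mode sorting step exploits.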

  20. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Energy Technology Data Exchange (ETDEWEB)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K. [Cray Inc., St. Paul, MN 55101 (United States); Porter, D. [Minnesota Supercomputing Institute for Advanced Computational Research, Minneapolis, MN USA (United States); O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W. [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Edmon, P., E-mail: pjm@cray.com, E-mail: nradclif@cray.com, E-mail: kkandalla@cray.com, E-mail: oneill@astro.umn.edu, E-mail: nolt0040@umn.edu, E-mail: donnert@ira.inaf.it, E-mail: twj@umn.edu, E-mail: dhp@umn.edu, E-mail: pedmon@cfa.harvard.edu [Institute for Theory and Computation, Center for Astrophysics, Harvard University, Cambridge, MA 02138 (United States)

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  1. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    International Nuclear Information System (INIS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Donnert, J. M. F.; Jones, T. W.; Edmon, P.

    2017-01-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  2. Outlines and verifications of the codes used in the safety analysis of High Temperature Engineering Test Reactor (HTTR)

    International Nuclear Information System (INIS)

    Shiina, Yasuaki; Kunitomi, Kazuhiko; Maruyama, Soh; Fujita, Shigeki; Nakagawa, Shigeaki; Iyoku, Tatsuo; Shindoh, Masami; Sudo, Yukio; Hirano, Masashi.

    1990-03-01

    This paper presents a brief description of the computer codes used in the safety analysis of the High Temperature Engineering Test Reactor. The codes are: 1. BLOOST-J2 2. THYDE-HTGR 3. TAC-NC 4. RATSAM6 5. COMPARE-MOD1 6. GRACE 7. OXIDE-3F 8. FLOWNET/TRUMP. Of the codes described above, 1, 3, 4, 5, 6 and 7 were developed for the multi-hole type gas-cooled reactor and improved for the HTTR, while 2 originated from the THYDE codes, which were developed to treat transient thermo-hydraulics during a LOCA of an LWR. Each code adopts models and properties that yield conservative analytical results. The adequacy of each code was verified by comparison with experimental results and/or analytical results obtained from other codes that were already proven. (author)

  3. The emerging High Efficiency Video Coding standard (HEVC)

    International Nuclear Information System (INIS)

    Raja, Gulistan; Khan, Awais

    2013-01-01

    High definition video (HDV) is becoming more popular day by day. This paper describes a performance analysis of the latest video standard, known as High Efficiency Video Coding (HEVC). HEVC is designed to fulfil all the requirements of future high definition video. In this paper, three configurations of HEVC (intra only, low delay, and random access) are analyzed using various 480p, 720p and 1080p high definition test video sequences. Simulation results show the superior objective and subjective quality of HEVC

  4. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was performed in the blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with the corresponding measurements. The transonic impeller computation was conducted to test the newly developed, locally mass flux conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triple-point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that cross-sectional computation, with a locally mass flux conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.
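
    The "locally mass flux conservative" property rests on writing the update in flux form, so that the flux leaving one cell is exactly the flux entering its neighbor. A one-dimensional upwind sketch makes this concrete (illustrative only, not the VANS scheme itself):

```python
def advect_conservative(rho, u, dx, dt):
    """First-order upwind finite-volume step in flux form on a periodic
    1-D grid. Because the same interface flux is subtracted from one
    cell and added to its neighbor, total mass is conserved to
    round-off regardless of the flux formula used."""
    n = len(rho)
    flux = [0.0] * (n + 1)                  # fluxes at the n+1 interfaces
    for i in range(n + 1):
        left, right = rho[(i - 1) % n], rho[i % n]
        flux[i] = u * (left if u >= 0 else right)   # upwind selection
    return [rho[i] - dt / dx * (flux[i + 1] - flux[i]) for i in range(n)]
```

Summing the update over all cells telescopes the interface fluxes, which is exactly why shock-capturing in the supersonic regions requires the conservative form of the continuity equation.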

  5. Generation of monoclonal antibodies against highly conserved antigens.

    Directory of Open Access Journals (Sweden)

    Hongzhe Zhou

    Full Text Available BACKGROUND: Therapeutic antibody development is one of the fastest growing areas of the pharmaceutical industry. Generating high-quality monoclonal antibodies against a given therapeutic target is crucial for the success of drug development. However, due to immune tolerance, some proteins that are highly conserved between mice and humans are not very immunogenic in mice, making it difficult to generate antibodies using a conventional approach. METHODOLOGY/PRINCIPAL FINDINGS: In this report, the impaired immune tolerance of NZB/W mice was exploited to generate monoclonal antibodies against highly conserved or self-antigens. Using two highly conserved human antigens (MIF and HMGB1) and one mouse self-antigen (TNF-alpha) as examples, we demonstrate here that multiple clones of high-affinity, highly specific antibodies with the desired biological activities can be generated, using the NZB/W mouse as the immunization host and a T cell-specific tag fused to a recombinant antigen to stimulate the immune system. CONCLUSIONS/SIGNIFICANCE: We developed an efficient and universal method for generating surrogate or therapeutic antibodies against "difficult antigens" to facilitate the development of therapeutic antibodies.

  6. Enforcing dust mass conservation in 3D simulations of tightly coupled grains with the PHANTOM SPH code

    Science.gov (United States)

    Ballabio, G.; Dipierro, G.; Veronesi, B.; Lodato, G.; Hutchison, M.; Laibe, G.; Price, D. J.

    2018-06-01

    We describe a new implementation of the one-fluid method in the SPH code PHANTOM to simulate the dynamics of dust grains in gas protoplanetary discs. We revise and extend previously developed algorithms by computing the evolution of a new fluid quantity that produces a more accurate and numerically controlled evolution of the dust dynamics. Moreover, by limiting the stopping time of uncoupled grains that violate the assumptions of the terminal velocity approximation, we avoid fatal numerical errors in mass conservation. We test and validate our new algorithm by running 3D SPH simulations of a large range of disc models with tightly and marginally coupled grains.

  7. Phylum-Level Conservation of Regulatory Information in Nematodes despite Extensive Non-coding Sequence Divergence

    Science.gov (United States)

    Gordon, Kacy L.; Arthur, Robert K.; Ruvinsky, Ilya

    2015-01-01

    Gene regulatory information guides development and shapes the course of evolution. To test conservation of gene regulation within the phylum Nematoda, we compared the functions of putative cis-regulatory sequences of four sets of orthologs (unc-47, unc-25, mec-3 and elt-2) from distantly-related nematode species. These species, Caenorhabditis elegans, its congeneric C. briggsae, and three parasitic species Meloidogyne hapla, Brugia malayi, and Trichinella spiralis, represent four of the five major clades in the phylum Nematoda. Despite the great phylogenetic distances sampled and the extensive sequence divergence of nematode genomes, all but one of the regulatory elements we tested are able to drive at least a subset of the expected gene expression patterns. We show that functionally conserved cis-regulatory elements have no more extended sequence similarity to their C. elegans orthologs than would be expected by chance, but they do harbor motifs that are important for proper expression of the C. elegans genes. These motifs are too short to be distinguished from the background level of sequence similarity, and while identical in sequence they are not conserved in orientation or position. Functional tests reveal that some of these motifs contribute to proper expression. Our results suggest that conserved regulatory circuitry can persist despite considerable turnover within cis elements. PMID:26020930

  8. Phylum-Level Conservation of Regulatory Information in Nematodes despite Extensive Non-coding Sequence Divergence.

    Directory of Open Access Journals (Sweden)

    Kacy L Gordon

    2015-05-01

    Full Text Available Gene regulatory information guides development and shapes the course of evolution. To test conservation of gene regulation within the phylum Nematoda, we compared the functions of putative cis-regulatory sequences of four sets of orthologs (unc-47, unc-25, mec-3 and elt-2) from distantly-related nematode species. These species, Caenorhabditis elegans, its congeneric C. briggsae, and three parasitic species Meloidogyne hapla, Brugia malayi, and Trichinella spiralis, represent four of the five major clades in the phylum Nematoda. Despite the great phylogenetic distances sampled and the extensive sequence divergence of nematode genomes, all but one of the regulatory elements we tested are able to drive at least a subset of the expected gene expression patterns. We show that functionally conserved cis-regulatory elements have no more extended sequence similarity to their C. elegans orthologs than would be expected by chance, but they do harbor motifs that are important for proper expression of the C. elegans genes. These motifs are too short to be distinguished from the background level of sequence similarity, and while identical in sequence they are not conserved in orientation or position. Functional tests reveal that some of these motifs contribute to proper expression. Our results suggest that conserved regulatory circuitry can persist despite considerable turnover within cis elements.

  9. Numerical solution of conservation equations in the transient model for the system thermal - hydraulics in the Korsar computer code

    International Nuclear Information System (INIS)

    Yudov, Y.V.

    2001-01-01

    The functional part of the KORSAR computer code is based on the computational unit for reactor system thermal-hydraulics and other thermal power systems with water cooling. The two-phase flow dynamics of the thermal-hydraulic network is modelled in KORSAR in a one-dimensional two-fluid (non-equilibrium and non-homogeneous) approximation with the same pressure for both phases. Each phase is characterized by parameters averaged over the channel sections and described by conservation equations for mass, energy and momentum. The KORSAR computer code relies upon a novel approach to the mathematical modelling of two-phase dispersed-annular flows. This approach allows a two-fluid model to differentiate the effects of the liquid film and of the droplets in the gas core on the flow characteristics. A semi-implicit numerical scheme has been chosen for deriving discrete analogs of the conservation equations in KORSAR. In the semi-implicit numerical scheme, the solution of the finite-difference equations is reduced to the problem of determining the pressure field at the new time level. For the one-channel case, the pressure field is found by solving a system of linear algebraic equations with the tri-diagonal matrix method. In the branched-network calculation, the matrix of coefficients in the equations describing the pressure field is no longer tri-diagonal but has a sparse structure. In this case, the system of linear equations for the pressure field can be solved with any of the known classical methods. Such an approach is implemented in the existing best-estimate thermal-hydraulic computer codes (TRAC, RELAP5, etc.). For the KORSAR computer code, we have developed a new non-iterative method for calculating the pressure field in a network of any topology. This method is based on the tri-diagonal matrix method and performs well in solving thermal-hydraulic network problems. (author)
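
    The tri-diagonal matrix method referred to above is the classic Thomas algorithm: forward elimination followed by back substitution, O(n) per solve. A minimal single-channel sketch (not the KORSAR implementation):

```python
def thomas_solve(a, b, c, d):
    """Tri-diagonal matrix algorithm. `a` is the sub-diagonal (a[0]
    unused), `b` the main diagonal, `c` the super-diagonal (c[-1]
    unused), `d` the right-hand side. Returns the solution vector."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                        # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

For the branched network the coefficient matrix is merely sparse rather than tri-diagonal, which is why the text's non-iterative extension of this method is needed there.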

  10. BBU code development for high-power microwave generators

    International Nuclear Information System (INIS)

    Houck, T.L.; Westenskow, G.A.; Yu, S.S.

    1992-01-01

    We are developing a two-dimensional, time-dependent computer code for the simulation of transverse instabilities in support of relativistic klystron two-beam accelerator research at LLNL. The code addresses transient effects as well as both cumulative and regenerative beam breakup modes. Although designed specifically for the transport of high-current (kA) beams through traveling-wave structures, it is applicable to devices consisting of multiple combinations of standing-wave, traveling-wave, and induction accelerator structures. In this paper we compare code simulations to analytical solutions for the case where there is no rf coupling between cavities, to theoretical scaling parameters for coupled cavity structures, and to experimental data on beam breakup in the two traveling-wave output structures of our microwave generator. (Author) 4 figs., tab., 5 refs

  11. UNIPIC code for simulations of high power microwave devices

    International Nuclear Information System (INIS)

    Wang Jianguo; Zhang Dianhui; Wang Yue; Qiao Hailiang; Li Xiaoze; Liu Chunliang; Li Yongdong; Wang Hongguang

    2009-01-01

    In this paper, UNIPIC code, a new member of the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path FDTD method, the CP weakly conditionally stable FDTD (CP WCS-FDTD) method, which combines the WCS-FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or import structures created previously. Numerical experiments on some typical HPM devices using the UNIPIC code are presented. The results are compared to those obtained from some well-known PIC codes, and they agree well with each other.
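
    The leapfrog FDTD field update at the heart of such PIC codes can be illustrated in one dimension, in normalized units with a soft Gaussian source. This is a sketch of the basic Yee scheme only, not UNIPIC's 2.5-D conformal implementation; grid size, Courant number, and source parameters are illustrative:

```python
import math

def fdtd_1d(steps, n=200, src=100, sc=0.5):
    """One-dimensional Yee leapfrog update in normalized units with
    Courant number `sc`. Ez and Hy live on staggered half-cell grids;
    fields at the untouched end cells act as simple reflecting walls."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for i in range(n - 1):
            hy[i] += sc * (ez[i + 1] - ez[i])       # curl E -> advance H
        for i in range(1, n):
            ez[i] += sc * (hy[i] - hy[i - 1])       # curl H -> advance E
        ez[src] += math.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
    return ez
```

The staggered, second-order update is the same building block the abstract refers to; the CPML boundaries and conformal stencils replace the crude walls and straight edges of this sketch.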

  12. UNIPIC code for simulations of high power microwave devices

    Science.gov (United States)

    Wang, Jianguo; Zhang, Dianhui; Liu, Chunliang; Li, Yongdong; Wang, Yue; Wang, Hongguang; Qiao, Hailiang; Li, Xiaoze

    2009-03-01

    In this paper, UNIPIC code, a new member of the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction of the conformal-path FDTD method, the CP weakly conditionally stable FDTD (CP WCS-FDTD) method, which combines the WCS-FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or import structures created previously. Numerical experiments on some typical HPM devices using the UNIPIC code are presented. The results are compared to those obtained from some well-known PIC codes, and they agree well with each other.

  13. Comparison study of inelastic analysis codes for high temperature structure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Lee, H. Y.; Park, C. K.; Geon, G. P.; Lee, J. H

    2004-02-01

    LMR high temperature structures subjected to operating and transient loadings may exhibit very complex deformation behavior due to the use of ductile materials such as 316SS, so systematic analysis technology for high temperature structures is essential for reliable safety assessment. In this project, a comparative study of the developed inelastic analysis program NONSTA and existing analysis codes was performed for various types of loading, including non-proportional loading. The performance of NONSTA was confirmed, and the effect of the inelastic constants on the analysis results was analyzed. The applicability of inelastic analysis was also broadened by applying both the developed program and the existing codes to analyses of the enhanced creep behavior and the elastic follow-up behavior of high temperature structures, and items needing improvement were identified. Further studies on the improvement of the NONSTA program and on the determination of proper values of the inelastic constants are necessary.

  14. The negative influences of the new brazilian forest code on the conservation of riparian forests

    Directory of Open Access Journals (Sweden)

    Silva Normandes Matos da

    2017-12-01

    Full Text Available More than one million hectares of riparian forest were degraded or altered in Mato Grosso State (Brazil) up to 2009. The aim of this research is to set up a comparative scenario showing differences in the quantification of environmental liabilities in riparian forest areas resulting from the change in native vegetation protection rules in the transition between Laws 4771/65 and 12651/2012. Data collection took place along a marginal stretch of the Vermelho River in Rondonópolis County, Mato Grosso State. The following data set was taken into consideration: aerial images derived from an unmanned aerial vehicle, RapidEye satellite images and orbital images hosted on Google Earth; the spatial resolution of these images was compared. The aerial photos composed a mosaic that was photo-interpreted to generate land use and occupation classes. The riparian forest areas of a rural property were used as a parameter, and their environmental situation was compared for 5-meter and 100-meter strips. Thus, under the current rules, 23,501 m2 of area ceased to be an environmental liability within the riparian forest and became a consolidated rural area. Under the previous Forest Code, in a different scenario, that is, over a set of rural properties, the public authority would have collected USD 68,600.00 in fines. The new Brazilian Forest Code of 2012, which replaced the previous one of 1965, exempts those responsible for rural properties from regenerating previously deforested native vegetation, an obligation established by the older Forest Code. We have shown that the new Forest Code has diminished the legal responsibility of rural owners for maintaining forest fragments on their properties.

  15. ABCE1 is a highly conserved RNA silencing suppressor.

    Directory of Open Access Journals (Sweden)

    Kairi Kärblane

    Full Text Available ATP-binding cassette sub-family E member 1 (ABCE1) is a highly conserved protein among eukaryotes and archaea. Recent studies have identified ABCE1 as a ribosome-recycling factor important for translation termination in mammalian cells, yeast and archaea. Here we report another conserved function of ABCE1. We have previously described AtRLI2, the homolog of ABCE1 in the plant Arabidopsis thaliana, as an endogenous suppressor of RNA silencing. In this study we show that this function is conserved: human ABCE1 is able to suppress RNA silencing in Nicotiana benthamiana plants, in mammalian HEK293 cells and in the worm Caenorhabditis elegans. Using co-immunoprecipitation and mass spectrometry, we found a number of potential ABCE1-interacting proteins that might support its function as an endogenous suppressor of RNA interference. The candidate interactors are associated with epigenetic regulation, transcription, RNA processing and mRNA surveillance. In addition, one of the identified proteins is translin, which together with its binding partner TRAX supports RNA interference.

  16. Novel Intermode Prediction Algorithm for High Efficiency Video Coding Encoder

    Directory of Open Access Journals (Sweden)

    Chan-seob Park

    2014-01-01

    Full Text Available The joint collaborative team on video coding (JCT-VC) is developing the next-generation video coding standard, which is called high efficiency video coding (HEVC). In the HEVC, there are three units in the block structure: coding unit (CU), prediction unit (PU), and transform unit (TU). The CU is the basic unit of region splitting, like the macroblock (MB). Each CU performs recursive splitting into four blocks with equal size, starting from the tree block. In this paper, we propose a fast CU depth decision algorithm for HEVC technology to reduce its computational complexity. In the 2N×2N PU, the proposed method compares the rate-distortion (RD) cost and determines the depth using the compared information. Moreover, in order to speed up the encoding time, an efficient merge SKIP detection method is additionally developed based on the contextual mode information of neighboring CUs. Experimental results show that the proposed algorithm achieves an average time saving of 44.84% in the random access (RA) at Main profile configuration with the HEVC test model (HM 10.0) reference software. Compared to the HM 10.0 encoder, a small BD-bitrate loss of 0.17% is also observed without significant loss of image quality.
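
    The recursive, RD-cost-driven quadtree split described above can be sketched as follows (illustrative Python only; `rd_cost` is a hypothetical stand-in for a real encoder's mode search, not HM 10.0 code):

    ```python
    # Sketch of quadtree CU partitioning by rate-distortion (RD) cost:
    # keep the CU whole unless its four equal-size children are cheaper.
    # `rd_cost(block, depth)` is a hypothetical callback, not a real encoder.

    def decide_cu_depth(block, depth, max_depth, rd_cost):
        whole_cost = rd_cost(block, depth)
        if depth == max_depth:
            return {"depth": depth, "cost": whole_cost, "split": False}
        half = len(block) // 2
        # The four equal-size sub-CUs of the quadtree split.
        children = [
            [row[:half] for row in block[:half]],
            [row[half:] for row in block[:half]],
            [row[:half] for row in block[half:]],
            [row[half:] for row in block[half:]],
        ]
        sub = [decide_cu_depth(c, depth + 1, max_depth, rd_cost) for c in children]
        split_cost = sum(s["cost"] for s in sub)
        if split_cost < whole_cost:
            return {"depth": depth, "cost": split_cost, "split": True, "children": sub}
        return {"depth": depth, "cost": whole_cost, "split": False}
    ```

    A flat block stays whole; a block with a sharp quadrant boundary splits, which is the behavior a fast depth-decision heuristic tries to predict without the full recursive search.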

  17. A high-resolution code for large eddy simulation of incompressible turbulent boundary layer flows

    KAUST Repository

    Cheng, Wan; Samtaney, Ravi

    2014-01-01

    examples to establish the fourth-order accuracy and energy conservation property of the code. Furthermore, we implement a recycling method to generate turbulent inflow. We use the stretched spiral vortex subgrid-scale model and virtual wall model

  18. Energy Conservation Tests of a Coupled Kinetic-kinetic Plasma-neutral Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Stotler, D. P.; Chang, C. S.; Ku, S. H.; Lang, J.; Park, G.

    2012-08-29

    A Monte Carlo neutral transport routine, based on DEGAS2, has been coupled to the guiding center ion-electron-neutral neoclassical PIC code XGC0 to provide a realistic treatment of neutral atoms and molecules in the tokamak edge plasma. The DEGAS2 routine allows detailed atomic physics and plasma-material interaction processes to be incorporated into these simulations. The spatial profile of the neutral particle source used in the DEGAS2 routine is determined from the fluxes of XGC0 ions to the material surfaces. The kinetic-kinetic plasma-neutral transport capability is demonstrated with example pedestal fueling simulations.

  19. High energy particle transport code NMTC/JAM

    International Nuclear Information System (INIS)

    Niita, Koji; Meigo, Shin-ichiro; Takada, Hiroshi; Ikeda, Yujiro

    2001-03-01

    We have developed a high energy particle transport code, NMTC/JAM, which is an upgraded version of NMTC/JAERI97. The applicable energy range of NMTC/JAM is extended in principle up to 200 GeV for nucleons and mesons by introducing the high energy nuclear reaction code JAM for the intra-nuclear cascade part. For the evaporation and fission process, we have also implemented a new model, GEM, by which the light nucleus production from the excited residual nucleus can be described. In line with the extension of the applicable energy, we have upgraded the nucleon-nucleus non-elastic, elastic and differential elastic cross section data by employing new systematics. In addition, particle transport in a magnetic field has been implemented for beam transport calculations. In this upgrade, some new tally functions have been added and the input data format has been made considerably more user friendly. Owing to these new calculation functions and utilities, NMTC/JAM enables us to carry out reliable neutronics studies of large-scale target systems with complex geometry more accurately and easily than before. This report serves as a user manual of the code. (author)
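
    The core idea of Monte Carlo particle transport, sampling exponential free paths against a macroscopic cross section, can be illustrated with a toy slab-transmission estimate (illustrative physics only; NMTC/JAM's nuclear models are vastly richer):

    ```python
    import math
    import random

    # Toy Monte Carlo transport sketch: estimate the uncollided transmission
    # of particles through a slab of thickness L with total macroscopic cross
    # section sigma by sampling exponential free paths. The analytic answer
    # is exp(-sigma * L), which the estimate should approach.

    def transmission(sigma, L, n_particles, seed=1):
        rng = random.Random(seed)
        passed = 0
        for _ in range(n_particles):
            # Sample a free path from the exponential distribution.
            free_path = -math.log(1.0 - rng.random()) / sigma
            if free_path > L:
                passed += 1
        return passed / n_particles
    ```

    For sigma = 1.0 and L = 2.0 the estimate converges toward exp(-2) ≈ 0.135 as the particle count grows.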

  20. Improved entropy encoding for high efficient video coding standard

    Directory of Open Access Journals (Sweden)

    B.S. Sunil Kumar

    2018-03-01

    Full Text Available The High Efficiency Video Coding (HEVC) standard has better coding efficiency, but its encoding performance has to be improved to meet the needs of growing multimedia applications. This paper improves the standard entropy encoding by introducing optimized weighting parameters, so that a higher rate of compression can be accomplished over the standard entropy encoding. The optimization is performed using the recently introduced firefly algorithm. The experimentation is carried out using eight benchmark video sequences, and the PSNR for varying rates of data transmission is investigated. A comparative analysis based on the performance statistics is made with the standard entropy encoding. From the obtained results, it is clear that the proposed method preserves the originality of the decoded video sequence far better than the standard entropy encoding, even though the compression rate is increased. Keywords: Entropy, Encoding, HEVC, PSNR, Compression
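
    The firefly algorithm mentioned above is a population-based search in which each candidate moves toward brighter (lower-cost) candidates. A minimal one-dimensional sketch, tuning a single hypothetical weighting parameter against a toy cost function (not the paper's actual objective):

    ```python
    import math
    import random

    # Minimal firefly search minimizing cost(w) over [lo, hi].
    # Attraction decays with squared distance; a shrinking random walk
    # keeps exploration alive early on. All parameters are illustrative.

    def firefly_minimize(cost, lo, hi, n=15, iters=60,
                         beta0=1.0, gamma=1.0, alpha=0.3, seed=42):
        rng = random.Random(seed)
        xs = [rng.uniform(lo, hi) for _ in range(n)]
        best = min(xs, key=cost)
        for _ in range(iters):
            for i in range(n):
                for j in range(n):
                    if cost(xs[j]) < cost(xs[i]):      # firefly j is "brighter"
                        r2 = (xs[i] - xs[j]) ** 2
                        beta = beta0 * math.exp(-gamma * r2)
                        xs[i] += beta * (xs[j] - xs[i]) + alpha * (rng.random() - 0.5)
                        xs[i] = min(max(xs[i], lo), hi)
            alpha *= 0.97                              # cool the random walk
            cand = min(xs, key=cost)
            if cost(cand) < cost(best):
                best = cand
        return best
    ```

    With a quadratic toy cost centered at 3.0 on [0, 6], the returned parameter lands close to 3.0.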

  1. High Efficiency EBCOT with Parallel Coding Architecture for JPEG2000

    Directory of Open Access Journals (Sweden)

    Chiang Jen-Shiun

    2006-01-01

    Full Text Available This work presents a parallel context-modeling coding architecture and a matching arithmetic coder (MQ-coder for the embedded block coding (EBCOT unit of the JPEG2000 encoder. Tier-1 of the EBCOT consumes most of the computation time in a JPEG2000 encoding system. The proposed parallel architecture can increase the throughput rate of the context modeling. To match the high throughput rate of the parallel context-modeling architecture, an efficient pipelined architecture for context-based adaptive arithmetic encoder is proposed. This encoder of JPEG2000 can work at 180 MHz to encode one symbol each cycle. Compared with the previous context-modeling architectures, our parallel architectures can improve the throughput rate up to 25%.

  2. The Number, Organization, and Size of Polymorphic Membrane Protein Coding Sequences as well as the Most Conserved Pmp Protein Differ within and across Chlamydia Species.

    Science.gov (United States)

    Van Lent, Sarah; Creasy, Heather Huot; Myers, Garry S A; Vanrompay, Daisy

    2016-01-01

    Variation is a central trait of the polymorphic membrane protein (Pmp) family. The number of pmp coding sequences differs between Chlamydia species, but it is unknown whether the number of pmp coding sequences is constant within a Chlamydia species. The level of conservation of the Pmp proteins has previously only been determined for Chlamydia trachomatis. As different Pmp proteins might be indispensable for the pathogenesis of different Chlamydia species, this study investigated the conservation of Pmp proteins both within and across C. trachomatis, C. pneumoniae, C. abortus, and C. psittaci. The pmp coding sequences were annotated in 16 C. trachomatis, 6 C. pneumoniae, 2 C. abortus, and 16 C. psittaci genomes. The number and organization of polymorphic membrane protein coding sequences differed within and across the analyzed Chlamydia species. The length of the coding sequences of pmpA, pmpB, and pmpH was conserved among all analyzed genomes, while the length of pmpE/F and pmpG, and remarkably also of the subtype pmpD, differed among the analyzed genomes. PmpD, PmpA, PmpH, and PmpA were the most conserved Pmp in C. trachomatis, C. pneumoniae, C. abortus, and C. psittaci, respectively. PmpB was the most conserved Pmp across the 4 analyzed Chlamydia species. © 2016 S. Karger AG, Basel.

  3. High explosive programmed burn in the FLAG code

    Energy Technology Data Exchange (ETDEWEB)

    Mandell, D.; Burton, D.; Lund, C.

    1998-02-01

    The models used to calculate the programmed burn high-explosive lighting times for two- and three-dimensions in the FLAG code are described. FLAG uses an unstructured polyhedra grid. The calculations were compared to exact solutions for a square in two dimensions and for a cube in three dimensions. The maximum error was 3.95 percent in two dimensions and 4.84 percent in three dimensions. The high explosive lighting time model described has the advantage that only one cell at a time needs to be considered.
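
    The exact solution the code is compared against has a simple closed form: for a point detonation in a uniform explosive, a cell lights at its distance from the detonation point divided by the detonation velocity. A minimal sketch (illustrative Python; the FLAG model itself propagates cell to cell on an unstructured grid):

    ```python
    import math

    # Closed-form programmed-burn lighting time for a point detonation in a
    # uniform explosive: t = |x_cell - x_det| / D. This is the exact solution
    # that grid-based lighting-time models are benchmarked against.

    def lighting_time(cell_center, det_point, det_velocity):
        dist = math.sqrt(sum((c - d) ** 2 for c, d in zip(cell_center, det_point)))
        return dist / det_velocity
    ```

    For example, a cell at (3, 4) with detonation at the origin and D = 5 lights at t = 1.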

  4. Structure-aided prediction of mammalian transcription factor complexes in conserved non-coding elements

    KAUST Repository

    Guturu, H.

    2013-11-11

    Mapping the DNA-binding preferences of transcription factor (TF) complexes is critical for deciphering the functions of cis-regulatory elements. Here, we developed a computational method that compares co-occurring motif spacings in conserved versus unconserved regions of the human genome to detect evolutionarily constrained binding sites of rigid TF complexes. Structural data were used to estimate TF complex physical plausibility, explore overlapping motif arrangements seldom tackled by non-structure-aware methods, and generate and analyse three-dimensional models of the predicted complexes bound to DNA. Using this approach, we predicted 422 physically realistic TF complex motifs at 18% false discovery rate, the majority of which (326, 77%) contain some sequence overlap between binding sites. The set of mostly novel complexes is enriched in known composite motifs, predictive of binding site configurations in TF-TF-DNA crystal structures, and supported by ChIP-seq datasets. Structural modelling revealed three cooperativity mechanisms: direct protein-protein interactions, potentially indirect interactions and 'through-DNA' interactions. Indeed, 38% of the predicted complexes were found to contain four or more bases in which TF pairs appear to synergize through overlapping binding to the same DNA base pairs in opposite grooves or strands. Our TF complex and associated binding site predictions are available as a web resource at http://bejerano.stanford.edu/complex.

  5. Structure-aided prediction of mammalian transcription factor complexes in conserved non-coding elements

    KAUST Repository

    Guturu, H.; Doxey, A. C.; Wenger, A. M.; Bejerano, G.

    2013-01-01

    Mapping the DNA-binding preferences of transcription factor (TF) complexes is critical for deciphering the functions of cis-regulatory elements. Here, we developed a computational method that compares co-occurring motif spacings in conserved versus unconserved regions of the human genome to detect evolutionarily constrained binding sites of rigid TF complexes. Structural data were used to estimate TF complex physical plausibility, explore overlapping motif arrangements seldom tackled by non-structure-aware methods, and generate and analyse three-dimensional models of the predicted complexes bound to DNA. Using this approach, we predicted 422 physically realistic TF complex motifs at 18% false discovery rate, the majority of which (326, 77%) contain some sequence overlap between binding sites. The set of mostly novel complexes is enriched in known composite motifs, predictive of binding site configurations in TF-TF-DNA crystal structures, and supported by ChIP-seq datasets. Structural modelling revealed three cooperativity mechanisms: direct protein-protein interactions, potentially indirect interactions and 'through-DNA' interactions. Indeed, 38% of the predicted complexes were found to contain four or more bases in which TF pairs appear to synergize through overlapping binding to the same DNA base pairs in opposite grooves or strands. Our TF complex and associated binding site predictions are available as a web resource at http://bejerano.stanford.edu/complex.

  6. High-resolution finite-difference algorithms for conservation laws

    International Nuclear Information System (INIS)

    Towers, J.D.

    1987-01-01

    A new class of Total Variation Decreasing (TVD) schemes for 2-dimensional scalar conservation laws is constructed using either flux-limited or slope-limited numerical fluxes. The schemes are proven to have formal second-order accuracy in regions where neither u_x nor u_y vanishes. A new class of high-resolution large-time-step TVD schemes is constructed by adding flux-limited correction terms to the first-order accurate large-time-step version of the Engquist-Osher scheme. The use of the transport-collapse operator in place of the exact solution operator for the construction of difference schemes is studied. The production of spurious extrema by difference schemes is studied. A simple condition guaranteeing the nonproduction of spurious extrema is derived. A sufficient class of entropy inequalities for a conservation law with a flux having a single inflection point is presented. Finite-difference schemes satisfying a discrete version of each entropy inequality are only first-order accurate
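
    A small worked example of the slope-limited TVD idea (a generic minmod-limited upwind scheme for linear advection, not the specific large-time-step schemes of the abstract): the limited correction sharpens the first-order upwind update while the total variation never grows.

    ```python
    # Minmod-limited second-order upwind scheme for u_t + a u_x = 0 (a > 0),
    # periodic boundaries, CFL number 0 < c <= 1. The minmod limiter drops to
    # first order at extrema, which is what keeps the scheme TVD.

    def minmod(a, b):
        if a * b <= 0:
            return 0.0
        return min(abs(a), abs(b)) * (1 if a > 0 else -1)

    def tvd_step(u, c):
        n = len(u)
        # Limited slopes from left and right one-sided differences.
        s = [minmod(u[i] - u[i - 1], u[(i + 1) % n] - u[i]) for i in range(n)]
        # MUSCL-type interface values at the right face of each cell.
        f = [u[i] + 0.5 * (1 - c) * s[i] for i in range(n)]
        # Conservative flux-difference update (f[-1] wraps periodically).
        return [u[i] - c * (f[i] - f[i - 1]) for i in range(n)]

    def total_variation(u):
        n = len(u)
        return sum(abs(u[i] - u[i - 1]) for i in range(n))
    ```

    Advecting a step profile one substep leaves the total variation non-increasing and conserves the cell sum exactly, the two discrete properties these schemes are built around.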

  7. Dynamic Epigenetic Control of Highly Conserved Noncoding Elements

    KAUST Repository

    Seridi, Loqmane

    2014-10-07

    Background Many noncoding genomic loci have remained constant over long evolutionary periods, suggesting that they are exposed to strong selective pressures. The molecular functions of these elements have been partially elucidated, but the fundamental reason for their extreme conservation is still unknown. Results To gain new insights into the extreme selection of highly conserved noncoding elements (HCNEs), we used a systematic analysis of multi-omic data to study the epigenetic regulation of such elements during the development of Drosophila melanogaster. At the sequence level, HCNEs are GC-rich and have a characteristic oligomeric composition. They have higher levels of stable nucleosome occupancy than their flanking regions, and lower levels of mononucleosomes and H3.3, suggesting that these regions reside in compact chromatin. Furthermore, these regions showed remarkable modulations in histone modification and the expression levels of adjacent genes during development. Although HCNEs are primarily initiated late in replication, about 10% were related to early replication origins. Finally, HCNEs showed strong enrichment within lamina-associated domains. Conclusion HCNEs have distinct and protective sequence properties, undergo dynamic epigenetic regulation, and appear to be associated with the structural components of the chromatin, replication origins, and nuclear matrix. These observations indicate that such elements are likely to have essential cellular functions, and offer insights into their epigenetic properties.
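
    The sequence-level observations above (GC richness, characteristic oligomer composition) rest on simple computations that can be sketched directly; the sequences below are hypothetical, not data from the study:

    ```python
    # Toy sequence-level screen: GC content and k-mer (oligomer) counts of a
    # candidate element, the kind of statistics compared against flanking
    # regions when characterizing HCNEs.

    def gc_content(seq):
        return (seq.count("G") + seq.count("C")) / len(seq)

    def kmer_counts(seq, k=2):
        counts = {}
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            counts[kmer] = counts.get(kmer, 0) + 1
        return counts
    ```

    An element would be compared to its flank by contrasting these values, e.g. a higher `gc_content` inside the element than outside.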

  8. Dynamic Epigenetic Control of Highly Conserved Noncoding Elements

    KAUST Repository

    Seridi, Loqmane; Ryu, Tae Woo; Ravasi, Timothy

    2014-01-01

    Background Many noncoding genomic loci have remained constant over long evolutionary periods, suggesting that they are exposed to strong selective pressures. The molecular functions of these elements have been partially elucidated, but the fundamental reason for their extreme conservation is still unknown. Results To gain new insights into the extreme selection of highly conserved noncoding elements (HCNEs), we used a systematic analysis of multi-omic data to study the epigenetic regulation of such elements during the development of Drosophila melanogaster. At the sequence level, HCNEs are GC-rich and have a characteristic oligomeric composition. They have higher levels of stable nucleosome occupancy than their flanking regions, and lower levels of mononucleosomes and H3.3, suggesting that these regions reside in compact chromatin. Furthermore, these regions showed remarkable modulations in histone modification and the expression levels of adjacent genes during development. Although HCNEs are primarily initiated late in replication, about 10% were related to early replication origins. Finally, HCNEs showed strong enrichment within lamina-associated domains. Conclusion HCNEs have distinct and protective sequence properties, undergo dynamic epigenetic regulation, and appear to be associated with the structural components of the chromatin, replication origins, and nuclear matrix. These observations indicate that such elements are likely to have essential cellular functions, and offer insights into their epigenetic properties.

  9. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
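
    The cumulants under study are computed from central moments of the event-by-event distribution; a minimal sketch of the first four (biased sample estimators, for illustration only):

    ```python
    # First four cumulants of a sample from central moments:
    # C1 = mean, C2 = m2, C3 = m3, C4 = m4 - 3*m2^2.
    # These are the biased (1/n) estimators, sufficient for illustration.

    def cumulants(data):
        n = len(data)
        mean = sum(data) / n
        def m(p):
            return sum((x - mean) ** p for x in data) / n
        return mean, m(2), m(3), m(4) - 3 * m(2) ** 2
    ```

    For a symmetric sample the odd central cumulant C3 vanishes, while C4 measures the deviation of the tails from a Gaussian.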

  10. Coded aperture subreflector array for high resolution radar imaging

    Science.gov (United States)

    Lynch, Jonathan J.; Herrault, Florian; Kona, Keerti; Virbila, Gabriel; McGuire, Chuck; Wetzel, Mike; Fung, Helen; Prophet, Eric

    2017-05-01

    HRL Laboratories has been developing a new approach for high resolution radar imaging on stationary platforms. High angular resolution is achieved by operating at 235 GHz and using a scalable tile phased array architecture that has the potential to realize thousands of elements at an affordable cost. HRL utilizes aperture coding techniques to minimize the size and complexity of the RF electronics needed for beamforming, and wafer level fabrication and integration allow tiles containing 1024 elements to be manufactured with reasonable costs. This paper describes the results of an initial feasibility study for HRL's Coded Aperture Subreflector Array (CASA) approach for a 1024 element micromachined antenna array with integrated single-bit phase shifters. Two candidate electronic device technologies were evaluated over the 170 - 260 GHz range, GaN HEMT transistors and GaAs Schottky diodes. Array structures utilizing silicon micromachining and die bonding were evaluated for etch and alignment accuracy. Finally, the overall array efficiency was estimated to be about 37% (not including spillover losses) using full wave array simulations and measured device performance, which is a reasonable value at 235 GHz. Based on the measured data we selected GaN HEMT devices operated passively with 0V drain bias due to their extremely low DC power dissipation.

  11. High efficiency video coding (HEVC) algorithms and architectures

    CERN Document Server

    Budagavi, Madhukar; Sullivan, Gary

    2014-01-01

    This book provides developers, engineers, researchers and students with detailed knowledge about the High Efficiency Video Coding (HEVC) standard. HEVC is the successor to the widely successful H.264/AVC video compression standard, and it provides around twice as much compression as H.264/AVC for the same level of quality. The applications for HEVC will not only cover the space of the well-known current uses and capabilities of digital video – they will also include the deployment of new services and the delivery of enhanced video quality, such as ultra-high-definition television (UHDTV) and video with higher dynamic range, wider range of representable color, and greater representation precision than what is typically found today. HEVC is the next major generation of video coding design – a flexible, reliable and robust solution that will support the next decade of video applications and ease the burden of video on world-wide network traffic. This book provides a detailed explanation of the various parts ...

  12. Construction of Short-length High-rates Ldpc Codes Using Difference Families

    OpenAIRE

    Deny Hamdani; Ery Safrianti

    2007-01-01

    Low-density parity-check (LDPC) code is a linear-block error-correcting code defined by a sparse parity-check matrix. It is decoded using the message-passing algorithm and is, in many cases, capable of outperforming turbo codes. This paper presents a class of low-density parity-check (LDPC) codes showing good performance with low encoding complexity. The code is constructed using difference families from combinatorial design. The resulting code, which is designed to have short code length and high code r...

  13. Highly conserved small subunit residues influence rubisco large subunit catalysis.

    Science.gov (United States)

    Genkov, Todor; Spreitzer, Robert J

    2009-10-30

    The chloroplast enzyme ribulose 1,5-bisphosphate carboxylase/oxygenase (Rubisco) catalyzes the rate-limiting step of photosynthetic CO(2) fixation. With a deeper understanding of its structure-function relationships and competitive inhibition by O(2), it may be possible to engineer an increase in agricultural productivity and renewable energy. The chloroplast-encoded large subunits form the active site, but the nuclear-encoded small subunits can also influence catalytic efficiency and CO(2)/O(2) specificity. To further define the role of the small subunit in Rubisco function, the 10 most conserved residues in all small subunits were substituted with alanine by transformation of a Chlamydomonas reinhardtii mutant that lacks the small subunit gene family. All the mutant strains were able to grow photosynthetically, indicating that none of the residues is essential for function. Three of the substitutions have little or no effect (S16A, P19A, and E92A), one primarily affects holoenzyme stability (L18A), and the remainder affect catalysis with or without some level of associated structural instability (Y32A, E43A, W73A, L78A, P79A, and F81A). Y32A and E43A cause decreases in CO(2)/O(2) specificity. Based on the x-ray crystal structure of Chlamydomonas Rubisco, all but one (Glu-92) of the conserved residues are in contact with large subunits and cluster near the amino- or carboxyl-terminal ends of large subunit alpha-helix 8, which is a structural element of the alpha/beta-barrel active site. Small subunit residues Glu-43 and Trp-73 identify a possible structural connection between active site alpha-helix 8 and the highly variable small subunit loop between beta-strands A and B, which can also influence Rubisco CO(2)/O(2) specificity.

  14. The WARP Code: Modeling High Intensity Ion Beams

    International Nuclear Information System (INIS)

    Grote, David P.; Friedman, Alex; Vay, Jean-Luc; Haber, Irving

    2005-01-01

    The Warp code, developed for heavy-ion driven inertial fusion energy studies, is used to model high intensity ion (and electron) beams. Significant capability has been incorporated in Warp, allowing nearly all sections of an accelerator to be modeled, beginning with the source. Warp has as its core an explicit, three-dimensional, particle-in-cell model. Alongside this is a rich set of tools for describing the applied fields of the accelerator lattice, and embedded conducting surfaces (which are captured at sub-grid resolution). Also incorporated are models with reduced dimensionality: an axisymmetric model and a transverse "slice" model. The code takes advantage of modern programming techniques, including object orientation, parallelism, and scripting (via Python). It is at the forefront in the use of the computational technique of adaptive mesh refinement, which has been particularly successful in the area of diode and injector modeling, both steady-state and time-dependent. In the presentation, some of the major aspects of Warp will be overviewed, especially those that could be useful in modeling ECR sources. Warp has been benchmarked against both theory and experiment. Recent results will be presented showing good agreement of Warp with experimental results from the STS500 injector test stand
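
    The particle-in-cell core mentioned above starts from charge deposition onto a grid. A toy 1D cloud-in-cell (linear-weighting) deposition, not Warp's 3D solver, shows the key conservation property:

    ```python
    # Cloud-in-cell (linear) charge deposition onto a periodic 1D grid, the
    # first stage of a PIC cycle. Linear weighting splits each particle's
    # charge between its two nearest cells, so total charge is conserved
    # exactly by construction.

    def deposit_charge(positions, charge, n_cells, length):
        dx = length / n_cells
        rho = [0.0] * n_cells
        for x in positions:
            s = (x % length) / dx       # position in cell units
            i = int(s)
            frac = s - i                # fractional offset within the cell
            rho[i] += charge * (1.0 - frac)
            rho[(i + 1) % n_cells] += charge * frac
        return rho
    ```

    The field solve and particle push that complete the PIC cycle would follow from this grid charge density.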

  15. The WARP Code: Modeling High Intensity Ion Beams

    International Nuclear Information System (INIS)

    Grote, D P; Friedman, A; Vay, J L; Haber, I

    2004-01-01

    The Warp code, developed for heavy-ion driven inertial fusion energy studies, is used to model high intensity ion (and electron) beams. Significant capability has been incorporated in Warp, allowing nearly all sections of an accelerator to be modeled, beginning with the source. Warp has as its core an explicit, three-dimensional, particle-in-cell model. Alongside this is a rich set of tools for describing the applied fields of the accelerator lattice, and embedded conducting surfaces (which are captured at sub-grid resolution). Also incorporated are models with reduced dimensionality: an axisymmetric model and a transverse "slice" model. The code takes advantage of modern programming techniques, including object orientation, parallelism, and scripting (via Python). It is at the forefront in the use of the computational technique of adaptive mesh refinement, which has been particularly successful in the area of diode and injector modeling, both steady-state and time-dependent. In the presentation, some of the major aspects of Warp will be overviewed, especially those that could be useful in modeling ECR sources. Warp has been benchmarked against both theory and experiment. Recent results will be presented showing good agreement of Warp with experimental results from the STS500 injector test stand. Additional information can be found on the web page http://hif.lbl.gov/theory/WARP( ) summary.html

  16. Conservative Analysis of TOP and LOF for KALIMER-600 with the SSC-K code

    International Nuclear Information System (INIS)

    Jeong, H. Y.; Ha, K. S.; Kwon, Y. M.; Suk, S. D.; Lee, K. L.; Lee, Y. B.; Cho, C. H.

    2009-01-01

    KALIMER-600 is designed to satisfy the safety principle of defense-in-depth and also the safety design objectives that have been established to implement this principle in the design. Highly reliable, diversified shutdown mechanisms provide the reactivity control function during an accident or abnormal transients in KALIMER-600. The reactivity is also controlled by the inherent reactivity feedback mechanisms incorporated in the design. In addition, a uniquely designed passive decay heat removal circuit provides the heat removal function. Due to these passive and inherent safety characteristics, the safety of KALIMER-600 is much improved over existing PWR designs. Therefore, events whose frequencies are higher than 10^-7 per reactor-year are categorized as design basis events (DBEs). The safety analysis has been performed for the TOP and LOF events, which are the two most important DBEs in KALIMER-600. The analysis results show that the fuel, clad, and coolant temperatures are well within the safety limit temperatures. Therefore, the KALIMER-600 design fulfills the design basis safety criteria with no fuel damage and no threat to its structural integrity during the transients. Through the analysis, it is clearly shown that the KALIMER-600 design maintains the safety functions required for the mitigation of accidents with an appropriate margin. Therefore, it is concluded that the KALIMER-600 breakeven core design ensures the safety margins for the considered DBEs

  17. Rate adaptive multilevel coded modulation with high coding gain in intensity modulation direct detection optical communication

    Science.gov (United States)

    Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao

    2018-02-01

    A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on fixed code length and a corresponding decoding scheme are proposed. The RA-MLC scheme combines multilevel coding and modulation technology with binary linear block codes at the transmitter. Bit division, coding, optional interleaving, and modulation are carried out by a preset rule, then transmitted through a standard single-mode fiber span of 100 km. The receiver improves decoding accuracy by passing soft information between the different layers, which enhances the performance. Simulations are carried out in an intensity modulation-direct detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve a bit error rate of 1E-5 when the optical signal-to-noise ratio is 20.7 dB. It also reduced the number of decoders by 72% and realized adaptation over 22 rates without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER=1E-3.
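
    The multilevel-coding idea, giving each bit level its own binary code and mapping the coded bit-tuples to modulation symbols, can be sketched with toy components (a rate-1/3 repetition code standing in for the per-level block codes; this is not the paper's RA-MLC construction):

    ```python
    # Toy multilevel coding sketch: each bit level carries its own binary
    # code (here a placeholder rate-1/3 repetition code on one level), and
    # the per-position bit-tuples across levels address 2^L-ASK amplitudes.

    def repeat3(bits):
        """Placeholder per-level encoder: rate-1/3 repetition code."""
        return [b for b in bits for _ in range(3)]

    def map_levels_to_symbols(levels):
        """levels: list of L equal-length coded bit lists.
        Returns one 2^L-ASK amplitude per bit position (natural labeling)."""
        m = 2 ** len(levels)
        symbols = []
        for tup in zip(*levels):
            idx = sum(bit << l for l, bit in enumerate(tup))
            symbols.append(2 * idx - (m - 1))   # amplitudes -(m-1) ... +(m-1)
        return symbols
    ```

    Rate adaptation in such a scheme amounts to swapping the per-level code rates while the mapper stays fixed.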

  18. Particle In Cell Codes on Highly Parallel Architectures

    Science.gov (United States)

    Tableman, Adam

    2014-10-01

    We describe strategies and examples of Particle-In-Cell codes running on Nvidia GPU and Intel Phi architectures. This includes basic implementations in skeleton codes and full-scale development versions (encompassing 1D, 2D, and 3D codes) in Osiris. Both the similarities and differences between Intel's and Nvidia's hardware will be examined. Work supported by grants NSF ACI 1339893, DOE DE SC 000849, DOE DE SC 0008316, DOE DE NA 0001833, and DOE DE FC02 04ER 54780.

  19. The KFA-Version of the high-energy transport code HETC and the generalized evaluation code SIMPEL

    International Nuclear Information System (INIS)

    Cloth, P.; Filges, D.; Sterzenbach, G.; Armstrong, T.W.; Colborn, B.L.

    1983-03-01

    This document describes the updates that have been made to the high-energy transport code HETC for use in the German spallation-neutron source project SNQ. The performance and purpose of the subsidiary code SIMPEL, written for general analysis of the HETC output, are also described. In addition, a means of coupling to low-energy transport programs, such as the Monte Carlo code MORSE, is provided. As complete input descriptions for HETC and SIMPEL are given together with a sample problem, this document can serve as a user's manual for these two codes. The document is also an answer to the request, issued by the wider community of HETC users at the ICANS-IV meeting (Oct 20-24, 1980, Tsukuba-gun, Japan), for a complete description of at least one version of HETC among the many different versions that exist. (orig.)

  20. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes which are used in support of a license application for high-level waste disposal. The guidelines cover theoretical basis, programming, and instructions for use of the code

  1. Validation of SCALE code package on high performance neutron shields

    International Nuclear Information System (INIS)

    Bace, M.; Jecmenica, R.; Smuc, T.

    1999-01-01

    The shielding ability and other properties of new high performance neutron shielding materials from the KRAFTON series have recently been published. The published experimental and MCNP results for two materials of the KRAFTON series have been compared with our own calculations. Two control modules of the SCALE-4.4 code system have been used, one based on one-dimensional radiation transport analysis (SAS1) and the other based on the three-dimensional Monte Carlo method (SAS3). The comparison of the calculated neutron dose equivalent rates shows good agreement between experimental and calculated results for the KRAFTON-N2 material. Our results indicate that the N2-M-N2 sandwich type is approximately 10% inferior as a neutron shield to the KRAFTON-N2 material. All values of neutron dose equivalent obtained by SAS1 are approximately 25% lower than the SAS3 results, which indicates the magnitude of the discrepancies introduced by the one-dimensional geometry approximation.(author)

  2. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  3. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer; Ghanem, Bernard

    2017-01-01

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images

  4. Eucaryotic operon genes can define highly conserved syntenies

    Czech Academy of Sciences Publication Activity Database

    Trachtulec, Zdeněk

    2004-01-01

    Roč. 50, - (2004), s. 1-6 ISSN 0015-5500 R&D Projects: GA ČR GA204/01/0997; GA MŠk LN00A079 Institutional research plan: CEZ:AV0Z5052915 Keywords : eukaryotic operon * conserved synteny Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 0.507, year: 2004

  5. Construction of Short-Length High-Rates LDPC Codes Using Difference Families

    Directory of Open Access Journals (Sweden)

    Deny Hamdani

    2010-10-01

    Full Text Available A low-density parity-check (LDPC) code is a linear block error-correcting code defined by a sparse parity-check matrix. It is decoded using the message-passing algorithm and, in many cases, is capable of outperforming turbo codes. This paper presents a class of low-density parity-check (LDPC) codes showing good performance with low encoding complexity. The code is constructed using difference families from combinatorial design. The resulting code, which is designed to have short code length and high code rate, can be encoded with low complexity due to its quasi-cyclic structure, and performs well when it is iteratively decoded with the sum-product algorithm. These properties of LDPC codes are quite suitable for applications in future wireless local area networks.
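    The quasi-cyclic structure described in the abstract can be illustrated with a toy construction: a parity-check matrix assembled from circulant permutation matrices, each an identity matrix cyclically shifted. The shift values below are illustrative only, not taken from the article's difference families.

```python
import numpy as np

def circulant_permutation(m, shift):
    """m x m identity matrix with columns cyclically shifted by `shift`."""
    return np.roll(np.eye(m, dtype=int), shift, axis=1)

def qc_ldpc_parity_check(shifts, m):
    """Assemble a quasi-cyclic H from a grid of circulant shift values."""
    rows = []
    for shift_row in shifts:
        blocks = [circulant_permutation(m, s) for s in shift_row]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

# Illustrative 2x4 base matrix of shifts (hypothetical, not from the article).
shifts = [[0, 1, 2, 4],
          [0, 2, 5, 3]]
H = qc_ldpc_parity_check(shifts, m=7)
print(H.shape)        # (14, 28): rate >= 1 - 14/28 = 1/2
print(H.sum(axis=0))  # every column has weight 2, i.e. H is sparse
```

    The sparsity (constant column weight 2, row weight 4 here) is what keeps sum-product decoding cheap, and the circulant blocks are what make shift-register encoders practical.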

  6. A new relativistic viscous hydrodynamics code and its application to the Kelvin-Helmholtz instability in high-energy heavy-ion collisions

    Science.gov (United States)

    Okamoto, Kazuhisa; Nonaka, Chiho

    2017-06-01

    We construct a new relativistic viscous hydrodynamics code optimized for the Milne coordinates. We split the conservation equations into an ideal part and a viscous part using the Strang splitting method. In the code, a Riemann solver based on the two-shock approximation is utilized for the ideal part and the Piecewise Exact Solution (PES) method is applied for the viscous part. We check the validity of our numerical calculations by comparison with analytical solutions: viscous Bjorken flow and the Israel-Stewart theory in the Gubser flow regime. Using the code, we discuss the possible development of the Kelvin-Helmholtz instability in high-energy heavy-ion collisions.
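    The Strang splitting mentioned in the abstract advances one sub-operator a half step, the other a full step, then the first a half step again, giving second-order accuracy overall. The sketch below shows the pattern on a toy scalar decay equation; the operators and solvers are placeholders, not the hydrodynamics code's Riemann/PES solvers.

```python
import numpy as np

def strang_step(u, dt, step_A, step_B):
    """One Strang-split step: half step of B, full step of A, half step of B."""
    u = step_B(u, dt / 2)
    u = step_A(u, dt)
    u = step_B(u, dt / 2)
    return u

# Toy model: du/dt = -a*u ("ideal" part) - b*u ("viscous" part).
a, b = 1.0, 0.5
step_A = lambda u, dt: u * np.exp(-a * dt)  # exact sub-solver for part A
step_B = lambda u, dt: u * np.exp(-b * dt)  # exact sub-solver for part B

dt, steps = 0.01, 100   # integrate to t = 1.0
u = 1.0
for _ in range(steps):
    u = strang_step(u, dt, step_A, step_B)

exact = np.exp(-(a + b) * 1.0)
print(abs(u - exact))   # tiny: the splitting is exact when the parts commute
```

    For non-commuting operators, as in the real ideal/viscous split, the leading splitting error is O(dt^3) per step, which is why the symmetric half-full-half ordering is preferred over simple sequential splitting.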

  7. Critical review of conservation equations for two-phase flow in the U.S. NRC TRACE code

    International Nuclear Information System (INIS)

    Wulff, Wolfgang

    2011-01-01

    Research highlights: → Field equations as implemented in TRACE are incorrect. → Boundary conditions needed for cooling of nuclear fuel elements are wrong. → The two-fluid model in TRACE is not closed. → Three-dimensional flow modeling in TRACE has no basis. - Abstract: The field equations for two-phase flow in the computer code TRAC/RELAP Advanced Computational Engine, or TRACE, are examined to determine their validity and their capabilities and limitations in resolving nuclear reactor safety issues. TRACE was developed for the NRC to predict thermohydraulic phenomena in nuclear power plants during operational transients and postulated accidents. TRACE is based on the rigorously derived and well-established two-fluid field equations for 1-D and 3-D two-phase flow. It is shown that: (1) The two-fluid field equations for mass conservation as implemented in TRACE are wrong, because local mass balances in TRACE are in conflict with mass conservation for the whole reactor system, as shown in Section . (2) Wrong equations of motion are used in TRACE in place of momentum balances, compromising at branch points the prediction of momentum transfer between, and the coupling of, loops in hydraulic networks by impedance (form loss and wall shear) and by inertia, and thereby the simulation of reactor component interactions. (3) Most seriously, TRACE calculation of heat transfer from fuel elements is incorrect for single- and two-phase flows, because Eq. of the TRACE Manual is wrong (see Section ). (4) Boundary conditions for momentum and energy balances in TRACE are restricted to flow regimes with single-phase wall contact, because TRACE lacks constitutive relations for solid-fluid exchange of momentum and heat in the prevailing flow regimes. Without a quantified assessment of the consequences of (3) and (4), predictions of phasic fluid velocities, fuel temperatures and important safety parameters, e.g., peak clad temperature, are questionable. Moreover, TRACE cannot predict 3-D single- or

  8. Energy conservation and pomeron loops in high energy evolution

    International Nuclear Information System (INIS)

    Gustafson, Goesta

    2007-01-01

    We present a formalism which modifies the Mueller Dipole Model such that it incorporates energy-momentum conservation as well as important colour-suppressed effects in the cascade evolution. The formalism is implemented in a Monte Carlo simulation program, and the results are compared to inclusive data from HERA and the Tevatron. We find generally very good agreement between our model and the experimental data. (author)

  9. RNA expression in a cartilaginous fish cell line reveals ancient 3′ noncoding regions highly conserved in vertebrates

    Science.gov (United States)

    Forest, David; Nishikawa, Ryuhei; Kobayashi, Hiroshi; Parton, Angela; Bayne, Christopher J.; Barnes, David W.

    2007-01-01

    We have established a cartilaginous fish cell line [Squalus acanthias embryo cell line (SAE)], a mesenchymal stem cell line derived from the embryo of an elasmobranch, the spiny dogfish shark S. acanthias. Elasmobranchs (sharks and rays) first appeared >400 million years ago, and existing species provide useful models for comparative vertebrate cell biology, physiology, and genomics. Comparative vertebrate genomics among evolutionarily distant organisms can provide sequence conservation information that facilitates identification of critical coding and noncoding regions. Although these genomic analyses are informative, experimental verification of functions of genomic sequences depends heavily on cell culture approaches. Using ESTs defining mRNAs derived from the SAE cell line, we identified lengthy and highly conserved gene-specific nucleotide sequences in the noncoding 3′ UTRs of eight genes involved in the regulation of cell growth and proliferation. Conserved noncoding 3′ mRNA regions detected by using the shark nucleotide sequences as a starting point were found in a range of other vertebrate orders, including bony fish, birds, amphibians, and mammals. Nucleotide identity of shark and human in these regions was remarkably well conserved. Our results indicate that highly conserved gene sequences dating from the appearance of jawed vertebrates and representing potential cis-regulatory elements can be identified through the use of cartilaginous fish as a baseline. Because the expression of genes in the SAE cell line was prerequisite for their identification, this cartilaginous fish culture system also provides a physiologically valid tool to test functional hypotheses on the role of these ancient conserved sequences in comparative cell biology. PMID:17227856
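    The conservation measure underlying such comparisons, percent identity over an aligned window, is simple to compute. A minimal sketch; the 20-nt sequences below are hypothetical placeholders, not actual shark or human 3′ UTR data:

```python
def percent_identity(seq1, seq2):
    """Ungapped percent identity of two equal-length aligned sequences."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq1, seq2))
    return 100.0 * matches / len(seq1)

# Hypothetical aligned 20-nt window (one mismatch near the 3' end).
shark = "ACGTTGCAACGTACGTTGCA"
human = "ACGTTGCAACGTACGTTGGA"
print(percent_identity(shark, human))  # 95.0
```

    In practice such windows are scanned along full alignments, and stretches whose identity stays far above the neutral background over hundreds of bases are the candidates for conserved regulatory elements.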

  10. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

    Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) 3-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays it is common to use graphics devices for intensive computational problems. There are several ways to use this extreme processing performance, but never before has it been so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS. It is ten times faster than the fastest dual-core CPU [Fig.1]. Our improved MD simulation software uses this new technology, which speeds it up so that the critical calculation code segment runs 10 times faster. Although the GPU is a very powerful tool, it has a strongly parallel structure. This means that we have to create an algorithm which works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is 100 times slower. It is possible to implement the total algorithm on the GPU, so we do not need to download and upload the data in every iteration. For maximal throughput, every thread runs with the same instructions
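    As a rough illustration of the core loop such an MD code parallelizes, here is a minimal velocity-Verlet integrator for a single harmonic degree of freedom. The potential is a toy placeholder, not the Be-crystal interaction from the article:

```python
import numpy as np

def velocity_verlet(x, v, force, m, dt, steps):
    """Minimal velocity-Verlet integrator, the standard MD time stepper."""
    a = force(x) / m
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt**2   # position update
        a_new = force(x) / m               # force at the new position
        v = v + 0.5 * (a + a_new) * dt     # velocity update with averaged force
        a = a_new
    return x, v

# Toy system: one atom in a harmonic well, F = -k x (hypothetical parameters).
k, m = 1.0, 1.0
force = lambda x: -k * x
x, v = velocity_verlet(x=1.0, v=0.0, force=force, m=m, dt=0.01, steps=10000)

energy = 0.5 * m * v**2 + 0.5 * k * x**2   # initial energy was 0.5
print(abs(energy - 0.5))  # stays tiny: symplectic integrators bound energy drift
```

    In a GPU port of this loop, the force evaluation for each atom maps naturally onto one thread, which is why MD is a good fit for the stream-processor architecture the authors describe.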

  11. Usherin expression is highly conserved in mouse and human tissues.

    Science.gov (United States)

    Pearsall, Nicole; Bhattacharya, Gautam; Wisecarver, Jim; Adams, Joe; Cosgrove, Dominic; Kimberling, William

    2002-12-01

    Usher syndrome is an autosomal recessive disease that results in varying degrees of hearing loss and retinitis pigmentosa. Three types of Usher syndrome (I, II, and III) have been identified clinically, with Usher type II being the most common of the three. Usher type II has been localized to three different chromosomal regions, 1q41, 3p, and 5q, corresponding to Usher types 2A, 2B, and 2C, respectively. Usherin is a basement membrane protein encoded by the USH2A gene. Expression of usherin has been localized to the basement membrane of several tissues; however, it is not ubiquitous. Immunohistochemistry detected usherin in the following human tissues: retina, cochlea, small and large intestine, pancreas, bladder, prostate, esophagus, trachea, thymus, salivary glands, placenta, ovary, fallopian tube, uterus, and testis. Usherin was absent in many other tissues such as heart, lung, liver, kidney, and brain. This distribution is consistent with the usherin distribution seen in the mouse. Conservation of usherin is also seen at the nucleotide and amino acid level when comparing the mouse and human gene sequences. Evolutionary conservation of usherin expression at the molecular level, and in tissues unaffected by Usher 2a, supports the important structural and functional role this protein plays in the human. In addition, we believe that these results could lead to a diagnostic procedure for the detection of Usher syndrome and of those who carry an USH2A mutation.

  12. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer

    2017-12-25

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images independently. However, learning multidimensional dictionaries and sparse codes for the reconstruction of multidimensional data is very important, as it examines correlations among all the data jointly. This provides more capacity for the learned dictionaries to better reconstruct data. In this paper, we propose a generic and novel formulation for the CSC problem that can handle an arbitrary-order tensor of data. Backed by experimental results, our proposed formulation can not only tackle applications that are not possible with standard CSC solvers, including colored video reconstruction (5D tensors), but it also performs favorably in reconstruction with far fewer parameters as compared to naive extensions of standard CSC to multiple features/channels.

  13. Development of THYDE-HTGR: computer code for transient thermal-hydraulics of high-temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Hirano, Masashi; Hada, Kazuhiko

    1990-04-01

    The THYDE-HTGR code has been developed for transient thermal-hydraulic analyses of high-temperature gas-cooled reactors, based on the THYDE-W code. THYDE-W is a code developed at JAERI for the simulation of Light Water Reactor plant dynamics during various types of transients including loss-of-coolant accidents. THYDE-HTGR solves the conservation equations of mass, momentum and energy for compressible gas, or single-phase or two-phase flow. The major code modification from THYDE-W is to treat helium loops as well as water loops. In parallel to this, modification has been made for the neutron kinetics to be applicable to helium-cooled graphite-moderated reactors, for the heat transfer models to be applicable to various types of heat exchangers, and so forth. In order to assess the validity of the modifications, analyses of some of the experiments conducted at the High Temperature Test Loop of ERANS have been performed. In this report, the models applied in THYDE-HTGR are described focusing on the present modifications and the results from the assessment calculations are presented. (author)

  14. Comparison of the ENIGMA code with experimental data on thermal performance, stable fission gas and iodine release at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Killeen, J C [Nuclear Electric plc, Barnwood (United Kingdom)

    1997-08-01

    The predictions of the ENIGMA code have been compared with data from high burn-up fuel experiments in the Halden and RISO reactors. The experiments modelled were IFA-504 and IFA-558 from Halden and test II-5 from the RISO power burn-up test series. The code modelled the fuel thermal performance well and provided a good measure of iodine release from pre-interlinked fuel. After interlinkage the iodine predictions remain a good fit for one experiment, but there is a significant overprediction for a second experiment (IFA-558). Stable fission gas release is also well modelled, and the predictions are within the expected uncertainty band throughout the burn-up range. This report presents code predictions for stable fission gas release to 40 GWd/tU, iodine release measurements to 50 GWd/tU and thermal performance (fuel centre temperature) to 55 GWd/tU. Fuel ratings of up to 38 kW/m were modelled at the high burn-up levels. The code is shown to predict all these parameters accurately or conservatively. (author). 1 ref., 6 figs.

  15. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

    High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover, the hardware code generated by HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  16. A Common histone modification code on C4 genes in maize and its conservation in Sorghum and Setaria italica.

    Science.gov (United States)

    Heimann, Louisa; Horst, Ina; Perduns, Renke; Dreesen, Björn; Offermann, Sascha; Peterhansel, Christoph

    2013-05-01

    C4 photosynthesis evolved more than 60 times independently in different plant lineages. Each time, multiple genes were recruited into C4 metabolism. The corresponding promoters acquired new regulatory features such as high expression, light induction, or cell type-specific expression in mesophyll or bundle sheath cells. We have previously shown that histone modifications contribute to the regulation of the model C4 phosphoenolpyruvate carboxylase (C4-Pepc) promoter in maize (Zea mays). We here tested the light- and cell type-specific responses of three selected histone acetylations and two histone methylations on five additional C4 genes (C4-Ca, C4-Ppdk, C4-Me, C4-Pepck, and C4-RbcS2) in maize. Histone acetylation and nucleosome occupancy assays indicated extended promoter regions with regulatory upstream regions more than 1,000 bp from the transcription initiation site for most of these genes. Despite any detectable homology of the promoters on the primary sequence level, histone modification patterns were highly coregulated. Specifically, H3K9ac was regulated by illumination, whereas H3K4me3 was regulated in a cell type-specific manner. We further compared histone modifications on the C4-Pepc and C4-Me genes from maize and the homologous genes from sorghum (Sorghum bicolor) and Setaria italica. Whereas sorghum and maize share a common C4 origin, C4 metabolism evolved independently in S. italica. The distribution of histone modifications over the promoters differed between the species, but differential regulation of light-induced histone acetylation and cell type-specific histone methylation were evident in all three species. We propose that a preexisting histone code was recruited into C4 promoter control during the evolution of C4 metabolism.

  17. Telomeric expression sites are highly conserved in Trypanosoma brucei.

    Directory of Open Access Journals (Sweden)

    Christiane Hertz-Fowler

    Full Text Available Subtelomeric regions are often under-represented in genome sequences of eukaryotes. One of the best-known examples of the use of telomere proximity for adaptive purposes is the bloodstream expression sites (BESs of the African trypanosome Trypanosoma brucei. To enhance our understanding of BES structure and function in host adaptation and immune evasion, the BES repertoire from the Lister 427 strain of T. brucei was independently tagged and sequenced. BESs are polymorphic in size and structure but reveal a surprisingly conserved architecture in the context of extensive recombination. Very small BESs do exist, and many functioning BESs do not contain the full complement of expression site associated genes (ESAGs. The consequences of duplicated or missing ESAGs, including ESAG9, a newly named ESAG12, and additional variant surface glycoprotein genes (VSGs were evaluated by functional assays after BESs were tagged with a drug-resistance gene. Phylogenetic analysis of constituent ESAG families suggests that BESs are sequence mosaics and that extensive recombination has shaped the evolution of the BES repertoire. This work opens important perspectives in understanding the molecular mechanisms of antigenic variation, a widely used strategy for immune evasion in pathogens, and telomere biology.

  18. Radioactivities evaluation code system for high temperature gas cooled reactors during normal operation

    International Nuclear Information System (INIS)

    Ogura, Kenji; Morimoto, Toshio; Suzuki, Katsuo.

    1979-01-01

    A radioactivity evaluation code system for high-temperature gas-cooled reactors during normal operation was developed to study the behavior of fission products (FPs) in the plants. The system consists of a code for the calculation of the diffusion of FPs in fuel (FIPERX), a code for the deposition of FPs in the primary cooling system (PLATO), a code for the transfer and emission of FPs in nuclear power plants (FIPPI-2), and a code for the exposure dose due to emitted FPs (FEDOSE). The FIPERX code can calculate the time evolution of the FP concentration distribution, the FP flow distribution, the FP partial pressure distribution, and the rate of FP emission into the coolant. The amount and distribution of FPs deposited in the primary cooling system can be evaluated by the PLATO code. The FIPPI-2 code can be used to estimate the amount of FPs in nuclear power plants and the amount of FPs emitted from the plants. The exposure dose of residents around a nuclear power plant during operation is calculated by the FEDOSE code. This code evaluates the dose due to external exposure in normal operation and in accidents, and the internal dose from inhalation of the radioactive plume and from foods. Further studies of this code system by comparison with experimental data are considered. (Kato, T.)

  19. Thermal-hydraulic code selection for modular high temperature gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Komen, E M.J.; Bogaard, J.P.A. van den

    1995-06-01

    In order to study the transient thermal-hydraulic system behaviour of modular high temperature gas-cooled reactors, the thermal-hydraulic computer codes RELAP5, MELCOR, THATCH, MORECA, and VSOP are considered at the Netherlands Energy Research Foundation ECN. This report presents the selection of the most appropriate codes. To cover the range of relevant accidents, a suite of three codes is recommended for analyses of HTR-M and MHTGR reactors. (orig.).

  20. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity in the mode decision of SAO. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meet the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
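    For orientation, SAO's band-offset mode, one of the per-sample classifications such filter hardware implements, can be sketched in a few lines. The pixel values, band start, and offsets below are illustrative, not from the paper:

```python
def sao_band_offset(samples, band_start, offsets, bit_depth=8):
    """HEVC-style SAO band offset: the sample range is split into 32 equal
    bands; 4 consecutive signalled bands receive an offset, the rest pass
    through unchanged. Results are clipped to the valid sample range."""
    shift = bit_depth - 5  # band width = 2**bit_depth / 32
    out = []
    for s in samples:
        band = s >> shift
        if band_start <= band < band_start + 4:
            s = min(max(s + offsets[band - band_start], 0), (1 << bit_depth) - 1)
        out.append(s)
    return out

# Illustrative block: bands 16..19 cover sample values 128..159 at 8-bit depth.
pixels = [100, 130, 140, 155, 200]
print(sao_band_offset(pixels, band_start=16, offsets=[2, 2, -1, 3]))
# → [100, 132, 142, 158, 200]
```

    The encoder-side mode decision the paper accelerates is choosing, per coding tree block, between band offset, edge offset, and no filtering by rate-distortion cost, which is where the simplified bitrate estimation pays off.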

  1. Homologous high-throughput expression and purification of highly conserved E. coli proteins

    Directory of Open Access Journals (Sweden)

    Duchmann Rainer

    2007-06-01

    Full Text Available Abstract Background Genetic factors and a dysregulated immune response towards commensal bacteria contribute to the pathogenesis of Inflammatory Bowel Disease (IBD). Animal models demonstrated that the normal intestinal flora is crucial for the development of intestinal inflammation. However, due to the complexity of the intestinal flora, it has been difficult to design experiments for the detection of the proinflammatory bacterial antigen(s) involved in the pathogenesis of the disease. Several studies indicated a potential association of E. coli with IBD. In addition, T cell clones of IBD patients were shown to cross-react towards antigens from different enteric bacterial species and thus likely responded to conserved bacterial antigens. We therefore chose highly conserved E. coli proteins as candidate antigens for abnormal T cell responses in IBD and used high-throughput techniques for cloning, expression and purification under native conditions of a set of 271 conserved E. coli proteins for downstream immunologic studies. Results As a standardized procedure, genes were PCR-amplified and cloned into the expression vector pQTEV2 in order to express proteins N-terminally fused to a seven-histidine tag. Initial small-scale expression and purification under native conditions by metal chelate affinity chromatography indicated that the vast majority of target proteins were purified in high yields. Targets that revealed low yields after purification, probably due to weak solubility, were shuttled into Gateway (Invitrogen) destination vectors in order to enhance solubility by N-terminal fusion of maltose-binding protein (MBP), N-utilizing substance A (NusA), or glutathione S-transferase (GST) to the target protein. In addition, recombinant proteins were treated with polymyxin B-coated magnetic beads in order to remove lipopolysaccharide (LPS).
Thus, 73% of the targeted proteins could be expressed and purified in large scale to give soluble proteins in the range of 500

  2. Least reliable bits coding (LRBC) for high data rate satellite communications

    Science.gov (United States)

    Vanderaar, Mark; Budinger, James; Wagner, Paul

    1992-01-01

    LRBC, a bandwidth efficient multilevel/multistage block-coded modulation technique, is analyzed. LRBC uses simple multilevel component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Soft-decision multistage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Analytical expressions and tight performance bounds are used to show that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of BPSK. The relative simplicity of Galois field algebra vs the Viterbi algorithm and the availability of high-speed commercial VLSI for block codes indicates that LRBC using block codes is a desirable method for high data rate implementations.

  3. Low Complexity Encoder of High Rate Irregular QC-LDPC Codes for Partial Response Channels

    Directory of Open Access Journals (Sweden)

    IMTAWIL, V.

    2011-11-01

    Full Text Available High-rate irregular QC-LDPC codes based on circulant permutation matrices, allowing efficient encoder implementation, are proposed in this article. The structure of the code is an approximate lower triangular matrix. In addition, we present two novel efficient encoding techniques for generating the redundant bits. The complexity of the encoder implementation depends on the number of parity bits of the code for one-stage encoding and on the length of the code for two-stage encoding. The advantage of both encoding techniques is that few XOR gates are used in the encoder implementation. Simulation results on partial response channels also show that the BER performance of the proposed code has a gain over other QC-LDPC codes.

  4. High-Speed Turbo-TCM-Coded Orthogonal Frequency-Division Multiplexing Ultra-Wideband Systems

    Directory of Open Access Journals (Sweden)

    Wang Yanxia

    2006-01-01

    Full Text Available One of the UWB proposals in the IEEE P802.15 WPAN project is to use a multiband orthogonal frequency-division multiplexing (OFDM) system and punctured convolutional codes for UWB channels supporting a data rate up to 480 Mbps. In this paper, we improve the proposed system using turbo TCM with a QAM constellation for higher data rate transmission. We construct punctured parity-concatenated trellis codes, in which a TCM code is used as the inner code and a simple parity-check code is employed as the outer code. The results show that the system can offer a much higher spectral efficiency, for example, 1.2 Gbps, which is 2.5 times higher than that of the proposed system. We identify several essential requirements to achieve the high-rate transmission, for example, frequency and time diversity and multilevel error protection. The results are confirmed by density evolution.

  5. Visualization of conserved structures by fusing highly variable datasets.

    Science.gov (United States)

    Silverstein, Jonathan C; Chhadia, Ankur; Dech, Fred

    2002-01-01

    Reality (VR) environment. The accuracy of the fusions was determined qualitatively by comparing the transformed atlas overlaid on the appropriate CT. It was examined for where the transformed structure atlas was incorrectly overlaid (false positive) and where it was incorrectly not overlaid (false negative). According to this method, fusions 1 and 2 were correct roughly 50-75% of the time, while fusions 3 and 4 were correct roughly 75-100% of the time. The CT dataset augmented with the transformed dataset was viewed arbitrarily in user-centered perspective stereo, taking advantage of features such as scaling, windowing and volumetric region-of-interest selection. This process of auto-coloring conserved structures in variable datasets is a step toward the goal of a broader, standardized automatic structure visualization method for radiological data. If successful, it would permit identification, visualization or deletion of structures in radiological data by semi-automatically applying canonical structure information to the radiological data (not just processing and visualization of the data's intrinsic dynamic range). More sophisticated selection of control points and patterns of warping may allow for more accurate transforms, and thus advances in visualization, simulation, education, diagnostics, and treatment planning.

  6. High fidelity analysis of BWR fuel assembly with COBRA-TF/PARCS and trace codes

    International Nuclear Information System (INIS)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.; Soler, A.

    2013-01-01

    The growing importance of detailed reactor core and fuel assembly description for light water reactors (LWRs), as well as of sub-channel safety analysis, requires high-fidelity models and coupled neutronic/thermal-hydraulic codes. Hand in hand with advances in computer technology, nuclear safety analysis is beginning to use more detailed thermal hydraulics and neutronics. Previously, a PWR core and a 16 by 16 fuel assembly model were developed to test and validate our COBRA-TF/PARCS v2.7 (CTF/PARCS) coupled code. In this work, a comparison of the modeling and simulation advantages and disadvantages of a modern 10 by 10 BWR fuel assembly with the CTF/PARCS and TRACE codes has been done. The objective of the comparison is to highlight the main advantages of using sub-channel codes to perform high-resolution nuclear safety analysis. Sub-channel codes like CTF permit accurate predictions, in two-phase flow regimes, of the thermal-hydraulic parameters important to safety, with high local resolution. The modeled BWR fuel assembly has 91 fuel rods (81 full-length and 10 partial-length fuel rods) and a large square central water rod. This assembly has been modeled in a high level of detail with the CTF code, using the BWR modeling parameters provided by TRACE. The same neutronic PARCS model has been used for the simulation with both codes. To compare the codes, a coupled steady state has been performed. (author)

  7. Quantum secure direct communication with high-dimension quantum superdense coding

    International Nuclear Information System (INIS)

    Wang Chuan; Li Yansong; Liu Xiaoshu; Deng Fuguo; Long Guilu

    2005-01-01

    A protocol for quantum secure direct communication with quantum superdense coding is proposed. It combines the ideas of block transmission, the ping-pong quantum secure direct communication protocol, and quantum superdense coding. It has the advantages of security and high source capacity

  8. Preliminary application of the draft code case for alloy 617 for a high temperature component

    International Nuclear Information System (INIS)

    Lee, Hyeong Yeon; Kim, Yong Wan; Song, Kee Nam

    2008-01-01

    The ASME draft Code Case for Alloy 617 was developed in the late 1980s for the design of very-high-temperature gas-cooled reactors. The draft Code Case was patterned after ASME Code Section III Subsection NH and was intended to cover Ni-Cr-Co-Mo Alloy 617 up to 982 °C (1800 °F). However, the draft Code Case remains incomplete, lacking necessary material properties and design data. In this study, a preliminary evaluation of the creep-fatigue damage for a high temperature hot duct pipe structure has been carried out according to the draft Code Case. The evaluation procedures and results according to the draft Code Case for Alloy 617 were compared with those of ASME Subsection NH and RCC-MR for Alloy 800H. It was shown that much data, including material properties and fatigue and creep data, should be supplemented for the draft Code Case. However, when the creep-fatigue damage evaluation results according to the draft Code Case, ASME-NH and RCC-MR were compared on this preliminary basis, the Alloy 617 results from the draft Code Case tended to be more resistant to creep damage but less resistant to fatigue damage than those from ASME-NH and RCC-MR
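    The linear damage summation underlying such creep-fatigue evaluations can be sketched as a pair of fraction sums; this is an illustrative sketch with made-up numbers, not the actual ASME-NH or RCC-MR procedure, which uses material-specific fatigue curves, stress-to-rupture data and an interaction envelope:

    ```python
    def creep_fatigue_damage(cycle_counts, fatigue_lives, hold_times, rupture_times):
        """Linear damage fractions: fatigue sum n/Nf and creep sum t/Tr."""
        Df = sum(n / Nf for n, Nf in zip(cycle_counts, fatigue_lives))
        Dc = sum(t / Tr for t, Tr in zip(hold_times, rupture_times))
        return Df, Dc

    # invented numbers: 1000 cycles against a 1e5-cycle fatigue life, and a
    # 5000 h hold against a 2e4 h stress-to-rupture time
    Df, Dc = creep_fatigue_damage([1000], [1.0e5], [5000.0], [2.0e4])
    # acceptance would be checked against a material-specific (Df, Dc) envelope
    ```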

  9. CIPHER: coded imager and polarimeter for high-energy radiation

    CERN Document Server

    Caroli, E; Dusi, W; Bertuccio, G; Sampietro, M

    2000-01-01

    The CIPHER instrument is a hard X- and soft gamma-ray spectroscopic and polarimetric coded mask imager based on an array of cadmium telluride micro-spectrometers. The position-sensitive detector (PSD) will be arranged in 4 modules of 32x32 crystals, each of 2x2 mm² cross section and 10 mm thickness, giving a total active area of about 160 cm². The micro-spectrometer characteristics allow a wide operating range from approximately 10 keV to 1 MeV, while the PSD is actively shielded by CsI crystals on the bottom in order to reduce background. The mask, based on a modified uniformly redundant array (MURA) pattern, is four times the area of the PSD and is situated at about 100 cm from the CdTe array top surface. The CIPHER instrument is proposed for a balloon experiment, both in order to assess the performance of such an instrumental concept for a small/medium-size satellite survey mission and to perform an innovative measurement of the Crab polarisation level. The CIPHER's field of view allows the instrument to...
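    A MURA pattern of the kind the mask is based on can be generated from quadratic residues; this is the generic textbook construction for a prime order p (p = 5 here for brevity), not the actual CIPHER mask layout:

    ```python
    def quadratic_residues(p):
        """Non-zero quadratic residues modulo the prime p."""
        return {(i * i) % p for i in range(1, p)}

    def mura(p):
        """p x p MURA as 0/1 lists (1 = open mask element), for prime p."""
        qr = quadratic_residues(p)
        c = lambda i: 1 if i in qr else -1
        mask = [[0] * p for _ in range(p)]
        for i in range(p):
            for j in range(p):
                if i == 0:
                    mask[i][j] = 0                 # first row closed
                elif j == 0 or c(i) * c(j) == 1:
                    mask[i][j] = 1
        return mask

    m = mura(5)                    # tiny example; flight masks use much larger p
    n_open = sum(map(sum, m))      # roughly half the elements are open
    ```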

  10. Energy conservation according to the building codes of the National Board of Housing, Building and Planning; Energihushaallning enligt Boverkets byggregler

    Energy Technology Data Exchange (ETDEWEB)

    2009-10-15

    To comply with international and national targets for energy use, the National Board has adopted rules setting the levels to be met in order to conserve energy in buildings. The rules for buildings are given in Boverket's building regulations (BBR). The BBR sets comprehensive requirements to ensure that a building does not use more than a certain number of kilowatt hours per square metre and year. There are more detailed requirements for thermal insulation; heating, cooling and air-conditioning installations; efficient use of electricity; and installation of metering systems for monitoring building energy use. The latest version of the BBR came into force on February 1, 2009 and includes more stringent requirements for buildings heated by electricity or cooled by electrically powered comfort cooling. This handbook presents comments and answers to questions about the new rules for energy conservation. It replaces our previous handbook 'Thermal calculations'.

  11. Comparative analyses of six solanaceous transcriptomes reveal a high degree of sequence conservation and species-specific transcripts

    Directory of Open Access Journals (Sweden)

    Ouyang Shu

    2005-09-01

    Full Text Available Abstract Background The Solanaceae is a family of closely related species with diverse phenotypes that have been exploited for agronomic purposes. Previous studies involving a small number of genes suggested sequence conservation across the Solanaceae. The availability of large collections of Expressed Sequence Tags (ESTs) for the Solanaceae now provides the opportunity to assess sequence conservation and divergence on a genomic scale. Results All available ESTs and Expressed Transcripts (ETs), 449,224 sequences for six Solanaceae species (potato, tomato, pepper, petunia, tobacco and Nicotiana benthamiana), were clustered and assembled into gene indices. Examination of gene ontologies revealed that the transcripts within the gene indices encode a similar suite of biological processes. Although the ESTs and ETs were derived from a variety of tissues, 55–81% of the sequences had significant similarity at the nucleotide level with sequences among the six species. Putative orthologs could be identified for 28–58% of the sequences. This high degree of sequence conservation was supported by expression profiling using heterologous hybridizations to potato cDNA arrays that showed similar expression patterns in mature leaves for all six solanaceous species. 16–19% of the transcripts within the six Solanaceae gene indices did not have matches among Solanaceae, Arabidopsis, rice or 21 other plant gene indices. Conclusion Results from this genome-scale analysis confirmed a high level of sequence conservation at the nucleotide level of the coding sequence among Solanaceae. Additionally, the results indicated that part of the Solanaceae transcriptome is likely to be unique for each species.

  12. Analytic, high β, flux conserving equilibria for cylindrical tokamaks

    International Nuclear Information System (INIS)

    Sigmar, D.J.; Vahala, G.

    1978-09-01

    Using Grad's theory of generalized differential equations, the temporal evolution from low to high β due to "adiabatic" and nonadiabatic (i.e., neutral beam injection) heating of a cylindrical tokamak plasma with circular cross section and peaked current profiles is calculated analytically. The influence of shaping the initial safety factor profile and the beam deposition profile, and the effect of minor radius compression on the equilibrium, is analyzed


  14. Codes of Ethics and the High School Newspaper: Part One.

    Science.gov (United States)

    Hager, Marilyn

    1978-01-01

    Deals with two types of ethical problems encountered by journalists, including high school journalists: deciding whether to accept gifts and favors from advertisers and news sources, and deciding what types of language would be offensive to readers. (GT)

  15. Structure of genes for dermaseptins B, antimicrobial peptides from frog skin. Exon 1-encoded prepropeptide is conserved in genes for peptides of highly different structures and activities.

    Science.gov (United States)

    Vouille, V; Amiche, M; Nicolas, P

    1997-09-01

    We cloned the genes of two members of the dermaseptin family, broad-spectrum antimicrobial peptides isolated from the skin of the arboreal frog Phyllomedusa bicolor. The dermaseptin gene Drg2 has a 2-exon coding structure interrupted by a small 137-bp intron, wherein exon 1 encodes a 22-residue hydrophobic signal peptide and the first three amino acids of the acidic propiece; exon 2 contains the 18 additional acidic residues of the propiece plus a typical prohormone processing signal Lys-Arg and a 32-residue dermaseptin progenitor sequence. The dermaseptin genes Drg2 and Drg1g2 have conserved sequences at both untranslated ends and in the first and second coding exons. In contrast, Drg1g2 comprises a third coding exon for a short version of the acidic propiece and a second dermaseptin progenitor sequence. Structural conservation between the two genes suggests that Drg1g2 arose recently from an ancestral Drg2-like gene through amplification of part of the second coding exon and 3'-untranslated region. Analysis of the cDNAs coding for precursors of several frog skin peptides of highly different structures and activities demonstrates that the signal peptides and part of the acidic propieces are encoded by conserved nucleotides encompassed by the first coding exon of the dermaseptin genes. The organization of the genes that belong to this family, with the signal peptide and the progenitor sequence on separate exons, permits strikingly different peptides to be directed into the secretory pathway. The recruitment of such a homologous 'secretory' exon by otherwise non-homologous genes may have been an early event in the evolution of amphibians.

  16. High-resolution satellite imagery is an important yet underutilized resource in conservation biology.

    Science.gov (United States)

    Boyle, Sarah A; Kennedy, Christina M; Torres, Julio; Colman, Karen; Pérez-Estigarribia, Pastor E; de la Sancha, Noé U

    2014-01-01

    Technological advances and increasing availability of high-resolution satellite imagery offer the potential for more accurate land cover classifications and pattern analyses, which could greatly improve the detection and quantification of land cover change for conservation. Such remotely-sensed products, however, are often expensive and difficult to acquire, which prohibits or reduces their use. We tested whether imagery of high spatial resolution (≤5 m) differs from lower-resolution imagery (≥30 m) in performance and extent of use for conservation applications. To assess performance, we classified land cover in a heterogeneous region of Interior Atlantic Forest in Paraguay, which has undergone recent and dramatic human-induced habitat loss and fragmentation. We used 4 m multispectral IKONOS and 30 m multispectral Landsat imagery and determined the extent to which resolution influenced the delineation of land cover classes and patch-level metrics. Higher-resolution imagery more accurately delineated cover classes, identified smaller patches, retained patch shape, and detected narrower, linear patches. To assess extent of use, we surveyed three conservation journals (Biological Conservation, Biotropica, Conservation Biology) and found limited application of high-resolution imagery in research, with only 26.8% of land cover studies analyzing satellite imagery, and of these studies only 10.4% used imagery ≤5 m resolution. Our results suggest that high-resolution imagery is warranted yet under-utilized in conservation research, but is needed to adequately monitor and evaluate forest loss and conversion, and to delineate potentially important stepping-stone fragments that may serve as corridors in a human-modified landscape. Greater access to low-cost, multiband, high-resolution satellite imagery would therefore greatly facilitate conservation management and decision-making.
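    The resolution effect the study measures can be mimicked with a toy land-cover grid: majority-rule coarsening of a binary map erases small patches that a patch counter still finds at fine resolution, which is exactly why stepping-stone fragments go undetected in coarser imagery. The grid and coarsening factor below are invented for illustration:

    ```python
    import numpy as np

    def count_patches(grid):
        """Count 4-connected patches of 1s via iterative flood fill."""
        g = grid.copy()
        n = 0
        for i in range(g.shape[0]):
            for j in range(g.shape[1]):
                if g[i, j]:
                    n += 1
                    stack = [(i, j)]
                    while stack:
                        a, b = stack.pop()
                        if 0 <= a < g.shape[0] and 0 <= b < g.shape[1] and g[a, b]:
                            g[a, b] = 0
                            stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
        return n

    def coarsen(grid, f):
        """Majority-rule downsampling by an integer factor f (coarser pixels)."""
        h, w = grid.shape
        return (grid.reshape(h // f, f, w // f, f).mean(axis=(1, 3)) > 0.5).astype(int)

    fine = np.zeros((12, 12), int)
    fine[0:6, 0:6] = 1        # one large forest patch
    fine[9, 9] = 1            # one small stepping-stone fragment
    n_fine = count_patches(fine)
    n_coarse = count_patches(coarsen(fine, 3))   # the small fragment vanishes
    ```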

  17. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  18. A simulation of driven reconnection by a high precision MHD code

    International Nuclear Information System (INIS)

    Kusano, Kanya; Ouchi, Yasuo; Hayashi, Takaya; Horiuchi, Ritoku; Watanabe, Kunihiko; Sato, Tetsuya.

    1988-01-01

    A high precision MHD code, which has fourth-order accuracy in both the spatial and time steps, is developed and applied to simulation studies of two-dimensional driven reconnection. It is confirmed that the numerical dissipation of this new scheme is much less than that of the two-step Lax-Wendroff scheme. The effect of plasma compressibility on the reconnection dynamics is investigated by means of this high precision code. (author)

  19. High-speed low-complexity video coding with EDiCTius: a DCT coding proposal for JPEG XS

    Science.gov (United States)

    Richter, Thomas; Fößel, Siegfried; Keinert, Joachim; Scherl, Christian

    2017-09-01

    In its 71st meeting, the JPEG committee issued a call for low complexity, high speed image coding, designed to address the needs of low-cost video-over-IP applications. As an answer to this call, Fraunhofer IIS and the Computing Center of the University of Stuttgart jointly developed an embedded DCT image codec requiring only minimal resources while maximizing throughput on FPGA and GPU implementations. Objective and subjective tests performed for the 73rd meeting confirmed its excellent performance and suitability for its purpose, and it was selected as one of the two key contributions for the development of a joint test model. In this paper, the authors describe the design principles of the codec, provide a high-level overview of the encoder and decoder chain, and provide evaluation results on the test corpus selected by the JPEG committee.

  20. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale, computationally-intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  1. LABAN-PEL: a two-dimensional, multigroup diffusion, high-order response matrix code

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-06-01

    The capabilities of LABAN-PEL are described. LABAN-PEL is a modified version of the two-dimensional, high-order response matrix code, LABAN, written by Lindahl. The new version extends the capabilities of the original code with regard to the treatment of neutron migration by including an option to utilize full group-to-group diffusion coefficient matrices. In addition, the code has been converted from single to double precision and the necessary routines added to activate its multigroup capability. The coding has also been converted to standard FORTRAN-77 to enhance the portability of the code. Details regarding the input data requirements and calculational options of LABAN-PEL are provided. 13 refs

  2. Primary structure and promoter analysis of leghemoglobin genes of the stem-nodulated tropical legume Sesbania rostrata: conserved coding sequences, cis-elements and trans-acting factors

    DEFF Research Database (Denmark)

    Metz, B A; Welters, P; Hoffmann, H J

    1988-01-01

    The primary structure of a leghemoglobin (lb) gene from the stem-nodulated, tropical legume Sesbania rostrata and two lb gene promoter regions were analysed. The S. rostrata lb gene structure and Lb amino acid composition were found to be highly conserved with respect to previously described lb genes and Lb ...

  3. An environment for high energy physics code development

    International Nuclear Information System (INIS)

    Wisinski, D.E.

    1987-01-01

    As the size and complexity of high energy experiments increase there will be a greater need for better software tools and new programming environments. If these are not commercially available, then we must build them ourselves. This paper describes a prototype programming environment featuring a new type of file system, a ''smart'' editor, and integrated file management tools. This environment was constructed under the IBM VM/SP operating system. It uses the system interpreter, the system editor and the NOMAD2 relational database management system to create a software ''shell'' for the programmer. Some extensions to this environment are explored. (orig.)

  4. Quo vadis code optimization in high energy physics

    International Nuclear Information System (INIS)

    Jarp, S.

    1994-01-01

    Although performance tuning and optimization can be considered less critical than in the past, there are still many High Energy Physics (HEP) applications and application domains that can profit from such an undertaking. In CERN's CORE (Centrally Operated RISC Environment) where all major RISC vendors are present, this implies an understanding of the various computer architectures, instruction sets and performance analysis tools from each of these vendors. This paper discusses some initial observations after having evaluated the situation and makes some recommendations for further progress

  5. Energy efficient rateless codes for high speed data transfer over free space optical channels

    Science.gov (United States)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independently of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal power overheads can be used. We also employ a combination of Binary Phase Shift Keying (BPSK), with provision for modification of the threshold, and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We show through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
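    The rateless property described above, generating as many code words as required until the message is recovered, can be sketched with an ideal-soliton LT encoder and a belief-propagation (peeling) decoder. This is a simplified sketch: practical systems use the robust soliton distribution and packet-sized symbols rather than single bytes:

    ```python
    import random

    def ideal_soliton_degree(k, rng):
        """Sample a degree from the ideal soliton distribution over 1..k."""
        u = rng.random()
        cum, d = 1.0 / k, 1
        while u > cum and d < k:
            d += 1
            cum += 1.0 / (d * (d - 1))
        return d

    def lt_symbol(blocks, rng):
        """One rateless encoded symbol: XOR of a random subset of blocks."""
        d = ideal_soliton_degree(len(blocks), rng)
        idxs = rng.sample(range(len(blocks)), d)
        val = 0
        for i in idxs:
            val ^= blocks[i]
        return set(idxs), val

    def peel(k, symbols):
        """Peeling decoder; returns the k blocks, or None if not yet decodable."""
        syms = [[set(s), v] for s, v in symbols]
        rec = {}
        changed = True
        while changed and len(rec) < k:
            changed = False
            for sym in syms:
                for i in [j for j in sym[0] if j in rec]:
                    sym[0].remove(i)        # subtract already-recovered blocks
                    sym[1] ^= rec[i]
                if len(sym[0]) == 1:        # degree-1 symbol reveals a block
                    i = sym[0].pop()
                    if i not in rec:
                        rec[i] = sym[1]
                        changed = True
        return [rec[i] for i in range(k)] if len(rec) == k else None

    rng = random.Random(7)
    blocks = [0x3A, 0x7F, 0x01, 0xC4, 0x55, 0x9E, 0x22, 0x10]  # k = 8 message blocks
    stream, decoded = [], None
    while decoded is None:          # rateless: keep drawing symbols until decoded
        stream.append(lt_symbol(blocks, rng))
        decoded = peel(len(blocks), stream)
    ```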

  6. A high-resolution code for large eddy simulation of incompressible turbulent boundary layer flows

    KAUST Repository

    Cheng, Wan

    2014-03-01

    We describe a framework for large eddy simulation (LES) of incompressible turbulent boundary layers over a flat plate. This framework uses a fractional-step method with fourth-order finite difference on a staggered mesh. We present several laminar examples to establish the fourth-order accuracy and energy conservation property of the code. Furthermore, we implement a recycling method to generate turbulent inflow. We use the stretched spiral vortex subgrid-scale model and virtual wall model to simulate the turbulent boundary layer flow. We find that the case with Reθ ≈ 2.5 × 10⁵ agrees well with available experimental measurements of wall friction, streamwise velocity profiles and turbulent intensities. We demonstrate that for cases with extremely large Reynolds numbers (Reθ = 10¹²), the present LES can reasonably predict the flow with a coarse mesh. The parallel implementation of the LES code demonstrates reasonable scaling on O(10³) cores. © 2013 Elsevier Ltd.

  7. Status of design code work for metallic high temperature components

    International Nuclear Information System (INIS)

    Bieniussa, K.; Seehafer, H.J.; Over, H.H.; Hughes, P.

    1984-01-01

    The mechanical components of high temperature gas-cooled reactors, HTGR, are exposed to temperatures up to about 1000 deg. C, and this in a more or less corrosive gas environment. Under these conditions metallic structural materials show time-dependent structural behavior. Furthermore, changes in the structure of the material and loss of material at the surface can result. The structural material of the components will be stressed by load-controlled quantities, for example pressure or dead weight, and/or deformation-controlled quantities, for example thermal expansion or temperature distribution, and thus it can suffer growing permanent strains and deformations and an exhaustion of the material (damage), both followed by failure. To avoid a failure of the components, the design requires the consideration of the following structural failure modes: ductile rupture due to short-term loadings; creep rupture due to long-term loadings; creep-fatigue failure due to cyclic loadings; excessive strains due to incremental deformation or creep ratcheting; loss of function due to excessive deformations; loss of stability due to short-term loadings; loss of stability due to long-term loadings; environmentally caused material failure (excessive corrosion); fast fracture due to unstable crack growth

  8. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    Science.gov (United States)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.
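    The conservative flux-difference form on which such schemes are built can be illustrated with a much simpler method than the paper's entropy stable WENO: a first-order local Lax-Friedrichs scheme for the inviscid Burgers equation. Because the update is a difference of interface fluxes, the fluxes telescope and the total of u on a periodic domain is preserved to round-off:

    ```python
    import numpy as np

    def burgers_lf_step(u, dx, dt):
        """One conservative update u_i -= dt/dx * (F_{i+1/2} - F_{i-1/2}) for
        u_t + (u^2/2)_x = 0 with a local Lax-Friedrichs flux, periodic BCs."""
        f = 0.5 * u * u
        up = np.roll(u, -1)                       # u_{i+1}
        a = np.maximum(np.abs(u), np.abs(up))     # local wave speed bound
        flux = 0.5 * (f + 0.5 * up * up) - 0.5 * a * (up - u)   # F_{i+1/2}
        return u - dt / dx * (flux - np.roll(flux, 1))

    n = 200
    dx, dt = 1.0 / n, 0.002
    x = np.arange(n) * dx
    u = np.sin(2 * np.pi * x) + 0.5
    total0 = u.sum() * dx                         # discrete integral of u
    for _ in range(100):                          # runs through shock formation
        u = burgers_lf_step(u, dx, dt)
    ```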

  9. High-Capacity Quantum Secure Direct Communication Based on Quantum Hyperdense Coding with Hyperentanglement

    International Nuclear Information System (INIS)

    Wang Tie-Jun; Li Tao; Du Fang-Fang; Deng Fu-Guo

    2011-01-01

    We first present a quantum hyperdense coding protocol with hyperentanglement in the polarization and spatial-mode degrees of freedom of photons, and then give the details of a quantum secure direct communication (QSDC) protocol based on this quantum hyperdense coding protocol. This QSDC protocol has the advantage of a higher capacity than quantum communication protocols with a qubit system. Compared with the QSDC protocol based on superdense coding with d-dimensional systems, this QSDC protocol is more feasible, as the preparation of a high-dimensional quantum system is more difficult than that of a two-level quantum system at present. (general)
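    The ordinary qubit superdense coding that these protocols generalize can be simulated directly: Alice applies one of the four Paulis Z^b1 X^b2 to her half of a shared Bell pair, and Bob's Bell-basis measurement recovers both classical bits. A minimal numpy sketch of that baseline, not of the hyperentangled protocol itself:

    ```python
    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.diag([1.0, -1.0])
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    def encode(b1, b2):
        """Alice applies Z^b1 X^b2 to her qubit (the first tensor factor)."""
        op = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2)
        return np.kron(op, I2) @ bell

    # the four Bell states Bob projects onto, labeled by the bits they reveal
    bell_basis = {
        (0, 0): np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2),
        (0, 1): np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2),
        (1, 0): np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2),
        (1, 1): np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2),
    }

    def decode(state):
        """Bell-basis measurement: the encoded state matches exactly one entry."""
        for bits, b in bell_basis.items():
            if abs(abs(b @ state) - 1.0) < 1e-12:
                return bits
    ```

    One entangled pair thus carries two classical bits per transmitted qubit, which is the capacity advantage the hyperentangled version pushes further.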

  10. HETFIS: High-Energy Nucleon-Meson Transport Code with Fission

    International Nuclear Information System (INIS)

    Barish, J.; Gabriel, T.A.; Alsmiller, F.S.; Alsmiller, R.G. Jr.

    1981-07-01

    A model that includes fission for predicting particle production spectra from medium-energy nucleon and pion collisions with nuclei (Z greater than or equal to 91) has been incorporated into the nucleon-meson transport code, HETC. This report is primarily concerned with the programming aspects of HETFIS (High-Energy Nucleon-Meson Transport Code with Fission). A description of the program data and instructions for operating the code are given. HETFIS is written in FORTRAN IV for the IBM computers and is readily adaptable to other systems

  11. High-radix transforms for Reed-Solomon codes over Fermat primes

    Science.gov (United States)

    Liu, K. Y.; Reed, I. S.; Truong, T. K.

    1977-01-01

    A method is proposed to streamline the transform decoding algorithm for Reed-Solomon (RS) codes of length 2^(2^n). It is shown that a high-radix fast Fourier transform (FFT) type algorithm with generator equal to 3 on GF(F_n), where F_n is a Fermat prime, can be used to decode RS codes of this length. For a 256-symbol RS code, a radix-4 and a radix-16 FFT over GF(F_3) require, respectively, 30% and 70% fewer modulo-F_n multiplications than the usual radix-2 FFT.
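    The transform structure being exploited can be shown with a direct (O(n^2)) number-theoretic transform over GF(257), the Fermat prime F_3, using the generator 3; the radix-4/radix-16 factorizations that give the claimed savings are omitted from this sketch:

    ```python
    P = 257                                  # Fermat prime F_3 = 2^(2^3) + 1

    def ntt(vec, root):
        """Direct number-theoretic transform over GF(P) (no radix tricks)."""
        n = len(vec)
        return [sum(vec[j] * pow(root, i * j, P) for j in range(n)) % P
                for i in range(n)]

    n = 16
    w = pow(3, (P - 1) // n, P)              # element of order 16 (3 generates GF(257)*)
    w_inv = pow(w, P - 2, P)                 # modular inverses via Fermat's little theorem
    n_inv = pow(n, P - 2, P)

    a = [1, 2, 3, 4] + [0] * 12
    b = [5, 6] + [0] * 14
    A, B = ntt(a, w), ntt(b, w)
    back = [(x * n_inv) % P for x in ntt(A, w_inv)]    # inverse transform round trip
    conv = [(x * n_inv) % P                            # pointwise product = cyclic conv,
            for x in ntt([(s * t) % P for s, t in zip(A, B)], w_inv)]
    ```

    Pointwise products in the transform domain give cyclic convolution over GF(257), which is the operation the transform-domain RS decoder relies on.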

  12. Detonation of high explosives in Lagrangian hydrodynamic codes using the programmed burn technique

    International Nuclear Information System (INIS)

    Berger, M.E.

    1975-09-01

    Two initiation methods were developed for improving the programmed burn technique for detonation of high explosives in smeared-shock Lagrangian hydrodynamic codes. The methods are verified by comparing the improved programmed burn with existing solutions in one-dimensional plane, converging, and diverging geometries. Deficiencies in the standard programmed burn are described. One of the initiation methods has been determined to be better for inclusion in production hydrodynamic codes
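    The standard programmed burn that the paper improves on amounts to simple bookkeeping: each cell ignites at a time given by its distance from the detonation point divided by the detonation velocity D, and its burn fraction then ramps up over a few zones to keep the smeared shock resolvable. A hedged one-dimensional sketch (the ramp width n_smear is an illustrative choice, not a value from the report):

    ```python
    def burn_fraction(x_cell, x_det, D, t, dx, n_smear=2):
        """Burn fraction of a cell at time t for a detonation lit at x_det.

        t_light is the programmed lighting time; the fraction then ramps
        linearly over n_smear zone widths dx.
        """
        t_light = abs(x_cell - x_det) / D
        ramp = n_smear * dx / D
        if t <= t_light:
            return 0.0
        return min(1.0, (t - t_light) / ramp)
    ```

    The released chemical energy in a cell is then scaled by this fraction each cycle, so the detonation front propagates at exactly D regardless of the hydrodynamic solution.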

  13. Spatial overlap between environmental policy instruments and areas of high conservation value in forest.

    Science.gov (United States)

    Sverdrup-Thygeson, Anne; Søgaard, Gunnhild; Rusch, Graciela M; Barton, David N

    2014-01-01

    In order to safeguard biodiversity in forest we need to know how forest policy instruments work. Here we use a nationwide network of 9400 plots in productive forest to analyze to what extent large-scale policy instruments, individually and together, target forest of high conservation value in Norway. We studied both instruments working through direct regulation (Strict Protection and Landscape Protection) and instruments working through management planning and voluntary schemes of forest certification (Wilderness Area and Mountain Forest). As forest of high conservation value (HCV-forest) we considered the extent of 12 Biodiversity Habitats and the extent of Old-Age Forest. We found that 22% of productive forest area contained Biodiversity Habitats. More than 70% of this area was not covered by any large-scale instruments. Mountain Forest covered 23%, while Strict Protection and Wilderness both covered 5% of the Biodiversity Habitat area. A total of 9% of productive forest area contained Old-Age Forest, and the relative coverage of the four instruments was similar to that for Biodiversity Habitats. For all instruments except Landscape Protection, the targeted areas contained significantly higher proportions of HCV-forest than areas not targeted by these instruments. Areas targeted by Strict Protection had higher proportions of HCV-forest than areas targeted by other instruments, except for areas targeted by Wilderness Area, which showed similar proportions of Biodiversity Habitats. There was a substantial amount of spatial overlap between the policy tools, but no incremental conservation effect of overlapping instruments in terms of contributing to higher percentages of targeted HCV-forest. Our results reveal that although the current policy mix has an above-average representation of forest of high conservation value, the targeting efficiency in terms of area overlap is limited. There is a need to improve forest conservation and a potential to cover this need by better

  14. On RELAP5-simulated High Flux Isotope Reactor reactivity transients: Code change and application

    International Nuclear Information System (INIS)

    Freels, J.D.

    1993-01-01

    This paper presents a new and innovative application for the RELAP5 code (hereafter referred to as "the code"). The code has been used to simulate several transients associated with the (presently) draft version of the High-Flux Isotope Reactor (HFIR) updated safety analysis report (SAR). This paper investigates those thermal-hydraulic transients induced by nuclear reactivity changes. A major goal of the work was to use an existing RELAP5 HFIR model for consistency with other thermal-hydraulic transient analyses of the SAR. To achieve this goal, it was necessary to incorporate a new self-contained point-kinetics solver into the code because of a deficiency in the point-kinetics reactivity model of the Mod 2.5 version of the code. The model was benchmarked against previously analyzed (known) transients. Given this new code, four event categories defined by the HFIR probabilistic risk assessment (PRA) were analyzed (in ascending order of severity): a cold-loop pump start; run-away shim-regulating control cylinder and safety plate withdrawal; control cylinder ejection; and generation of an optimum void in the target region. All transients are discussed. Results of the bounding incredible event transient, the target region optimum void, are shown. Future plans for RELAP5 HFIR applications and recommendations for code improvements are also discussed
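    A self-contained point-kinetics solver of the kind described integrates the familiar equations dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC. The sketch below uses RK4 with one delayed-neutron group and illustrative constants (β, Λ, λ are not HFIR values, and the real solver handles six groups and time-dependent reactivity):

    ```python
    import numpy as np

    def point_kinetics(rho, t_end, dt=1e-4, beta=0.0065, Lam=1e-4, lam=0.08):
        """RK4 integration of one-delayed-group point kinetics; returns n(t_end).

        Starts at equilibrium with normalized power n = 1.
        """
        def f(y):
            n, C = y
            return np.array([(rho - beta) / Lam * n + lam * C,
                             beta / Lam * n - lam * C])
        y = np.array([1.0, beta / (Lam * lam)])       # equilibrium precursor level
        for _ in range(int(round(t_end / dt))):
            k1 = f(y)
            k2 = f(y + 0.5 * dt * k1)
            k3 = f(y + 0.5 * dt * k2)
            k4 = f(y + dt * k3)
            y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        return y[0]

    n_flat = point_kinetics(0.0, 0.5)                 # zero reactivity: power steady
    n_up = point_kinetics(0.1 * 0.0065, 0.5)          # +0.1 beta step: prompt jump, then rise
    ```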

  15. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
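    The Gauss-Levenberg-Marquardt iteration at the heart of the PEST family can be sketched in a few lines with a finite-difference Jacobian. This is a generic textbook version fit to a toy exponential model, not PEST++'s implementation or API:

    ```python
    import numpy as np

    def levenberg_marquardt(model, obs, p0, lam=1e-3, iters=50):
        """Generic GLM iteration with a forward-difference Jacobian."""
        p = np.asarray(p0, dtype=float)
        resid = lambda q: model(q) - obs
        for _ in range(iters):
            r = resid(p)
            J = np.empty((r.size, p.size))
            for j in range(p.size):                      # finite-difference Jacobian
                dp = np.zeros_like(p)
                dp[j] = 1e-6 * max(1.0, abs(p[j]))
                J[:, j] = (resid(p + dp) - r) / dp[j]
            step = np.linalg.solve(J.T @ J + lam * np.eye(p.size), -J.T @ r)
            if np.sum(resid(p + step) ** 2) < np.sum(r ** 2):
                p, lam = p + step, lam * 0.5             # accept; trust the model more
            else:
                lam *= 10.0                              # reject; damp harder
        return p

    # recover the parameters of y = a * exp(b * x) from synthetic, noise-free data
    x = np.linspace(0.0, 1.0, 20)
    p_true = np.array([2.0, -1.5])
    y = p_true[0] * np.exp(p_true[1] * x)
    p_est = levenberg_marquardt(lambda p: p[0] * np.exp(p[1] * x), y, [1.0, -0.5])
    ```

    The many-parameter, many-run setting PEST++ targets keeps this same core but distributes the Jacobian's model runs and adds regularization.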

  16. A new relativistic viscous hydrodynamics code and its application to the Kelvin-Helmholtz instability in high-energy heavy-ion collisions

    Energy Technology Data Exchange (ETDEWEB)

    Okamoto, Kazuhisa [Nagoya University, Department of Physics, Nagoya (Japan); Nonaka, Chiho [Nagoya University, Department of Physics, Nagoya (Japan); Nagoya University, Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI), Nagoya (Japan); Duke University, Department of Physics, Durham, NC (United States)

    2017-06-15

    We construct a new relativistic viscous hydrodynamics code optimized in Milne coordinates. We split the conservation equations into an ideal part and a viscous part using the Strang splitting method. In the code, a Riemann solver based on the two-shock approximation is utilized for the ideal part and the Piecewise Exact Solution (PES) method is applied for the viscous part. We check the validity of our numerical calculations by comparison with analytical solutions: viscous Bjorken flow and the Israel-Stewart theory in the Gubser flow regime. Using the code, we discuss the possible development of the Kelvin-Helmholtz instability in high-energy heavy-ion collisions. (orig.)
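
    Strang splitting advances the two sub-problems symmetrically (half-step of operator A, full step of operator B, half-step of A), which retains second-order accuracy in time. A toy scalar example, assuming linear decay for the "ideal" part and a constant source for the "viscous" part (not the actual hydrodynamic operators), illustrates the scheme:

```python
import math

def strang_step(u, dt, a, b):
    """One Strang-split step for du/dt = -a*u + b, each sub-problem solved exactly."""
    u = u * math.exp(-a * dt / 2)   # half-step of part A: du/dt = -a u
    u = u + b * dt                  # full step of part B: du/dt = b
    u = u * math.exp(-a * dt / 2)   # second half-step of part A
    return u

def integrate(u0, t_end, n_steps, a, b):
    """March the split scheme to t_end; global error is O(dt^2)."""
    u, dt = u0, t_end / n_steps
    for _ in range(n_steps):
        u = strang_step(u, dt, a, b)
    return u
```

Against the exact solution u(t) = (u0 - b/a)·exp(-a t) + b/a, the second-order convergence of the symmetric splitting is easy to verify numerically.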

  17. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  18. Prospects for Parity Non-conservation Experiments with Highly Charged Heavy Ions

    OpenAIRE

    Maul, M.; Schäfer, A.; Greiner, W.; Indelicato, P.

    1996-01-01

    We discuss the prospects for parity non-conservation experiments with highly charged heavy ions. Energy levels and parity mixing for heavy ions with two to five electrons are calculated. We investigate two-photon-transitions and the possibility to observe interference effects between weak-matrix elements and Stark matrix elements for periodic electric field configurations.

  19. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    Science.gov (United States)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from CABAC entropy coding of H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture, and objects.
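
    The key property claimed above is that XORing selected binstrings with a keystream preserves their length, so the bit-rate is unchanged. The sketch below illustrates this idea with a SHA-256 counter-mode keystream as a standard-library stand-in for AES in CFB mode (Python's stdlib has no AES); the function names and binstring representation are illustrative, not the paper's implementation.

```python
import hashlib

def keystream(key: bytes, n: int) -> str:
    """Generate n pseudo-random bits (SHA-256 in counter mode; stand-in for AES-CFB)."""
    bits, counter = [], 0
    while len(bits) < n:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        bits.extend(str((byte >> i) & 1) for byte in block for i in range(8))
        counter += 1
    return "".join(bits[:n])

def encrypt_binstrings(binstrings, key):
    """XOR each encryptable binstring with the keystream. Lengths (and thus the
    bit-rate) are unchanged, so the bitstream stays format compliant; applying
    the same function again decrypts."""
    total = sum(len(b) for b in binstrings)
    ks = keystream(key, total)
    out, pos = [], 0
    for b in binstrings:
        seg = ks[pos:pos + len(b)]
        out.append("".join("1" if x != y else "0" for x, y in zip(b, seg)))
        pos += len(b)
    return out
```

Because XOR with the same keystream is an involution, a single routine serves for both encryption and decryption.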

  20. A Character Segmentation Proposal for High-Speed Visual Monitoring of Expiration Codes on Beverage Cans

    Directory of Open Access Journals (Sweden)

    José C. Rodríguez-Rodríguez

    2016-04-01

    Full Text Available Expiration date labels are ubiquitous in the food industry. With the passage of time, almost any food becomes unhealthy, even when well preserved. The expiration date is estimated based on the type and manufacture/packaging time of that particular food unit. This date is then printed on the container so it is available to the end user at the time of consumption. MONICOD (MONItoring of CODes), an industrial validator of expiration codes, allows the expiration code printed on a drink can to be read. This verification occurs immediately after printing. MONICOD faces difficulties due to the high printing rate (35 cans per second) and the problematic lighting caused by the metallic surface on which the code is printed. This article describes a solution that allows MONICOD to extract shapes and presents quantitative results for speed and quality.
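
    A standard first step for segmenting printed characters (a plausible building block for a system like MONICOD, not its published algorithm) is the vertical projection profile: columns containing no ink separate adjacent glyphs.

```python
def segment_columns(image):
    """Split a binary image (list of rows of 0/1 pixels) into character spans
    using the vertical projection profile: ink-free columns separate glyphs.
    Returns half-open (start, end) column ranges."""
    if not image:
        return []
    cols = len(image[0])
    profile = [sum(row[c] for row in image) for c in range(cols)]
    spans, start = [], None
    for c, ink in enumerate(profile):
        if ink and start is None:
            start = c                      # entering a glyph
        elif not ink and start is not None:
            spans.append((start, c))       # leaving a glyph
            start = None
    if start is not None:
        spans.append((start, cols))        # glyph touches the right edge
    return spans
```

Real can images would first need binarization and deskewing; projection segmentation also fails for touching characters, which is one reason industrial systems need more elaborate proposals.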

  1. Conservation of σ28-Dependent Non-Coding RNA Paralogs and Predicted σ54-Dependent Targets in Thermophilic Campylobacter Species.

    Directory of Open Access Journals (Sweden)

    My Thanh Le

    Full Text Available Assembly of flagella requires strict hierarchical and temporal control via flagellar sigma and anti-sigma factors, regulatory proteins and the assembly complex itself, but to date non-coding RNAs (ncRNAs) have not been described to regulate genes directly involved in flagellar assembly. In this study we have investigated the possible role of two ncRNA paralogs (CjNC1, CjNC4) in flagellar assembly and gene regulation of the diarrhoeal pathogen Campylobacter jejuni. CjNC1 and CjNC4 are 37/44 nt identical and predicted to target the 5' untranslated region (5' UTR) of genes transcribed from the flagellar sigma factor σ54. Orthologs of the σ54-dependent 5' UTRs and ncRNAs are present in the genomes of other thermophilic Campylobacter species, and transcription of CjNC1 and CjNC4 is dependent on the flagellar sigma factor σ28. Surprisingly, inactivation and overexpression of CjNC1 and CjNC4 did not affect growth, motility or flagella-associated phenotypes such as autoagglutination. However, CjNC1 and CjNC4 were able to mediate sequence-dependent, but Hfq-independent, partial repression of fluorescence of predicted target 5' UTRs in an Escherichia coli-based GFP reporter gene system. This hints towards a subtle role for the CjNC1 and CjNC4 ncRNAs in post-transcriptional gene regulation in thermophilic Campylobacter species, and suggests that the currently used phenotypic methodologies are insufficiently sensitive to detect such subtle phenotypes. The lack of a role of Hfq in the E. coli GFP-based system indicates that the CjNC1 and CjNC4 ncRNAs may mediate post-transcriptional gene regulation in ways that do not conform to the paradigms obtained from the Enterobacteriaceae.

  2. Conservation of σ28-Dependent Non-Coding RNA Paralogs and Predicted σ54-Dependent Targets in Thermophilic Campylobacter Species

    Science.gov (United States)

    Le, My Thanh; van Veldhuizen, Mart; Porcelli, Ida; Bongaerts, Roy J.; Gaskin, Duncan J. H.; Pearson, Bruce M.; van Vliet, Arnoud H. M.

    2015-01-01

    Assembly of flagella requires strict hierarchical and temporal control via flagellar sigma and anti-sigma factors, regulatory proteins and the assembly complex itself, but to date non-coding RNAs (ncRNAs) have not been described to regulate genes directly involved in flagellar assembly. In this study we have investigated the possible role of two ncRNA paralogs (CjNC1, CjNC4) in flagellar assembly and gene regulation of the diarrhoeal pathogen Campylobacter jejuni. CjNC1 and CjNC4 are 37/44 nt identical and predicted to target the 5' untranslated region (5' UTR) of genes transcribed from the flagellar sigma factor σ54. Orthologs of the σ54-dependent 5' UTRs and ncRNAs are present in the genomes of other thermophilic Campylobacter species, and transcription of CjNC1 and CjNC4 is dependent on the flagellar sigma factor σ28. Surprisingly, inactivation and overexpression of CjNC1 and CjNC4 did not affect growth, motility or flagella-associated phenotypes such as autoagglutination. However, CjNC1 and CjNC4 were able to mediate sequence-dependent, but Hfq-independent, partial repression of fluorescence of predicted target 5' UTRs in an Escherichia coli-based GFP reporter gene system. This hints towards a subtle role for the CjNC1 and CjNC4 ncRNAs in post-transcriptional gene regulation in thermophilic Campylobacter species, and suggests that the currently used phenotypic methodologies are insufficiently sensitive to detect such subtle phenotypes. The lack of a role of Hfq in the E. coli GFP-based system indicates that the CjNC1 and CjNC4 ncRNAs may mediate post-transcriptional gene regulation in ways that do not conform to the paradigms obtained from the Enterobacteriaceae. PMID:26512728

  3. Balancing forest-regeneration probabilities and maintenance costs in dry grasslands of high conservation priority

    Science.gov (United States)

    Bolliger, Janine; Edwards, Thomas C.; Eggenberg, Stefan; Ismail, Sascha; Seidl, Irmi; Kienast, Felix

    2011-01-01

    Abandonment of agricultural land has resulted in forest regeneration in species-rich dry grasslands across European mountain regions and threatens conservation efforts in this vegetation type. To support national conservation strategies, we used a site-selection algorithm (MARXAN) to find optimum sets of floristic regions (reporting units) that contain grasslands of high conservation priority. We sought optimum sets that would accommodate 136 important dry-grassland species and that would minimize forest regeneration and costs of management needed to forestall predicted forest regeneration. We did not consider other conservation elements of dry grasslands, such as animal species richness, cultural heritage, and changes due to climate change. Optimal sets that included 95–100% of the dry grassland species encompassed an average of 56–59 floristic regions (standard deviation, SD 5). This is about 15% of approximately 400 floristic regions that contain dry-grassland sites and translates to 4800–5300 ha of dry grassland out of a total of approximately 23,000 ha for the entire study area. Projected costs to manage the grasslands in these optimum sets ranged from CHF (Swiss francs) 5.2 to 6.0 million/year. This is only 15–20% of the current total estimated cost of approximately CHF30–45 million/year required if all dry grasslands were to be protected. The grasslands of the optimal sets may be viewed as core sites in a national conservation strategy.
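
    Site-selection problems of this kind are weighted set-cover problems; MARXAN solves them with simulated annealing. As an illustrative stand-in (not MARXAN's algorithm), a greedy heuristic that repeatedly picks the region with the best new-species-per-cost ratio can be sketched as follows; the region names, species sets, and costs are invented:

```python
def greedy_site_selection(regions, target_species, costs):
    """Pick regions until every target species is covered, each step taking the
    region with the highest (newly covered species) / cost ratio.
    Greedy stand-in for MARXAN's simulated-annealing site selection."""
    uncovered = set(target_species)
    chosen, total_cost = [], 0.0
    while uncovered:
        best, best_ratio = None, 0.0
        for name, species in regions.items():
            if name in chosen:
                continue
            gain = len(species & uncovered)
            if gain and gain / costs[name] > best_ratio:
                best, best_ratio = name, gain / costs[name]
        if best is None:          # remaining species cannot be covered
            break
        chosen.append(best)
        total_cost += costs[best]
        uncovered -= regions[best]
    return chosen, total_cost
```

Greedy set cover carries a well-known logarithmic approximation guarantee, which is why it is a common baseline before running heavier metaheuristics.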

  4. Technical evaluation on high aging, and performance conditions on long-term conservation program

    International Nuclear Information System (INIS)

    Yamashita, Atsushi

    2001-01-01

    In order to ensure the safety and reliable operation of nuclear power plants, conservation (maintenance) actions based on preventive conservation are performed at every plant. These include operating-condition monitoring, patrol inspection, and periodic testing of important systems and apparatus by operators during plant operation, as well as condition monitoring by maintenance workers; when abnormal conditions are found, a detailed survey is performed and adequate countermeasures, such as repair or replacement, are adopted. In addition, periodic legal examinations and independent inspections are required for nuclear power generation equipment. As a result of these conservation actions, even a plant that has been in operation for about 30 years shows no indication of age-related deterioration at present, although it must be assumed to age with elapsing time. Therefore, as a concrete countermeasure, technical evaluations of aging phenomena were carried out under the assumption of long-term plant operation, to investigate how the results obtained should be reflected in present conservation actions. This paper describes the efforts on high-aging countermeasures and the status of long-term conservation at Unit No. 1 of the Tsuruga Nuclear Power Station. (G.K.)

  5. High qualitative and quantitative conservation of alternative splicing in Caenorhabditis elegans and Caenorhabditis briggsae

    DEFF Research Database (Denmark)

    Rukov, Jakob Lewin; Irimia, Manuel; Mørk, Søren

    2007-01-01

    Alternative splicing (AS) is an important contributor to proteome diversity and is regarded as an explanatory factor for the relatively low number of human genes compared with less complex animals. To assess the evolutionary conservation of AS and its developmental regulation, we have investigated the qualitative and quantitative expression of 21 orthologous alternative splice events through the development of 2 nematode species separated by 85-110 Myr of evolutionary time. We demonstrate that most of these alternative splice events present in Caenorhabditis elegans are conserved in Caenorhabditis briggsae. Moreover, we find that relative isoform expression levels vary significantly during development for 78% of the AS events and that this quantitative variation is highly conserved between the 2 species. Our results suggest that AS is generally tightly regulated through development and that the regulatory

  6. Developing a Coding Scheme to Analyse Creativity in Highly-constrained Design Activities

    DEFF Research Database (Denmark)

    Dekoninck, Elies; Yue, Huang; Howard, Thomas J.

    2010-01-01

    This work is part of a larger project which aims to investigate the nature of creativity and the effectiveness of creativity tools in highly-constrained design tasks. This paper presents the research where a coding scheme was developed and tested with a designer-researcher who conducted two rounds of design and analysis on a highly constrained design task. This paper shows how design changes can be coded using a scheme based on creative 'modes of change'. The coding scheme can show the way a designer moves around the design space, and particularly the strategies that are used by a creative designer. A larger study with more designers working on different types of highly-constrained design task is needed, in order to draw conclusions on the modes of change and their relationship to creativity.

  7. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    Science.gov (United States)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report summarizes the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report is divided into three primary sections. First, we briefly describe the system block diagram in which the proposed decoder is assumed to be operating and present some of the key architectural approaches being used to
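
    For a small first-order Reed-Muller code, soft-decision maximum-likelihood decoding can be done by exhaustive correlation, which conveys the principle behind the far more elaborate trellis-based decoder described above. The sketch below uses the tiny (8, 4, 4) code RM(1,3), not the report's (64, 40, 8) subcode: codeword bits are mapped to ±1, and the codeword with the highest correlation against the received soft values wins.

```python
from itertools import product

def rm13_codewords():
    """All 16 codewords of the first-order Reed-Muller code RM(1,3) = (8, 4, 4).
    Bit i is a0 XOR a1*i0 XOR a2*i1 XOR a3*i2, where i0,i1,i2 are the bits of i."""
    words = []
    for a0, a1, a2, a3 in product((0, 1), repeat=4):
        words.append([a0 ^ (a1 * (i & 1)) ^ (a2 * ((i >> 1) & 1)) ^ (a3 * ((i >> 2) & 1))
                      for i in range(8)])
    return words

def soft_ml_decode(soft):
    """Soft-decision maximum-likelihood decoding by exhaustive correlation:
    pick the codeword whose +/-1 image best matches the received real values."""
    best, best_corr = None, float("-inf")
    for cw in rm13_codewords():
        corr = sum(r * (1 - 2 * b) for r, b in zip(soft, cw))  # bit 0 -> +1, bit 1 -> -1
        if corr > best_corr:
            best, best_corr = cw, corr
    return best
```

Exhaustive correlation scales as 2^k and is only viable for tiny codes; sectionalized trellises, as in the report, are exactly the device that makes soft ML decoding tractable for codes the size of (64, 40, 8).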

  8. Updated tokamak systems code and applications to high-field ignition devices

    International Nuclear Information System (INIS)

    Reid, R.L.; Galambos, J.D.; Peng, Y-K.M.; Strickler, D.J.; Selcow, E.C.

    1985-01-01

    This paper describes revisions made to the Tokamak Systems Code to more accurately model high-field copper ignition devices. The major areas of revision were in the plasma physics model, the toroidal field (TF) coil model, and the poloidal field (PF) coil/MHD model. Also included in this paper are results obtained from applying the revised code to a study for a high-field copper ignition device to determine the impact of magnetic field on axis, (at the major radius), on performance, and on cost

  9. The fast decoding of Reed-Solomon codes using high-radix fermat theoretic transforms

    Science.gov (United States)

    Liu, K. Y.; Reed, I. S.; Truong, T. K.

    1976-01-01

    Fourier-like transforms over GF(F_n), where F_n = 2^(2^n) + 1 is a Fermat prime, are applied in decoding Reed-Solomon codes. It is shown that such transforms can be computed using high-radix fast Fourier transform (FFT) algorithms requiring considerably fewer multiplications than the more usual radix-2 FFT algorithm. A special 256-symbol, 16-symbol-error-correcting Reed-Solomon (RS) code for space communication-link applications can be encoded and decoded using this high-radix FFT algorithm over GF(F_3).
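
    The transform in question is a number-theoretic analogue of the DFT over GF(257), since F_3 = 2^8 + 1 = 257 and 3 is a primitive root mod 257. A naive O(n²) sketch (the paper's point is precisely to replace this with a high-radix FFT) shows the transform and its use for cyclic convolution, the core operation in transform-domain RS decoding:

```python
P = 257   # F_3 = 2**8 + 1, a Fermat prime; 3 is a primitive root mod 257

def ntt(a, invert=False):
    """Naive number-theoretic transform over GF(257); len(a) must divide 256."""
    n = len(a)
    root = pow(3, 256 // n, P)                 # element of order n
    if invert:
        root = pow(root, P - 2, P)             # inverse root via Fermat's little theorem
    out = [sum(a[j] * pow(root, i * j, P) for j in range(n)) % P
           for i in range(n)]
    if invert:
        n_inv = pow(n, P - 2, P)
        out = [x * n_inv % P for x in out]
    return out

def convolve(a, b):
    """Cyclic convolution via the transform: pointwise multiply, then invert."""
    fa, fb = ntt(a), ntt(b)
    return ntt([x * y % P for x, y in zip(fa, fb)], invert=True)
```

Because all arithmetic is exact modular arithmetic, there is no round-off error, which is one of the attractions of Fermat-prime transforms for decoding.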

  10. Genes involved in complex adaptive processes tend to have highly conserved upstream regions in mammalian genomes

    Directory of Open Access Journals (Sweden)

    Kohane Isaac

    2005-11-01

    Full Text Available Abstract Background Recent advances in genome sequencing suggest a remarkable conservation in the gene content of mammalian organisms. The similarity in the gene repertoire present in different organisms has increased interest in studying regulatory mechanisms of gene expression aimed at elucidating the differences in phenotypes. In particular, a proximal promoter region contains a large number of regulatory elements that control the expression of its downstream gene. Although many studies have focused on identification of these elements, a broader picture of the complexity of transcriptional regulation of different biological processes has not been addressed in mammals. The regulatory complexity may strongly correlate with gene function, as different evolutionary forces must act on the regulatory systems under different biological conditions. We investigate this hypothesis by comparing the conservation of promoters upstream of genes classified in different functional categories. Results By conducting a rank correlation analysis between functional annotation and upstream sequence alignment scores obtained by human-mouse and human-dog comparison, we found a significantly greater conservation of the upstream sequence of genes involved in development, cell communication, neural functions and signaling processes than of those involved in more basic processes shared with unicellular organisms, such as metabolism and ribosomal function. This observation persists after controlling for G+C content. Considering conservation as a functional signature, we hypothesize a higher density of cis-regulatory elements upstream of genes participating in complex and adaptive processes. Conclusion We identified a class of functions that are associated with either high or low promoter conservation in mammals. We detected a significant tendency for complex and adaptive processes to be associated with higher promoter conservation, despite the fact that they have emerged
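
    The rank correlation analysis mentioned in the Results is typically Spearman's rho: the Pearson correlation of the ranked values, with ties assigned their average rank. A self-contained sketch follows (the study's exact scoring pipeline is not reproduced here):

```python
def ranks(xs):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend the block of tied values
        avg = (i + j) / 2 + 1           # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho: the Pearson correlation of the rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)
```

Rank correlation is the natural choice here because alignment scores and functional-annotation orderings are monotone but not linearly related.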

  11. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. Flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flow-rate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code; its complement (FLID) is a one-channel, two-dimensional code. (authors)

  12. Crystal structure of AFV3-109, a highly conserved protein from crenarchaeal viruses

    Directory of Open Access Journals (Sweden)

    Quevillon-Cheruel Sophie

    2007-01-01

    Full Text Available Abstract The extraordinary morphologies of viruses infecting hyperthermophilic archaea clearly distinguish them from bacterial and eukaryotic viruses. Moreover, their genomes code for proteins that to a large extent have no related sequences in the extant databases. However, a small pool of genes is shared by overlapping subsets of these viruses, and the most conserved gene, exemplified by ORF109 of the Acidianus Filamentous Virus 3, AFV3, is present on the genomes of members of three viral families, the Lipothrixviridae, Rudiviridae, and "Bicaudaviridae", as well as of the unclassified Sulfolobus Turreted Icosahedral Virus, STIV. We present here the crystal structure of the protein (Mr = 13.1 kD, 109 residues) encoded by the AFV3 ORF109 in two different crystal forms at 1.5 and 1.3 Å resolution. The structure of AFV3-109 is a five-stranded β-sheet with loops on one side and three helices on the other. It forms a dimer adopting the shape of a cradle that encompasses the best conserved regions of the sequence. No protein with a related fold could be identified except for the ortholog from STIV1, whose structure was deposited at the Protein Data Bank. We could clearly identify a well-bound glycerol inside the cradle, contacting exclusively totally conserved residues. This interaction was confirmed in solution by fluorescence titration. Although the function of AFV3-109 cannot be deduced directly from its structure, structural homology with the STIV1 protein, and the size and charge distribution of the cavity, suggested it could interact with nucleic acids. Fluorescence quenching titrations also showed that AFV3-109 interacts with dsDNA. Genomic sequence analysis revealed bacterial homologs of AFV3-109 as part of putative, previously unidentified prophage sequences in some Firmicutes.

  13. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  14. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  15. Development of Multi-Scale Finite Element Analysis Codes for High Formability Sheet Metal Generation

    International Nuclear Information System (INIS)

    Nnakamachi, Eiji; Kuramae, Hiroyuki; Ngoc Tam, Nguyen; Nakamura, Yasunori; Sakamoto, Hidetoshi; Morimoto, Hideo

    2007-01-01

    In this study, dynamic- and static-explicit multi-scale finite element (F.E.) codes are developed by employing the homogenization method, the crystal-plasticity constitutive equation, and an SEM-EBSD-measurement-based polycrystal model. These can predict the crystal morphological change and the hardening evolution at the micro level, as well as the macroscopic plastic anisotropy evolution. These codes are applied to analyze the asymmetric rolling process, which is introduced to control the crystal texture of the sheet metal in order to generate a high-formability sheet metal. The codes can predict the yield surface and the sheet formability by analyzing the strain-path-dependent yield and simple sheet forming processes, such as the limit dome height (LDH) test and cylindrical deep drawing problems. The analyses show that a shear-dominant rolling process, such as asymmetric rolling, generates "high formability" textures and eventually a high-formability sheet. The texture evolution and the high formability of the newly generated sheet metal were confirmed experimentally by SEM-EBSD measurement and the LDH test. It is concluded that these explicit-type crystallographic homogenized multi-scale F.E. codes could be a comprehensive tool to predict the plasticity-induced texture evolution, anisotropy and formability in analyses of the rolling process and the limit dome height test

  16. Conservative Management for Stable High Ankle Injuries in Professional Football Players.

    Science.gov (United States)

    Knapik, Derrick M; Trem, Anthony; Sheehan, Joseph; Salata, Michael J; Voos, James E

    High ankle "syndesmosis" injuries are common in American football players relative to the general population. At the professional level, syndesmotic sprains represent a challenging and unique injury lacking a standardized rehabilitation protocol during conservative management. PubMed, Biosis Preview, SPORTDiscus, PEDro, and EMBASE databases were searched using the terms syndesmotic injuries, American football, conservative management, and rehabilitation. Clinical review. Level 3. When compared with lateral ankle sprains, syndesmosis injuries result in significantly prolonged recovery times and games lost. For stable syndesmotic injuries, conservative management features a brief period of immobilization and protected weightbearing followed by progressive strengthening exercises and running, and athletes can expect to return to competition in 2 to 6 weeks. Further research investigating the efficacy of dry needling and blood flow restriction therapy is necessary to evaluate the benefit of these techniques in the rehabilitation process. Successful conservative management of stable syndesmotic injuries in professional American football athletes requires a thorough understanding of the anatomy, injury mechanisms, diagnosis, and rehabilitation strategies utilized in elite athletes.

  17. Constrained dansyl derivatives reveal bacterial specificity of highly conserved thymidylate synthases.

    Science.gov (United States)

    Calò, Sanuele; Tondi, Donatella; Ferrari, Stefania; Venturelli, Alberto; Ghelli, Stefano; Costi, Maria Paola

    2008-03-25

    The elucidation of the structural/functional specificities of highly conserved enzymes remains a challenging area of investigation, and enzymes involved in cellular replication are important targets for functional studies and drug discovery. Thymidylate synthase (TS, ThyA) governs the synthesis of thymidylate for use in DNA synthesis. The present study focused on Lactobacillus casei TS (LcTS) and Escherichia coli TS (EcTS), which exhibit 50% sequence identity and strong folding similarity. We have successfully designed and validated a chemical model in which linear, but not constrained, dansyl derivatives specifically complement the LcTS active site. Conversely, chemically constrained dansyl derivatives showed up to 1000-fold improved affinity for EcTS relative to the inhibitory activity of linear derivatives. This study demonstrates that the accurate design of small ligands can uncover functional features of highly conserved enzymes.

  18. Enhancing Conservation with High Resolution Productivity Datasets for the Conterminous United States

    Science.gov (United States)

    Robinson, Nathaniel Paul

    Human driven alteration of the earth's terrestrial surface is accelerating through land use changes, intensification of human activity, climate change, and other anthropogenic pressures. These changes occur at broad spatio-temporal scales, challenging our ability to effectively monitor and assess the impacts and subsequent conservation strategies. While satellite remote sensing (SRS) products enable monitoring of the earth's terrestrial surface continuously across space and time, the practical applications for conservation and management of these products are limited. Often the processes driving ecological change occur at fine spatial resolutions and are undetectable given the resolution of available datasets. Additionally, the links between SRS data and ecologically meaningful metrics are weak. Recent advances in cloud computing technology along with the growing record of high resolution SRS data enable the development of SRS products that quantify ecologically meaningful variables at relevant scales applicable for conservation and management. The focus of my dissertation is to improve the applicability of terrestrial gross and net primary productivity (GPP/NPP) datasets for the conterminous United States (CONUS). In chapter one, I develop a framework for creating high resolution datasets of vegetation dynamics. I use the entire archive of Landsat 5, 7, and 8 surface reflectance data and a novel gap filling approach to create spatially continuous 30 m, 16-day composites of the normalized difference vegetation index (NDVI) from 1986 to 2016. In chapter two, I integrate this with other high resolution datasets and the MOD17 algorithm to create the first high resolution GPP and NPP datasets for CONUS. I demonstrate the applicability of these products for conservation and management, showing the improvements beyond currently available products. In chapter three, I utilize this dataset to evaluate the relationships between land ownership and terrestrial production

  19. Entropy Viscosity Method for High-Order Approximations of Conservation Laws

    KAUST Repository

    Guermond, J. L.

    2010-09-17

    A stabilization technique for conservation laws is presented. It introduces into the governing equations a nonlinear dissipation that is a function of the residual of the associated entropy equation and is bounded from above by a first-order viscous term. Different two-dimensional test cases are simulated - a 2D Burgers problem, the "KPP rotating wave", and the Euler system - using high-order methods: spectral elements or Fourier expansions. Details on the tuning of the parameters controlling the entropy viscosity are given. © 2011 Springer.
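    The idea above - an artificial viscosity driven by the entropy residual but capped by a first-order viscous term - can be sketched for the 1D Burgers equation as follows. This is a minimal finite-difference illustration, not the authors' spectral implementation; the tuning constants c_e and c_max are placeholders for the parameters the abstract refers to.

    ```python
    import numpy as np

    def entropy_viscosity(u_new, u_old, dx, dt, c_e=1.0, c_max=0.5):
        """Per-cell artificial viscosity for the 1D Burgers equation.

        Entropy pair E(u) = u^2/2, F(u) = u^3/3.  The entropy residual
        R = dE/dt + dF/dx vanishes in smooth regions and spikes at shocks,
        so the entropy viscosity c_e*h^2*|R| acts only where it is needed;
        the first-order viscosity c_max*h*|u| is the upper bound.
        """
        dEdt = (0.5 * u_new**2 - 0.5 * u_old**2) / dt
        dFdx = np.gradient(u_new**3 / 3.0, dx)
        residual = np.abs(dEdt + dFdx)
        e = 0.5 * u_new**2
        norm = np.max(np.abs(e - e.mean())) + 1e-12   # normalization of R
        nu_entropy = c_e * dx**2 * residual / norm
        nu_first_order = c_max * dx * np.abs(u_new)
        return np.minimum(nu_entropy, nu_first_order)
    ```

    In a spectral-element or Fourier solver this viscosity would be recomputed each time step and inserted as a dissipative term in the discretized equation.
    
    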

  20. Entropy Viscosity Method for High-Order Approximations of Conservation Laws

    KAUST Repository

    Guermond, J. L.; Pasquetti, R.

    2010-01-01

    A stabilization technique for conservation laws is presented. It introduces into the governing equations a nonlinear dissipation that is a function of the residual of the associated entropy equation and is bounded from above by a first-order viscous term. Different two-dimensional test cases are simulated - a 2D Burgers problem, the "KPP rotating wave", and the Euler system - using high-order methods: spectral elements or Fourier expansions. Details on the tuning of the parameters controlling the entropy viscosity are given. © 2011 Springer.

  1. Two dimensional code for modeling of high ion cyclotron harmonic fast wave heating and current drive

    International Nuclear Information System (INIS)

    Grekov, D.; Kasilov, S.; Kernbichler, W.

    2016-01-01

    A two-dimensional numerical code for computation of the electromagnetic field of a fast magnetosonic wave in a tokamak at high harmonics of the ion cyclotron frequency has been developed. The code computes the finite difference solution of the Maxwell equations for separate toroidal harmonics, making use of the toroidal symmetry of tokamak plasmas. Proper boundary conditions are prescribed at a realistic tokamak vessel. The currents in the RF antenna are specified externally and then used in Ampère's law. The main poloidal tokamak magnetic field and the "kinetic" part of the dielectric permeability tensor are treated iteratively. The code has been verified against known analytical solutions, and first calculations of current drive in the spherical torus are presented.

  2. High-speed architecture for the decoding of trellis-coded modulation

    Science.gov (United States)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher-level modulation (a non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been interest in increasing the speed of the Viterbi algorithm by improving the decoder architecture or by simplifying the algorithm itself. Designs employing new architectural techniques now exist; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic-gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
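    As background for the architectures discussed, a plain software Viterbi decoder can be sketched as follows. The trellis tables and the (7,5) convolutional code used in the test are illustrative assumptions, not the report's hardware design.

    ```python
    def viterbi(observations, n_states, next_state, output, metric):
        """Hard-decision Viterbi decoding (sketch): find the minimum-metric
        path through a trellis defined by next_state[s][b] and output[s][b]
        for input bit b, starting from state 0."""
        INF = float("inf")
        pm = [0.0] + [INF] * (n_states - 1)       # path metrics
        paths = [[]] + [None] * (n_states - 1)    # surviving input sequences
        for obs in observations:
            new_pm = [INF] * n_states
            new_paths = [None] * n_states
            for s in range(n_states):
                if pm[s] == INF:
                    continue
                for b in (0, 1):                  # add-compare-select step
                    ns = next_state[s][b]
                    m = pm[s] + metric(obs, output[s][b])
                    if m < new_pm[ns]:
                        new_pm[ns] = m
                        new_paths[ns] = paths[s] + [b]
            pm, paths = new_pm, new_paths
        return paths[min(range(n_states), key=pm.__getitem__)]
    ```

    The inner survivor-selection loop is the add-compare-select operation whose parallelization is the usual target of high-speed decoder architectures.
    
    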

  3. Modification in the FUDA computer code to predict fuel performance at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Das, M; Arunakumar, B V; Prasad, P N [Nuclear Power Corp., Mumbai (India)

    1997-08-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burnups up to 50000 MWD/TeU. The validation field of the code has been extended to the prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out, accounting for hourglassing of the pellet, by a finite element technique. (author). 15 refs, 1 fig.

  4. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Science.gov (United States)

    Nawawi, N. M.; Anuar, M. S.; Junita, M. N.; Rashidi, C. B. M.

    2017-11-01

    In this paper, we have proposed a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we also investigate the relevant design parameters: effective power, number of users, number of bands, code length, and code weight. We then analyze the system performance theoretically and comprehensively, considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared with a traditional optical CDMA system, with a code length of 80, transmitted at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users. As the code weight increases, the BER performance improves.

  5. High performance mixed optical CDMA system using ZCC code and multiband OFDM

    Directory of Open Access Journals (Sweden)

    Nawawi N. M.

    2017-01-01

    Full Text Available In this paper, we have proposed a high-performance network design based on a mixed optical Code Division Multiple Access (CDMA) system using a Zero Cross Correlation (ZCC) code and multiband Orthogonal Frequency Division Multiplexing (OFDM), called catenated OFDM. In addition, we also investigate the relevant design parameters: effective power, number of users, number of bands, code length, and code weight. We then analyze the system performance theoretically and comprehensively, considering up to five OFDM bands. The feasibility of the proposed system architecture is verified via numerical analysis. The results demonstrate that the developed modulation solution can significantly increase the total number of users, by up to 80% for five catenated bands compared with a traditional optical CDMA system, with a code length of 80, transmitted at 622 Mbps. It is also demonstrated that the BER performance depends strongly on the code weight, especially with fewer users. As the code weight increases, the BER performance improves.

  6. Modification in the FUDA computer code to predict fuel performance at high burnup

    International Nuclear Information System (INIS)

    Das, M.; Arunakumar, B.V.; Prasad, P.N.

    1997-01-01

    The computer code FUDA (FUel Design Analysis) participated in the blind exercises organized by the IAEA CRP (Co-ordinated Research Programme) on FUMEX (Fuel Modelling at Extended Burnup). While the code predictions compared well with the experiments at Halden under various parametric and operating conditions, the fission gas release and fission gas pressure were found to be slightly over-predicted, particularly at high burnups. In view of the results of the 6 FUMEX cases, the main models and submodels of the code were reviewed and necessary improvements were made. The new version of the code, FUDA MOD 2, is now able to predict fuel performance parameters for burnups up to 50000 MWD/TeU. The validation field of the code has been extended to the prediction of thorium oxide fuel performance. An analysis of local deformations at pellet interfaces and near the end caps is carried out, accounting for hourglassing of the pellet, by a finite element technique. (author). 15 refs, 1 fig

  7. High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique

    Science.gov (United States)

    Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan

    2015-10-01

    In this paper, we propose a high-performance optical encryption (OE) scheme based on computational ghost imaging (GI) with a QR code and a compressive sensing (CS) technique, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, are the secret key and are shared with the authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. The measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image using the GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. With the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
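    The encrypt/decrypt roles described above can be illustrated with a toy computational-GI simulation. This sketch uses random intensity patterns instead of phase screens and plain correlation instead of the paper's CS reconstruction; all function names are illustrative.

    ```python
    import numpy as np

    def gi_encrypt(image, n_patterns, rng):
        """Computational GI 'encryption' (toy): the random patterns are the
        key shared with Bob; the bucket-detector values are the ciphertext."""
        patterns = rng.random((n_patterns,) + image.shape)         # key
        buckets = np.array([(p * image).sum() for p in patterns])  # bucket signal
        return patterns, buckets

    def gi_decrypt(patterns, buckets):
        """Correlation reconstruction <b*P> - <b><P>: recovers the image up
        to scale and offset.  A CS solver would need far fewer measurements."""
        return (buckets[:, None, None] * patterns).mean(axis=0) \
            - buckets.mean() * patterns.mean(axis=0)
    ```

    Without the correct patterns the bucket values correlate with nothing, which is the intuition behind the eavesdropping resistance reported in the abstract.
    
    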

  8. Load balancing in highly parallel processing of Monte Carlo code for particle transport

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Takemiya, Hiroshi; Kawasaki, Takuji

    1998-01-01

    In parallel processing of Monte Carlo (MC) codes for neutron, photon, and electron transport problems, particle histories are assigned to processors, making use of the independence of the calculation for each particle. Although the main part of an MC code is easily parallelized by this method, it is necessary, and practically difficult, to optimize the code with respect to load balancing in order to attain a high speedup ratio in highly parallel processing. In fact, the speedup ratio with 128 processors remains at only about one hundred on the test bed used for the performance evaluation. Through parallel processing of the MCNP code, which is widely used in the nuclear field, it is shown that it is difficult to attain high performance by static load balancing, especially in neutron transport problems. A load balancing method that dynamically changes the number of assigned particles so as to minimize the sum of the computational and communication costs overcomes the difficulty, resulting in a reduction of nearly fifteen percent in execution time. (author)
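    A simplified form of such dynamic balancing - reassigning histories in proportion to each processor's measured throughput, rather than the paper's full minimization of computation plus communication cost - might look like:

    ```python
    def rebalance(counts, seconds, total):
        """Choose the next batch's particle-history assignment from the
        previous batch's measurements.  `counts[i]` histories took
        `seconds[i]` on processor i; `total` histories are redistributed
        in proportion to measured speed (histories per second).  This is
        a simplified heuristic, not the paper's cost model."""
        speeds = [c / t for c, t in zip(counts, seconds)]
        share = sum(speeds)
        new = [int(total * s / share) for s in speeds]
        new[speeds.index(max(speeds))] += total - sum(new)  # rounding remainder
        return new
    ```

    A slow processor thus automatically receives fewer histories in the next batch, which is the behavior static balancing cannot provide.
    
    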

  9. SimProp: a simulation code for ultra high energy cosmic ray propagation

    International Nuclear Information System (INIS)

    Aloisio, R.; Grillo, A.F.; Boncioli, D.; Petrera, S.; Salamida, F.

    2012-01-01

    A new Monte Carlo simulation code for the propagation of Ultra High Energy Cosmic Rays is presented. The results of this simulation scheme are tested by comparison with results of another Monte Carlo computation as well as with the results obtained by directly solving the kinetic equation for the propagation of Ultra High Energy Cosmic Rays. A short comparison with the latest flux published by the Pierre Auger collaboration is also presented

  10. ORBIT: A CODE FOR COLLECTIVE BEAM DYNAMICS IN HIGH INTENSITY RINGS

    International Nuclear Information System (INIS)

    HOLMES, J.A.; DANILOV, V.; GALAMBOS, J.; SHISHLO, A.; COUSINEAU, S.; CHOU, W.; MICHELOTTI, L.; OSTIGUY, J.F.; WEI, J.

    2002-01-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings

  11. ORBIT: A Code for Collective Beam Dynamics in High-Intensity Rings

    Science.gov (United States)

    Holmes, J. A.; Danilov, V.; Galambos, J.; Shishlo, A.; Cousineau, S.; Chou, W.; Michelotti, L.; Ostiguy, J.-F.; Wei, J.

    2002-12-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings.

  12. ORBIT: A code for collective beam dynamics in high-intensity rings

    International Nuclear Information System (INIS)

    Holmes, J.A.; Danilov, V.; Galambos, J.; Shishlo, A.; Cousineau, S.; Chou, W.; Michelotti, L.; Ostiguy, J.-F.; Wei, J.

    2002-01-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings

  13. High-Penetration Photovoltaics Standards and Codes Workshop, Denver, Colorado, May 20, 2010: Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, M.; Kroposki, B.; Basso, T.; Lynn, K.; Herig, C.; Bower, W.

    2010-09-01

    Effectively interconnecting high penetration levels of photovoltaic (PV) systems requires careful technical attention to ensuring compatibility with electric power systems. Standards, codes, and their implementation have been cited as major impediments to widespread use of PV within electric power systems. On May 20, 2010, in Denver, Colorado, the National Renewable Energy Laboratory, in conjunction with the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), held a workshop to examine the key technical issues and barriers associated with high PV penetration levels, with an emphasis on codes and standards. The workshop built upon the results of the High Penetration of Photovoltaic (PV) Systems into the Distribution Grid workshop held in Ontario, California, on February 24-25, 2009, and upon presentations from a diverse group of stakeholders.

  14. The conservation pattern of short linear motifs is highly correlated with the function of interacting protein domains

    Directory of Open Access Journals (Sweden)

    Wang Yiguo

    2008-10-01

    Full Text Available Abstract Background: Many well-represented domains recognize primary sequences usually less than 10 amino acids in length, called Short Linear Motifs (SLiMs). Accurate prediction of SLiMs has been difficult because they are short (often fewer than 10 amino acids). Results: Our combined approach revealed that SLiMs are highly conserved in proteins from functional classes that are known to interact with a specific domain, but that they are not conserved in most other protein groups. We found that SLiMs recognized by SH2 domains were highly conserved in receptor kinases/phosphatases, adaptor molecules, and tyrosine kinases/phosphatases; that SLiMs recognized by SH3 domains were highly conserved in cytoskeletal and cytoskeletal-associated proteins; that SLiMs recognized by PDZ domains were highly conserved in membrane proteins such as channels and receptors; and that SLiMs recognized by S/T kinase domains were highly conserved in adaptor molecules, S/T kinases/phosphatases, and proteins involved in transcription or cell cycle control. We studied Tyr-SLiMs recognized by SH2 domains in more detail, and found that SH2-recognized Tyr-SLiMs on the cytoplasmic side of membrane proteins are more highly conserved than those on the extracellular side. Also, we found that SH2-recognized Tyr-SLiMs that are associated with SH3 motifs and a tyrosine kinase phosphorylation motif are more highly conserved. Conclusion: The interactome of protein domains is reflected by the evolutionary conservation of the SLiMs recognized by these domains. Combining scoring matrices derived from peptide libraries with conservation analysis, we would be able to find those protein groups that are more likely to interact with specific domains.

  15. Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes

    Science.gov (United States)

    Schreier, Franz; Milz, Mathias; Buehler, Stefan A.; von Clarmann, Thomas

    2018-05-01

    An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric radiative transfer and remote sensing - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the 19 HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. The mutual differences of the equivalent brightness temperatures are presented and possible causes of disagreement are discussed. In particular, the impact of path integration schemes and atmospheric layer discretization is assessed. When the continuum absorption contribution is ignored because of the different implementations, residuals are generally in the sub-Kelvin range and smaller than 0.1 K for some window channels (and all atmospheric models and lbl codes). None of the three codes turned out to be perfect for all channels and atmospheres. Remaining discrepancies are attributed to different lbl optimization techniques. Lbl codes seem to have reached such maturity in the implementation of radiative transfer that the choice of the underlying physical models (line shape models, continua, etc.) becomes increasingly relevant.
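    The equivalent brightness temperatures compared above come from inverting the Planck function at a channel's frequency. A minimal monochromatic sketch in SI frequency units (ignoring the channel spectral response, which a real comparison would convolve in):

    ```python
    import math

    H = 6.62607015e-34   # Planck constant [J s]
    KB = 1.380649e-23    # Boltzmann constant [J/K]
    C = 2.99792458e8     # speed of light [m/s]

    def planck_radiance(temp, nu):
        """Planck spectral radiance B_nu(T) in W m^-2 sr^-1 Hz^-1."""
        return 2.0 * H * nu**3 / C**2 / math.expm1(H * nu / (KB * temp))

    def brightness_temperature(radiance, nu):
        """Equivalent brightness temperature: invert B_nu(T) for T."""
        return (H * nu / KB) / math.log1p(2.0 * H * nu**3 / (C**2 * radiance))
    ```

    Because the inversion is exact, sub-Kelvin residuals between codes translate directly into small relative radiance differences in the thermal infrared.
    
    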

  16. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

    In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model; the legacy "chicken-and-egg" dilemma in video coding is overcome by this learning-based R-D model. Second, a cooperative bargaining game based on the mixed R-D model is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra-frame QP and the inter-frame adaptive bit ratios are adjusted so that inter frames have more bit resources, to maintain smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT-based RC method achieves much better R-D performance, quality smoothness, bit rate accuracy, buffer control results, and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limit of the FixedQP method.

  17. Towards high dynamic range extensions of HEVC: subjective evaluation of potential coding technologies

    Science.gov (United States)

    Hanhart, Philippe; Řeřábek, Martin; Ebrahimi, Touradj

    2015-09-01

    This paper reports the details and results of the subjective evaluations conducted at EPFL to evaluate the responses to the Call for Evidence (CfE) for High Dynamic Range (HDR) and Wide Color Gamut (WCG) Video Coding issued by the Moving Picture Experts Group (MPEG). The CfE on HDR/WCG Video Coding aims to explore whether the coding efficiency and/or the functionality of the current version of the HEVC standard can be significantly improved for HDR and WCG content. In total, nine submissions, five for Category 1 and four for Category 3a, were compared to the HEVC Main 10 Profile based Anchor. In particular, five HDR video contents, compressed at four bit rates by each proponent responding to the CfE, were used in the subjective evaluations. Further, a side-by-side presentation methodology was used in the subjective experiment to discriminate small differences between the Anchor and the proponents. The subjective results show that the proposals provide evidence that coding efficiency can be improved in a statistically noticeable way over the MPEG CfE Anchors in terms of perceived quality within the investigated content. The paper further benchmarks the selected objective metrics based on their correlations with the subjective ratings. It is shown that PSNR-DE1000, HDR-VDP-2, and PSNR-Lx can reliably detect visible differences between the proposed encoding solutions and the current HEVC standard.

  18. Quasi Cyclic Low Density Parity Check Code for High SNR Data Transfer

    Directory of Open Access Journals (Sweden)

    M. R. Islam

    2010-06-01

    Full Text Available An improved Quasi Cyclic Low Density Parity Check code (QC-LDPC) is proposed to reduce the complexity of the Low Density Parity Check code (LDPC) while obtaining similar performance. The proposed QC-LDPC presents an improved construction at high SNR with circulant sub-matrices. The proposed construction yields a performance gain of about 1 dB at a bit error rate (BER) of 0.0003, and it is tested on 4 different decoding algorithms. The proposed QC-LDPC is compared with the existing QC-LDPC, and the simulation results show that the proposed approach outperforms the existing one at high SNR. Simulations are also performed varying the number of horizontal sub-matrices, and the results show that the parity check matrix with smaller horizontal concatenation shows better performance.
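    The circulant sub-matrix structure behind the complexity reduction of QC-LDPC codes can be sketched as follows; the base matrix of shift values in the test is illustrative, not the paper's construction.

    ```python
    import numpy as np

    def circulant(p, shift):
        """p x p identity matrix cyclically shifted right by `shift` columns."""
        return np.roll(np.eye(p, dtype=int), shift, axis=1)

    def qc_ldpc_H(shifts, p):
        """Assemble a quasi-cyclic parity-check matrix from a base matrix
        of shift values; -1 denotes an all-zero p x p block.  Only the
        shift values need to be stored, which is the source of the
        complexity reduction relative to a random LDPC matrix."""
        return np.block([[np.zeros((p, p), dtype=int) if s < 0 else circulant(p, s)
                          for s in row] for row in shifts])
    ```

    Each non-zero block is a permutation matrix, so row and column weights of H are read off directly from the base matrix.
    
    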

  19. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    International Nuclear Information System (INIS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization specific to a particular application field, its use can also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations, or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements. (paper)
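    At its core, a line-by-line computation sums individual line profiles from a transition database onto a fine spectral grid. A minimal sketch with pure Lorentzian profiles (production lbl codes typically use Voigt profiles and database line parameters; the function and arguments below are illustrative):

    ```python
    import numpy as np

    def absorption_spectrum(grid, centers, strengths, gammas):
        """Absorption coefficient on a fine spectral grid as a sum of
        Lorentzian lines: S * (g/pi) / ((nu - nu0)^2 + g^2).
        Each line integrates to its strength S over the full axis."""
        k = np.zeros_like(grid)
        for nu0, s, g in zip(centers, strengths, gammas):
            k += s * (g / np.pi) / ((grid - nu0) ** 2 + g ** 2)
        return k
    ```

    The cost of the naive double loop over lines and grid points is what motivates the optimization and parallelization work mentioned in the abstract.
    
    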

  20. Detailed description and user's manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high-burnup fuel behavior in normal operation and power transient conditions. In the high-burnup region, phenomena occur which differ qualitatively from those expected by extrapolating behavior in the mid-burnup region. To analyze these phenomena, EXBURN-I has been formed by incorporating new models, such as pellet thermal conductivity change, a burnup-dependent FP gas release rate, and cladding oxide layer growth, into the basic structure of the low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, its models, and the material properties used. It also includes a detailed input manual, sample output, etc. (author). 55 refs.

  1. Species Richness and Community Structure on a High Latitude Reef: Implications for Conservation and Management

    Directory of Open Access Journals (Sweden)

    Wayne Houston

    2011-07-01

    Full Text Available In spite of the wealth of research on the Great Barrier Reef, few detailed biodiversity assessments of its inshore coral communities have been conducted. Effective conservation and management of marine ecosystems begins with fine-scale biophysical assessments focused on diversity and the architectural species that build the structural framework of the reef. In this study, we investigate key coral diversity and environmental attributes of an inshore reef system surrounding the Keppel Bay Islands near Rockhampton in Central Queensland, Australia, and assess their implications for conservation and management. The Keppels has much higher coral diversity than previously found. The average species richness for the 19 study sites was ~40, with representatives from 68% of the ~244 species previously described for the southern Great Barrier Reef. Using scleractinian coral species richness, taxonomic distinctiveness and coral cover as the main criteria, we found that five out of 19 sites had particularly high conservation value. A further site was also considered to be of relatively high value. Corals at this site were taxonomically distinct from the others (representatives of two families were found here but not at other sites), and a wide range of functionally diverse taxa were present. This site was associated with more stressful conditions such as high temperatures and turbidity. Highly diverse coral communities or biodiversity ‘hotspots’ and taxonomically distinct reefs may act as insurance policies for climatic disturbance, much like Noah’s Arks for reefs. While improving water quality and limiting anthropogenic impacts are clearly important management initiatives to improve the long-term outlook for inshore reefs, identifying, mapping and protecting these coastal ‘refugia’ may be the key for ensuring their regeneration against catastrophic climatic disturbance in the meantime.

  2. Impact of local empowerment on conservation practices in a highly developed country

    OpenAIRE

    Engen, Sigrid; Hausner, Vera Helene

    2017-01-01

    Source at http://dx.doi.org/10.1111/conl.12369 Community-based conservation, where local decision makers are responsible for balancing conservation and development, is often preferred to exclusionary conservation that prioritizes use-limitation through strict regulation. Unraveling the evidence for conservation impact of different governance regimes is challenging. Focusing on conservation practices before and after a reform can provide an early indication of behaviora...

  3. Use of ancient sedimentary DNA as a novel conservation tool for high-altitude tropical biodiversity.

    Science.gov (United States)

    Boessenkool, Sanne; McGlynn, Gayle; Epp, Laura S; Taylor, David; Pimentel, Manuel; Gizaw, Abel; Nemomissa, Sileshi; Brochmann, Christian; Popp, Magnus

    2014-04-01

    Conservation of biodiversity may in the future increasingly depend upon the availability of scientific information to set suitable restoration targets. In traditional paleoecology, sediment-based pollen provides a means to define preanthropogenic impact conditions, but problems in establishing the exact provenance and ecologically meaningful levels of taxonomic resolution of the evidence are limiting. We explored the extent to which the use of sedimentary ancient DNA (sedaDNA) may complement pollen data in reconstructing past alpine environments in the tropics. We constructed a record of afro-alpine plants retrieved from DNA preserved in sediment cores from 2 volcanic crater sites in the Albertine Rift, eastern Africa. The record extended well beyond the onset of substantial anthropogenic effects on tropical mountains. To ensure high-quality taxonomic inference from the sedaDNA sequences, we built an extensive DNA reference library covering the majority of the afro-alpine flora, by sequencing DNA from taxonomically verified specimens. Comparisons with pollen records from the same sediment cores showed that plant diversity recovered with sedaDNA improved vegetation reconstructions based on pollen records by revealing both additional taxa and providing increased taxonomic resolution. Furthermore, combining the 2 measures assisted in distinguishing vegetation change at different geographic scales; sedaDNA almost exclusively reflects local vegetation, whereas pollen can potentially originate from a wide area that in highlands in particular can span several ecozones. Our results suggest that sedaDNA may provide information on restoration targets and the nature and magnitude of human-induced environmental changes, including in high conservation priority, biodiversity hotspots, where understanding of preanthropogenic impact (or reference) conditions is highly limited. © 2013 Society for Conservation Biology.

  4. Representing high-dimensional data to intelligent prostheses and other wearable assistive robots: A first comparison of tile coding and selective Kanerva coding.

    Science.gov (United States)

    Travnik, Jaden B; Pilarski, Patrick M

    2017-07-01

    Prosthetic devices have advanced in their capabilities and in the number and type of sensors included in their design. As the space of sensorimotor data available to a conventional or machine learning prosthetic control system increases in dimensionality and complexity, it becomes increasingly important that this data be represented in a useful and computationally efficient way. Well-structured sensory data allows prosthetic control systems to make informed, appropriate control decisions. In this study, we explore the impact that increased sensorimotor information has on current machine learning prosthetic control approaches. Specifically, we examine the effect that high-dimensional sensory data has on the computation time and prediction performance of a true-online temporal-difference learning prediction method as embedded within a resource-limited upper-limb prosthesis control system. We present results comparing tile coding, the dominant linear representation for real-time prosthetic machine learning, with a newly proposed modification to Kanerva coding that we call selective Kanerva coding. In addition to showing promising results for selective Kanerva coding, our results confirm potential limitations to tile coding as the number of sensory input dimensions increases. To our knowledge, this study is the first to explicitly examine representations for real-time machine learning prosthetic devices in general terms. This work therefore provides an important step towards forming an efficient prosthesis-eye view of the world, wherein prompt and accurate representations of high-dimensional data may be provided to machine learning control systems within artificial limbs and other assistive rehabilitation technologies.
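    The two representations compared above can be sketched in a few lines. The following toy Python code is an illustration under assumed parameters (the tiling counts, the 1-D prototype set, and the distance metric are invented here, not taken from the paper): tile coding activates exactly one tile per offset tiling, while selective Kanerva coding activates only the k prototypes nearest the input.

```python
import random

def tile_coding(x, n_tilings=8, tiles_per_dim=10, lo=0.0, hi=1.0):
    """Return active feature indices: one tile per tiling, each tiling offset."""
    indices = []
    width = (hi - lo) / tiles_per_dim
    for t in range(n_tilings):
        offset = t * width / n_tilings           # shift each tiling slightly
        tile = int((x - lo + offset) / width)
        tile = min(tile, tiles_per_dim)          # clamp the shifted edge tile
        indices.append(t * (tiles_per_dim + 1) + tile)
    return indices

def selective_kanerva(x, prototypes, k=3):
    """Activate only the k prototypes nearest to x (the 'selective' step)."""
    ranked = sorted(range(len(prototypes)), key=lambda i: abs(prototypes[i] - x))
    return ranked[:k]

random.seed(0)
protos = [random.random() for _ in range(50)]    # 1-D prototypes for illustration
```

    Both produce a sparse binary feature vector, but the Kanerva variant's memory cost grows with the number of prototypes rather than exponentially with the input dimension, which is the motivation for studying it on high-dimensional sensor data.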

  5. Proteome-wide mapping of the Drosophila acetylome demonstrates a high degree of conservation of lysine acetylation

    DEFF Research Database (Denmark)

    Weinert, Brian T; Wagner, Sebastian A; Horn, Heiko

    2011-01-01

    Posttranslational modification of proteins by acetylation and phosphorylation regulates most cellular processes in living organisms. Surprisingly, the evolutionary conservation of phosphorylated serine and threonine residues is only marginally higher than that of unmodified serines and threonines....... With high-resolution mass spectrometry, we identified 1981 lysine acetylation sites in the proteome of Drosophila melanogaster. We used data sets of experimentally identified acetylation and phosphorylation sites in Drosophila and humans to analyze the evolutionary conservation of these modification sites...... between flies and humans. Site-level conservation analysis revealed that acetylation sites are highly conserved, significantly more so than phosphorylation sites. Furthermore, comparison of lysine conservation in Drosophila and humans with that in nematodes and zebrafish revealed that acetylated lysines...

  6. Stand-alone front-end system for high- frequency, high-frame-rate coded excitation ultrasonic imaging.

    Science.gov (United States)

    Park, Jinhyoung; Hu, Changhong; Shung, K Kirk

    2011-12-01

    A stand-alone front-end system for high-frequency coded excitation imaging was implemented to achieve a wider dynamic range. The system included an arbitrary waveform amplifier, an arbitrary waveform generator, an analog receiver, a motor position interpreter, a motor controller and power supplies. The digitized arbitrary waveforms at a sampling rate of 150 MHz could be programmed and converted to an analog signal. The pulse was subsequently amplified to excite an ultrasound transducer, and the maximum output voltage level achieved was 120 V(pp). The bandwidth of the arbitrary waveform amplifier was from 1 to 70 MHz. The noise figure of the preamplifier was less than 7.7 dB and the bandwidth was 95 MHz. Phantoms and biological tissues were imaged at a frame rate as high as 68 frames per second (fps) to evaluate the performance of the system. During the measurement, 40-MHz lithium niobate (LiNbO(3)) single-element lightweight (<0.28 g) transducers were utilized. The wire target measurement showed that the -6-dB axial resolution of a chirp-coded excitation was 50 μm and lateral resolution was 120 μm. The echo signal-to-noise ratios were found to be 54 and 65 dB for the short burst and coded excitation, respectively. The contrast resolution in a sphere phantom study was estimated to be 24 dB for the chirp-coded excitation and 15 dB for the short burst modes. In an in vivo study, zebrafish and mouse hearts were imaged. Boundaries of the zebrafish heart in the image could be differentiated because of the low-noise operation of the implemented system. In mouse heart images, valves and chambers could be readily visualized with the coded excitation.
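    The chirp-coded excitation described above sweeps the pulse frequency over its duration before amplification. A minimal sketch of generating such a linear chirp follows; only the 150 MHz sampling rate comes from the paper, while the 20-60 MHz sweep and 1 μs duration are invented for illustration.

```python
import math

def linear_chirp(f0, f1, duration, fs):
    """Samples of a linear frequency sweep from f0 to f1 Hz over 'duration'
    seconds, sampled at rate fs."""
    n = round(duration * fs)
    k = (f1 - f0) / duration                      # sweep rate in Hz/s
    out = []
    for i in range(n):
        t = i / fs
        phase = 2.0 * math.pi * (f0 * t + 0.5 * k * t * t)
        out.append(math.sin(phase))
    return out

# Illustrative parameters: a 1 us chirp sweeping 20-60 MHz at 150 MHz sampling.
pulse = linear_chirp(20e6, 60e6, 1e-6, 150e6)
```

    On reception, such a pulse is compressed with a matched filter, which is what buys the reported gain in echo signal-to-noise ratio over a short burst.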

  7. A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding

    Science.gov (United States)

    Simon, M. K.; Divsalar, D.

    2001-01-01

    Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.

  8. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Science.gov (United States)

    Stewart, Kyle R.; Maller, Ariyeh H.; Oñorbe, Jose; Bullock, James S.; Joung, M. Ryan; Devriendt, Julien; Ceverino, Daniel; Kereš, Dušan; Hopkins, Philip F.; Faucher-Giguère, Claude-André

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ~4 times more specific angular momentum in cold halo gas (λ_cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  9. Earth Experiments in a Virtual World: Introducing Climate & Coding to High School Girls

    Science.gov (United States)

    Singh, H. A.; Twedt, J. R.

    2017-12-01

    In our increasingly technologically-driven and information-saturated world, literacy in STEM fields can be crucial for career advancement. Nevertheless, both systemic and interpersonal barriers can prevent individuals, particularly members of under-represented groups, from engaging in these fields. Here, we present a high school-level workshop developed to foster basic understanding of climate science while exposing students to the Python programming language. For the past four years, the workshop has been a part of the annual Expanding Your Horizons conference for high school girls, whose mission is to spark interest in STEM fields. Moving through current events in the realm of global climate policy, the fundamentals of climate, and the mathematical representation of planetary energy balance, the workshop culminates in an under-the-hood exploration of a basic climate model coded in the Python programming language. Students interact directly with the underlying code to run `virtual world' experiments that explore the impact of solar insolation, planetary albedo, the greenhouse effect, and meridional energy transport on global temperatures. Engagement with Python is through the Jupyter Notebook interface, which permits direct interaction with the code but is more user-friendly for beginners than a command-line approach. We conclude with further ideas for providing online access to workshop materials for educators, and additional venues for presenting such workshops to under-represented groups in STEM.

  10. Single stock dynamics on high-frequency data: from a compressed coding perspective.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    Full Text Available High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS, and then are coupled together to reveal a single stock dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force for stimulating both the aggregations of large trading volumes and transaction numbers. The state of system-wise synchrony is manifested with very frequent recurrence in the stock dynamics. And this data-driven dynamic mechanism is seen to correspondingly vary as the global market transits in and out of contraction-expansion cycles. These results not only elaborate the stock dynamics of interest to a fuller extent, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially when the current financial market is increasingly powered by high-frequency trading via computer algorithms, rather than by individual investors.
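    As one plausible illustration of coupling the three series into a base-8 code (this is an invented simplification, not the HFS algorithm itself, which segments each series nonparametrically before coupling), a digit per time point can be formed from three binary extreme-event flags:

```python
def base8_code(returns, volumes, counts, q=0.9):
    """Base-8 digit per time point from three binary extreme-event flags:
    bit 2 = |return|, bit 1 = volume, bit 0 = transaction count, each flag
    set when the value exceeds its empirical q-quantile. (Illustration only;
    HFS derives the states nonparametrically.)"""
    def threshold(xs):
        s = sorted(xs)
        return s[int(q * (len(s) - 1))]
    tr = threshold([abs(r) for r in returns])
    tv = threshold(volumes)
    tc = threshold(counts)
    return [4 * (abs(r) > tr) + 2 * (v > tv) + (c > tc)
            for r, v, c in zip(returns, volumes, counts)]
```

    Digit 7 then marks simultaneous extremes in all three series, which is the kind of system-wise synchrony the abstract refers to.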

  11. High Angular Momentum Halo Gas: A Feedback and Code-independent Prediction of LCDM

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kyle R. [Department of Mathematical Sciences, California Baptist University, 8432 Magnolia Ave., Riverside, CA 92504 (United States); Maller, Ariyeh H. [Department of Physics, New York City College of Technology, 300 Jay St., Brooklyn, NY 11201 (United States); Oñorbe, Jose [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Bullock, James S. [Center for Cosmology, Department of Physics and Astronomy, The University of California at Irvine, Irvine, CA 92697 (United States); Joung, M. Ryan [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Devriendt, Julien [Department of Physics, University of Oxford, The Denys Wilkinson Building, Keble Rd., Oxford OX1 3RH (United Kingdom); Ceverino, Daniel [Zentrum für Astronomie der Universität Heidelberg, Institut für Theoretische Astrophysik, Albert-Ueberle-Str. 2, D-69120 Heidelberg (Germany); Kereš, Dušan [Department of Physics, Center for Astrophysics and Space Sciences, University of California at San Diego, 9500 Gilman Dr., La Jolla, CA 92093 (United States); Hopkins, Philip F. [California Institute of Technology, 1200 E. California Blvd., Pasadena, CA 91125 (United States); Faucher-Giguère, Claude-André [Department of Physics and Astronomy and CIERA, Northwestern University, 2145 Sheridan Rd., Evanston, IL 60208 (United States)

    2017-07-01

    We investigate angular momentum acquisition in Milky Way-sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy’s halo. In all simulations, cold filamentary gas accretion to the halo results in ∼4 times more specific angular momentum in cold halo gas (λ_cold ≳ 0.1) than in the dark matter halo. At z > 1, this inflow takes the form of inspiraling cold streams that are co-directional in the halo of the galaxy and are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the qualitative agreement among disparate simulations, we conclude that the buildup of high angular momentum halo gas and the presence of these inspiraling cold streams are robust predictions of Lambda Cold Dark Matter galaxy formation, though the detailed morphology of these streams is significantly less certain. A growing body of observational evidence suggests that this process is borne out in the real universe.

  12. Low-Intensity Agricultural Landscapes in Transylvania Support High Butterfly Diversity: Implications for Conservation

    Science.gov (United States)

    Loos, Jacqueline; Dorresteijn, Ine; Hanspach, Jan; Fust, Pascal; Rakosy, László; Fischer, Joern

    2014-01-01

    European farmland biodiversity is declining due to land use changes towards agricultural intensification or abandonment. Some Eastern European farming systems have sustained traditional forms of use, resulting in high levels of biodiversity. However, global markets and international policies now imply rapid and major changes to these systems. To effectively protect farmland biodiversity, understanding landscape features which underpin species diversity is crucial. Focusing on butterflies, we addressed this question for a cultural-historic landscape in Southern Transylvania, Romania. Following a natural experiment, we randomly selected 120 survey sites in farmland, 60 each in grassland and arable land. We surveyed butterfly species richness and abundance by walking transects with four repeats in summer 2012. We analysed species composition using Detrended Correspondence Analysis. We modelled species richness, richness of functional groups, and abundance of selected species in response to topography, woody vegetation cover and heterogeneity at three spatial scales, using generalised linear mixed effects models. Species composition widely overlapped in grassland and arable land. Composition changed along gradients of heterogeneity at local and context scales, and of woody vegetation cover at context and landscape scales. The effect of local heterogeneity on species richness was positive in arable land, but negative in grassland. Plant species richness, and structural and topographic conditions at multiple scales explained species richness, richness of functional groups and species abundances. Our study revealed high conservation value of both grassland and arable land in low-intensity Eastern European farmland. Besides grassland, also heterogeneous arable land provides important habitat for butterflies. While butterfly diversity in arable land benefits from heterogeneity by small-scale structures, grasslands should be protected from fragmentation to provide

  13. A high capacity text steganography scheme based on LZW compression and color coding

    Directory of Open Access Journals (Sweden)

    Aruna Malik

    2017-02-01

    Full Text Available In this paper, capacity and security issues of text steganography have been considered by employing LZW compression technique and color coding based approach. The proposed technique uses the forward mail platform to hide the secret data. This algorithm first compresses secret data and then hides the compressed secret data into the email addresses and also in the cover message of the email. The secret data bits are embedded in the message (or cover text) by making it colored using a color coding table. Experimental results show that the proposed method not only produces a high embedding capacity but also reduces computational complexity. Moreover, the security of the proposed method is significantly improved by employing stego keys. The superiority of the proposed method has been experimentally verified by comparing with recently developed existing techniques.
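    The first stage of the scheme, LZW compression of the secret data, is a standard algorithm and can be sketched as follows; the subsequent color-coding table and email-embedding steps are specific to the paper and are not reproduced here.

```python
def lzw_compress(text):
    """Classic LZW: grow a dictionary of seen substrings, emit integer codes."""
    dictionary = {chr(i): i for i in range(256)}
    w, out = "", []
    for ch in text:
        wc = w + ch
        if wc in dictionary:
            w = wc                              # extend the current match
        else:
            out.append(dictionary[w])           # emit code for longest match
            dictionary[wc] = len(dictionary)    # learn the new substring
            w = ch
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse of lzw_compress; rebuilds the dictionary on the fly."""
    dictionary = {i: chr(i) for i in range(256)}
    w = chr(codes[0])
    out = [w]
    for k in codes[1:]:
        entry = dictionary[k] if k in dictionary else w + w[0]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[0]
        w = entry
    return "".join(out)

secret = "TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_compress(secret)      # fewer codes than input characters
```

    Shrinking the secret before embedding is what raises the effective capacity: fewer bits need to be hidden in the colored cover text.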

  14. Stitching Codeable Circuits: High School Students' Learning About Circuitry and Coding with Electronic Textiles

    Science.gov (United States)

    Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.

    2017-10-01

    Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating programming sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high school students ages 16-17 years who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability in designing and remixing program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.

  15. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
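    The index arithmetic such a compiler performs for a simple BLOCK distribution can be sketched as follows (a heavily simplified illustration of the affine framework, with invented helper names; HPF also supports CYCLIC and BLOCK(k) mappings not shown here):

```python
import math

def block_owner(i, n, p):
    """Global index i of an n-element array, BLOCK-distributed over p
    processors -> (owning processor, local index in that processor's block)."""
    b = math.ceil(n / p)              # block size
    return i // b, i % b

def local_bounds(proc, n, p):
    """The loop bounds processor 'proc' executes for 'forall i in [0, n)':
    only the iterations that touch its own block (owner-computes rule)."""
    b = math.ceil(n / p)
    lo = proc * b
    return lo, min(lo + b, n)
```

    Expressing both maps as affine functions of the global index is what lets the compiler derive tight local loop bounds and communication sets symbolically.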

  16. Game-Theoretic Rate-Distortion-Complexity Optimization of High Efficiency Video Coding

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Milani, Simone; Forchhammer, Søren

    2013-01-01

    This paper presents an algorithm for rate-distortion-complexity optimization for the emerging High Efficiency Video Coding (HEVC) standard, whose high computational requirements urge the need for low-complexity optimization algorithms. Optimization approaches need to specify different complexity profiles in order to tailor the computational load to the different hardware and power-supply resources of devices. In this work, we focus on optimizing the quantization parameter and partition depth in HEVC via a game-theoretic approach. The proposed rate control strategy alone provides 0.2 dB improvement......

  17. Recent developments of the TRANSURANUS code with emphasis on high burnup phenomena

    International Nuclear Information System (INIS)

    Lassmann, K.; Schubert, A.; Laar, J. van de; Vennix, C.W.H.M.

    2001-01-01

    TRANSURANUS is a computer program for the thermal and mechanical analysis of fuel rods in nuclear reactors, which is developed at the Institute for Transuranium Elements. The code is in use in several European organisations, both in research and industry. In the paper the recent developments are summarised: the burnup degradation of the fuel thermal conductivity as well as the effects of gadolinium on the radial power distribution and thermal conductivity. Fission gas release from the High Burnup Structure is discussed. Finally, a new numerical method is outlined that is able to treat the highly non-linear mechanical equations in transients (RIAs and LOCAs). (author)

  18. High-order conservative discretizations for some cases of the rigid body motion

    International Nuclear Information System (INIS)

    Kozlov, Roman

    2008-01-01

    Modified vector fields can be used to construct high-order structure-preserving numerical integrators for ordinary differential equations. In the present Letter we consider high-order integrators based on the implicit midpoint rule, which conserve quadratic first integrals. It is shown that these integrators are particularly suitable for the rigid body motion with an additional quadratic first integral. In this case high-order integrators preserve all four first integrals of motion. The approach is illustrated on the Lagrange top (a rotationally symmetric rigid body with a fixed point on the symmetry axis). The equations of motion are considered in the space fixed frame because in this frame Lagrange top admits a neat description. The Lagrange top motion includes the spherical pendulum and the planar pendulum, which swings in a vertical plane, as particular cases
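    The conservation property is easiest to see on a linear stand-in for the rigid-body equations (this is not the Lagrange top itself, just an illustration): for x' = v, v' = -x the quadratic first integral x² + v² is preserved exactly by the implicit midpoint rule, whose implicit equations are linear here and therefore solvable in closed form.

```python
def midpoint_step(x, v, h):
    """One implicit-midpoint step for x' = v, v' = -x. Because the system is
    linear, the implicit update solves in closed form:
    y1 = (I - h/2 A)^(-1) (I + h/2 A) y0  with  A = [[0, 1], [-1, 0]]."""
    d = 1.0 + (h / 2.0) ** 2
    x1 = ((1.0 - (h / 2.0) ** 2) * x + h * v) / d
    v1 = (-h * x + (1.0 - (h / 2.0) ** 2) * v) / d
    return x1, v1

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = midpoint_step(x, v, 0.1)
# The quadratic first integral x^2 + v^2 stays at 1 to machine precision,
# even though each step has only second-order accuracy in the trajectory.
```

    The high-order integrators of the Letter compose such midpoint steps via modified vector fields, inheriting the exact conservation of quadratic invariants.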

  19. Accuracy of Administrative Codes for Distinguishing Positive Pressure Ventilation from High-Flow Nasal Cannula.

    Science.gov (United States)

    Good, Ryan J; Leroue, Matthew K; Czaja, Angela S

    2018-06-07

    Noninvasive positive pressure ventilation (NIPPV) is increasingly used in critically ill pediatric patients, despite limited data on safety and efficacy. Administrative data may be a good resource for observational studies. Therefore, we sought to assess the performance of the International Classification of Diseases, Ninth Revision procedure code for NIPPV. Patients admitted to the PICU requiring NIPPV or heated high-flow nasal cannula (HHFNC) over the 11-month study period were identified from the Virtual PICU System database. The gold standard was manual review of the electronic health record to verify the use of NIPPV or HHFNC among the cohort. The presence or absence of a NIPPV procedure code was determined by using administrative data. Test characteristics with 95% confidence intervals (CIs) were generated, comparing administrative data with the gold standard. Among the cohort (n = 562), the majority were younger than 5 years, and the most common primary diagnosis was bronchiolitis. Most (82%) required NIPPV, whereas 18% required only HHFNC. The NIPPV code had a sensitivity of 91.1% (95% CI: 88.2%-93.6%) and a specificity of 57.6% (95% CI: 47.2%-67.5%), with a positive likelihood ratio of 2.15 (95% CI: 1.70-2.71) and negative likelihood ratio of 0.15 (95% CI: 0.11-0.22). Among our critically ill pediatric cohort, NIPPV procedure codes had high sensitivity but only moderate specificity. On the basis of our study results, there is a risk of misclassification, specifically failure to identify children who require NIPPV, when using administrative data to study the use of NIPPV in this population. Copyright © 2018 by the American Academy of Pediatrics.
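    The reported likelihood ratios follow directly from the sensitivity and specificity. A minimal sketch (the function names are invented; only the rates, not the underlying 2x2 counts, are given in the abstract):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table of code vs. gold standard."""
    return tp / (tp + fn), tn / (tn + fp)

def likelihood_ratios(sens, spec):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sens / (1.0 - spec), (1.0 - sens) / spec

# Rates reported in the abstract; the raw counts are not published there.
lr_pos, lr_neg = likelihood_ratios(0.911, 0.576)
print(round(lr_pos, 2), round(lr_neg, 2))  # 2.15 0.15
```

    The moderate specificity is what drags LR+ down to about 2: a positive code only doubles the odds that the child truly received NIPPV.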

  20. Traveling waves and conservation laws for highly nonlinear wave equations modeling Hertz chains

    Science.gov (United States)

    Przedborski, Michelle; Anco, Stephen C.

    2017-09-01

    A highly nonlinear, fourth-order wave equation that models the continuum theory of long wavelength pulses in weakly compressed, homogeneous, discrete chains with a general power-law contact interaction is studied. For this wave equation, all solitary wave solutions and all nonlinear periodic wave solutions, along with all conservation laws, are derived. The solutions are explicitly parameterized in terms of the asymptotic value of the wave amplitude in the case of solitary waves and the peak of the wave amplitude in the case of nonlinear periodic waves. All cases in which the solution expressions can be stated in an explicit analytic form using elementary functions are worked out. In these cases, explicit expressions for the total energy and total momentum for all solutions are obtained as well. The derivation of the solutions uses the conservation laws combined with an energy analysis argument to reduce the wave equation directly to a separable first-order differential equation that determines the wave amplitude in terms of the traveling wave variable. This method can be applied more generally to other highly nonlinear wave equations.

  1. Development of NONSTA code for the design and analysis of LMR high temperature structure

    International Nuclear Information System (INIS)

    Kim, Jong Bum; Lee, H. Y.; Yoo, B.

    1999-02-01

    Liquid metal reactors (LMRs) operate at high temperature (500-550 °C), and structural materials undergo complex deformation behavior, such as diffusion, dislocation glide, and dislocation climb, in this high-temperature environment. Material life is also rapidly reduced by the interaction between cavities created inside structural materials and high-temperature fatigue cracks. Thus the establishment of high-temperature structural analysis techniques is necessary for the reliability and safety evaluation of such structures. The objectives of this study are to develop the NONSTA code as a subprogram of the ABAQUS code, adopting constitutive equations that can precisely predict high-temperature material behavior, and to build systematic analysis procedures. The developed program was applied to example problems such as a tensile analysis using an exponential creep model and a repetitive tensile-compression analysis using the Chaboche unified viscoplastic model. In addition, the problem of a plate with a center hole subjected to tensile load was solved to show the applicability of the program to multiaxial problems, and the time-dependent stress redistribution was observed. (Author). 40 refs., 2 tabs., 24 figs

  2. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2009-01-01

    A tritium permeation analyses code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. A MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered the main tritium transport path. In addition, electrolyzer and isotope-exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility in system configuration, and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behaviors in a very high temperature reactor/high-temperature steam electrolysis system have been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of tritium released from the core is transferred to the product hydrogen
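    The mass-balance structure underlying such a code can be illustrated with a toy one-compartment model (an assumption-laden sketch, not the TPAC equations; the rate constants and source strength below are invented):

```python
def tritium_inventory(source, purification_rate, leak_rate, perm_rate,
                      n0=0.0, dt=3600.0, steps=24):
    """Toy one-compartment tritium balance:
        dN/dt = S - (k_purif + k_leak + k_perm) * N
    integrated with forward Euler (stable while dt * k < 1).
    The steady-state inventory is S / k."""
    n = n0
    k = purification_rate + leak_rate + perm_rate
    for _ in range(steps):
        n += dt * (source - k * n)
    return n

# Invented rates (1/s) and source (Bq/s): 24 hours of buildup from zero
# toward the steady state S / k.
n_day = tritium_inventory(source=1.0, purification_rate=5e-6,
                          leak_rate=1e-6, perm_rate=4e-6)
```

    TPAC couples many such balances, one per species and compartment, with permeation terms linking compartments, which is what the SIMULINK block diagram expresses graphically.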

  3. A hardware-oriented concurrent TZ search algorithm for High-Efficiency Video Coding

    Science.gov (United States)

    Doan, Nghia; Kim, Tae Sung; Rhee, Chae Eun; Lee, Hyuk-Jae

    2017-12-01

    High-Efficiency Video Coding (HEVC) is the latest video coding standard, in which the compression performance is double that of its predecessor, the H.264/AVC standard, while the video quality remains unchanged. In HEVC, the test zone (TZ) search algorithm is widely used for integer motion estimation because it effectively searches the good-quality motion vector with a relatively small amount of computation. However, the complex computation structure of the TZ search algorithm makes it difficult to implement it in the hardware. This paper proposes a new integer motion estimation algorithm which is designed for hardware execution by modifying the conventional TZ search to allow parallel motion estimations of all prediction unit (PU) partitions. The algorithm consists of the three phases of zonal, raster, and refinement searches. At the beginning of each phase, the algorithm obtains the search points required by the original TZ search for all PU partitions in a coding unit (CU). Then, all redundant search points are removed prior to the estimation of the motion costs, and the best search points are then selected for all PUs. Compared to the conventional TZ search algorithm, experimental results show that the proposed algorithm significantly decreases the Bjøntegaard Delta bitrate (BD-BR) by 0.84%, and it also reduces the computational complexity by 54.54%.
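    The redundancy-removal step, gathering the search points requested by all PU partitions of a CU and evaluating each unique point only once, can be sketched as follows (the PU labels and motion-vector candidates are invented for illustration):

```python
def gather_unique_points(pu_search_points):
    """Merge the search points requested by every PU partition of a CU,
    dropping duplicates while remembering which PUs requested each point,
    so each point's matching cost is computed only once."""
    wanted = {}
    for pu, points in pu_search_points.items():
        for p in points:
            wanted.setdefault(p, []).append(pu)
    return wanted

# Invented example: three PU partitions sharing several candidate points.
pus = {"2Nx2N": [(0, 0), (1, 0), (0, 1)],
       "Nx2N":  [(0, 0), (2, 0)],
       "2NxN":  [(1, 0), (0, 1)]}
unique = gather_unique_points(pus)   # 4 unique points instead of 7 requests
```

    Evaluating the deduplicated set in parallel at each phase (zonal, raster, refinement) is what enables the reported reduction in computational complexity.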

  4. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)

  5. A Benchmarking Study of High Energy Carbon Ion Induced Neutron Using Several Monte Carlo Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Oh, J. H.; Jung, N. S.; Lee, H. S. [Pohang Accelerator Laboratory, Pohang (Korea, Republic of); Shin, Y. S.; Kwon, D. Y.; Kim, Y. M. [Catholic Univ., Gyeongsan (Korea, Republic of); Oranj, L. Mokhtari [POSTECH, Pohang (Korea, Republic of)

    2014-10-15

    In this study, a benchmarking exercise was carried out for representative particle interactions at a heavy-ion accelerator, especially carbon-induced reactions. Secondary neutrons are important particles in shielding analysis for defining the source term and the penetrating ability of radiation fields. The performance of selected Monte Carlo codes was verified: MCNPX 2.7, PHITS 2.64, and FLUKA 2011.2b.6. For this benchmarking study, the experimental data of Kurosawa et al. in the SINBAD database of the NEA were used. The calculated differential neutron yields from several materials irradiated by high-energy carbon beams reproduced the experimental data well, with small uncertainty. However, the MCNPX results showed a large discrepancy with the experimental data, especially at forward angles. The calculated results were slightly lower than the experimental values, and this was clearest for lower incident carbon energies, thinner targets, and forward angles. As expected, the influence of the different physics models appeared most clearly in the forward direction. In shielding analysis, these characteristics of each Monte Carlo code should be considered and used when determining the safety margin of a shield thickness.
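The comparison underlying such a benchmark is typically expressed as calculated-to-experimental (C/E) ratios per emission angle. A minimal sketch follows; the yield values and tolerance are invented for illustration, not the study's data:

```python
# Hypothetical C/E comparison: compute calculated/experimental ratios for
# angle-differential neutron yields and flag angles where a code deviates
# beyond a tolerance (e.g., underprediction at forward angles).

def ce_ratios(calculated, experimental):
    return {angle: calculated[angle] / experimental[angle] for angle in experimental}

def flag_discrepancies(ratios, tol=0.3):
    """Return angles (degrees) where |C/E - 1| exceeds tol."""
    return sorted(a for a, r in ratios.items() if abs(r - 1.0) > tol)

experimental = {0: 1.00, 15: 0.80, 30: 0.50, 60: 0.20}   # arbitrary units
mcnpx_like   = {0: 0.55, 15: 0.52, 30: 0.44, 60: 0.19}   # low at forward angles

flags = flag_discrepancies(ce_ratios(mcnpx_like, experimental))
```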

  6. A Benchmarking Study of High Energy Carbon Ion Induced Neutron Using Several Monte Carlo Codes

    International Nuclear Information System (INIS)

    Kim, D. H.; Oh, J. H.; Jung, N. S.; Lee, H. S.; Shin, Y. S.; Kwon, D. Y.; Kim, Y. M.; Oranj, L. Mokhtari

    2014-01-01

    In this study, a benchmarking exercise was carried out for representative particle interactions at a heavy-ion accelerator, especially carbon-induced reactions. Secondary neutrons are important particles in shielding analysis for defining the source term and the penetrating ability of radiation fields. The performance of selected Monte Carlo codes was verified: MCNPX 2.7, PHITS 2.64, and FLUKA 2011.2b.6. For this benchmarking study, the experimental data of Kurosawa et al. in the SINBAD database of the NEA were used. The calculated differential neutron yields from several materials irradiated by high-energy carbon beams reproduced the experimental data well, with small uncertainty. However, the MCNPX results showed a large discrepancy with the experimental data, especially at forward angles. The calculated results were slightly lower than the experimental values, and this was clearest for lower incident carbon energies, thinner targets, and forward angles. As expected, the influence of the different physics models appeared most clearly in the forward direction. In shielding analysis, these characteristics of each Monte Carlo code should be considered and used when determining the safety margin of a shield thickness.

  7. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.
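The validation step described above can be illustrated in miniature: steady pressure-driven laminar channel flow has the analytic Poiseuille profile, so a finite-difference solution can be checked against it. This sketch is not the TPLS code (which is Fortran 90 with MPI); it is a serial toy with assumed unit viscosity, channel height, and pressure gradient:

```python
# Steady laminar channel flow: solve mu * u'' = dp/dx on y in [0, h]
# with no-slip walls, using Jacobi iteration on a uniform grid, then
# compare against the analytic Poiseuille profile u(y) = -dpdx/(2 mu) * y * (h - y).

def poiseuille_fd(n=41, dpdx=-1.0, mu=1.0, h=1.0, iters=5000):
    dy = h / (n - 1)
    u = [0.0] * n                      # no-slip: u[0] = u[-1] = 0
    rhs = dpdx / mu
    for _ in range(iters):
        new = u[:]
        for i in range(1, n - 1):
            new[i] = 0.5 * (u[i - 1] + u[i + 1] - rhs * dy * dy)
        u = new
    return u

u = poiseuille_fd()
u_mid = u[20]                          # mid-channel; analytic maximum is 0.125
```

Because the exact solution is quadratic, the second-order stencil is exact at the nodes and the only error left is the iteration residual.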

  8. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
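The row-by-row translation idea can be sketched as follows. This is a hypothetical illustration, not the LCS tool: the spec rows, tag names, and keywords are invented, and for readability the sketch emits IEC 61131-3 Structured Text rather than graphical ladder logic:

```python
# Hypothetical spec-to-PLC code generation: each tabular-spec row
# (step, condition, action) becomes one IF/THEN block of Structured Text.
# All tags and helper names below are invented for illustration.

SPEC = [
    {"step": 10, "when": "LOX_TANK_PRESS > 50.0", "do": "OPEN_VALVE('VLV_201')"},
    {"step": 20, "when": "GSE_POWER_ON",          "do": "START_PUMP('PMP_7')"},
]

def generate_structured_text(spec_rows):
    lines = []
    for row in sorted(spec_rows, key=lambda r: r["step"]):
        lines.append(f"(* step {row['step']} *)")
        lines.append(f"IF {row['when']} THEN")
        lines.append(f"    {row['do']};")
        lines.append("END_IF;")
    return "\n".join(lines)

code = generate_structured_text(SPEC)
```

The point of such a generator is exactly the benefit the record lists: the tabular spec stays readable to ground and flight system users, while the emitted PLC code is consistent by construction.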

  9. Transcoding method from H.264/AVC to high efficiency video coding based on similarity of intraprediction, interprediction, and motion vector

    Science.gov (United States)

    Liu, Mei-Feng; Zhong, Guo-Yun; He, Xiao-Hai; Qing, Lin-Bo

    2016-09-01

    Currently, most video resources online are encoded in the H.264/AVC format. More fluent video transmission can be obtained if these resources are encoded in the newest international video coding standard: high efficiency video coding (HEVC). In order to improve video transmission and storage online, a transcoding method from H.264/AVC to HEVC is proposed. In this transcoding algorithm, the coding information of intraprediction, interprediction, and motion vectors (MVs) in the H.264/AVC video stream is used to accelerate the coding in HEVC. It is found through experiments that the interpredicted regions in HEVC overlap those in H.264/AVC. Therefore, intraprediction can be skipped in HEVC for regions that are interpredicted in H.264/AVC, reducing coding complexity. Several macroblocks in H.264/AVC are combined into one prediction unit (PU) in HEVC when the MV difference between two of the macroblocks is lower than a threshold. This method selects only one coding unit (CU) depth and one PU mode, which further reduces the coding complexity. An MV interpolation method for the combined PU in HEVC is proposed, based on the areas and the distances between the centers of the macroblocks in H.264/AVC and that of the PU in HEVC. The predicted MV accelerates motion estimation in HEVC coding. Simulation results show that the proposed algorithm achieves a significant coding-time reduction with only a small rate-distortion loss, compared to existing transcoding algorithms and normal HEVC coding.
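The area-and-distance-weighted MV interpolation can be sketched as follows. This is a plausible reading of the abstract, not the paper's exact formula: the weighting function and the sample inputs are assumptions for illustration:

```python
# Hedged sketch: when several H.264/AVC macroblocks merge into one HEVC PU,
# derive the PU's predicted MV as a weighted average of the macroblock MVs,
# weighting each by its area and by inverse distance to the PU center.
# The exact weighting in the paper may differ; this form is illustrative.

def interpolate_mv(macroblocks, pu_center):
    """macroblocks: list of dicts with 'mv' (mvx, mvy), 'area', 'center'."""
    wx = wy = wsum = 0.0
    for mb in macroblocks:
        dx = mb["center"][0] - pu_center[0]
        dy = mb["center"][1] - pu_center[1]
        dist = (dx * dx + dy * dy) ** 0.5
        w = mb["area"] / (1.0 + dist)      # closer and larger blocks weigh more
        wx += w * mb["mv"][0]
        wy += w * mb["mv"][1]
        wsum += w
    return (wx / wsum, wy / wsum)

# Two equal 16x16 macroblocks symmetric about the PU center: the result is
# simply the average of their MVs.
mv = interpolate_mv(
    [{"mv": (4, 0), "area": 256, "center": (8, 8)},
     {"mv": (8, 0), "area": 256, "center": (24, 8)}],
    pu_center=(16, 8),
)
```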

  10. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
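The FLOPS metric being questioned is simply a measured operation count divided by elapsed time. A toy illustration (problem size and operation count are arbitrary; a real benchmark like LINPACK is far more careful):

```python
# Toy FLOPS measurement: time a dense matrix-vector product whose
# floating-point operation count is known (2*n*n multiply-adds per matvec),
# then divide operations by elapsed wall-clock time.
import time

def matvec_flops(n=200, reps=10):
    a = [[1.0] * n for _ in range(n)]
    x = [1.0] * n
    t0 = time.perf_counter()
    for _ in range(reps):
        y = [sum(row[j] * x[j] for j in range(n)) for row in a]
    elapsed = time.perf_counter() - t0
    flop_count = reps * 2 * n * n
    return flop_count / elapsed, y

rate, y = matvec_flops()
```

As the dissertation argues, a single number like `rate` says nothing about memory, disk, or network behavior, which is exactly why additional benchmark parameters are proposed.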

  11. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01

    Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

  12. Determining the drivers of population structure in a highly urbanized landscape to inform conservation planning.

    Science.gov (United States)

    Thomassen, Henri A; Harrigan, Ryan J; Semple Delaney, Kathleen; Riley, Seth P D; Serieys, Laurel E K; Pease, Katherine; Wayne, Robert K; Smith, Thomas B

    2018-02-01

    Understanding the environmental contributors to population structure is of paramount importance for conservation in urbanized environments. We used spatially explicit models to determine genetic population structure under current and future environmental conditions across a highly fragmented, human-dominated environment in Southern California to assess the effects of natural ecological variation and urbanization. We focused on 7 common species with diverse habitat requirements, home-range sizes, and dispersal abilities. We quantified the relative roles of potential barriers, including natural environmental characteristics and an anthropogenic barrier created by a major highway, in shaping genetic variation. The ability to predict genetic variation in our models differed among species: 11-81% of intraspecific genetic variation was explained by environmental variables. Although an anthropogenically induced barrier (a major highway) severely restricted gene flow and movement at broad scales for some species, genetic variation seemed to be primarily driven by natural environmental heterogeneity at a local level. Our results show how assessing environmentally associated variation for multiple species under current and future climate conditions can help identify priority regions for maximizing population persistence under environmental change in urbanized regions. © 2017 Society for Conservation Biology.

  13. A highly conserved tyrosine of Tim-3 is phosphorylated upon stimulation by its ligand galectin-9

    International Nuclear Information System (INIS)

    Weyer, Philipp S. van de; Muehlfeit, Michael; Klose, Christoph; Bonventre, Joseph V.; Walz, Gerd; Kuehn, E. Wolfgang

    2006-01-01

    Tim-3 is a member of the TIM family of proteins (T-cell immunoglobulin mucin) involved in the regulation of CD4+ T-cells. Tim-3 is a Th1-specific type 1 membrane protein and regulates Th1 proliferation and the development of tolerance. Binding of galectin-9 to the extracellular domain of Tim-3 results in apoptosis of Th1 cells, but the intracellular pathways involved in the regulatory function of Tim-3 are unknown. Unlike Tim-1, which is expressed in renal epithelia and cancer, Tim-3 has not been described in cells other than neuronal or T-cells. Using RT-PCR we demonstrate that Tim-3 is expressed in malignant and non-malignant epithelial tissues. We have cloned Tim-3 from an immortalized liver cell carcinoma line and identified a highly conserved tyrosine in the intracellular tail of Tim-3 (Y265). We demonstrate that Y265 is specifically phosphorylated in vivo by the interleukin-inducible T cell kinase (ITK), a kinase located in close proximity to the TIM genes on the allergy susceptibility locus 5q33.3. Stimulation of Tim-3 by its ligand galectin-9 results in increased phosphorylation of Y265, suggesting that this tyrosine residue plays an important role in downstream signalling events regulating T-cell fate. Given the role of TIM proteins in autoimmunity and cancer, the conserved SH2 binding domain surrounding Y265 could represent a possible target site for pharmacological intervention.

  14. Shielding analysis of high level waste water storage facilities using MCNP code

    Energy Technology Data Exchange (ETDEWEB)

    Yabuta, Naohiro [Mitsubishi Research Inst., Inc., Tokyo (Japan)

    2001-01-01

    Neutron and gamma-ray transport analyses were performed for a reprocessing facility with large buildings having thick shielding. The radiation shielding analysis consists of a deep-transmission calculation for the concrete walls and a skyshine calculation for the space outside the buildings. An efficient analysis with a short running time and high accuracy needs a variance reduction technique suitable for all the calculation regions and structures. In this report, the shielding analysis using MCNP and a discrete-ordinates transport code is explained, and the idea and procedure for determining the variance reduction parameters are described. (J.P.N.)

  15. An assessment of high carbon stock and high conservation value approaches to sustainable oil palm cultivation in Gabon

    Science.gov (United States)

    Austin, Kemen G.; Lee, Michelle E.; Clark, Connie; Forester, Brenna R.; Urban, Dean L.; White, Lee; Kasibhatla, Prasad S.; Poulsen, John R.

    2017-01-01

    Industrial-scale oil palm cultivation is rapidly expanding in Gabon, where it has the potential to drive economic growth, but also threatens forest, biodiversity and carbon resources. The Gabonese government is promoting an ambitious agricultural expansion strategy, while simultaneously committing to minimize negative environmental impacts of oil palm agriculture. This study estimates the extent and location of suitable land for oil palm cultivation in Gabon, based on an analysis of recent trends in plantation permitting. We use the resulting suitability map to evaluate two proposed approaches to minimizing negative environmental impacts: a High Carbon Stock (HCS) approach, which emphasizes forest protection and climate change mitigation, and a High Conservation Value (HCV) approach, which focuses on safeguarding biodiversity and ecosystems. We quantify the forest area, carbon stock, and biodiversity resources protected under each approach, using newly developed maps of priority species distributions and forest biomass for Gabon. We find 2.7-3.9 million hectares (Mha) of suitable or moderately suitable land that avoid HCS areas, 4.4 Mha that avoid HCV areas, and 1.2-1.7 Mha that avoid both. This suggests that Gabon’s oil palm production target could likely be met without compromising important ecosystem services, if appropriate safeguards are put in place. Our analysis improves understanding of suitability for oil palm in Gabon, determines how conservation strategies align with national targets for oil palm production, and informs national land use planning.

  16. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
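The three parameters named above follow from simple arithmetic. A sketch with entirely hypothetical numbers (motor rating, efficiencies, tariff, and price premium are not from the Pakistani dataset):

```python
# Illustrative computation of annual energy saving, annual cost saving, and
# simple payback period for a high-efficiency motor replacing a standard one.
# All input values below are invented for demonstration.

def motor_savings(kw_rating, hours_per_year, eff_std, eff_high,
                  tariff_per_kwh, price_premium):
    """Return (annual kWh saved, annual cost saved, simple payback in years)."""
    input_std = kw_rating / eff_std        # electrical input power, kW
    input_high = kw_rating / eff_high
    kwh_saved = (input_std - input_high) * hours_per_year
    cost_saved = kwh_saved * tariff_per_kwh
    payback_years = price_premium / cost_saved
    return kwh_saved, cost_saved, payback_years

kwh, cost, payback = motor_savings(
    kw_rating=15.0, hours_per_year=6000,
    eff_std=0.88, eff_high=0.93,
    tariff_per_kwh=0.12, price_premium=400.0,
)
```

Confidence bounds, as in the cited paper, would then be constructed around these point estimates across the sampled motor population.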

  17. Comparative transcriptome analysis within the Lolium/Festuca species complex reveals high sequence conservation

    DEFF Research Database (Denmark)

    Czaban, Adrian; Sharma, Sapna; Byrne, Stephen

    2015-01-01

    species from the Lolium-Festuca complex, ranging from 52,166 to 72,133 transcripts per assembly. We have also predicted a set of proteins and validated it with a high-confidence protein database from three closely related species (H. vulgare, B. distachyon and O. sativa). We have obtained gene family...... clusters for the four species using OrthoMCL and analyzed their inferred phylogenetic relationships. Our results indicate that VRN2 is a candidate gene for differentiating vernalization and non-vernalization types in the Lolium-Festuca complex. Grouping of the gene families based on their BLAST identity...... enabled us to divide ortholog groups into those that are very conserved and those that are more evolutionarily relaxed. The ratio of the non-synonymous to synonymous substitutions enabled us to pinpoint protein sequences evolving in response to positive selection. These proteins may explain some...

  18. Cortical cytasters: a highly conserved developmental trait of Bilateria with similarities to Ctenophora

    Directory of Open Access Journals (Sweden)

    Salinas-Saavedra Miguel

    2011-12-01

    Full Text Available Abstract Background Cytasters (cytoplasmic asters) are centriole-based nucleation centers of microtubule polymerization that are observable in large numbers in the cortical cytoplasm of the egg and zygote of bilaterian organisms. In both protostome and deuterostome taxa, cytasters have been described to develop during oogenesis from vesicles of nuclear membrane that move to the cortical cytoplasm. They become associated with several cytoplasmic components, and participate in the reorganization of cortical cytoplasm after fertilization, patterning the antero-posterior and dorso-ventral body axes. Presentation of the hypothesis The specific resemblances in the development of cytasters in both protostome and deuterostome taxa suggest that an independent evolutionary origin is unlikely. An assessment of published data confirms that cytasters are present in several protostome and deuterostome phyla, but are absent in the non-bilaterian phyla Cnidaria and Ctenophora. We hypothesize that cytasters evolved in the lineage leading to Bilateria and were already present in the most recent common ancestor shared by protostomes and deuterostomes. Thus, cytasters would be an ancient and highly conserved trait that is homologous across the different bilaterian phyla. The alternative possibility is homoplasy, that is, cytasters have evolved independently in different lineages of Bilateria. Testing the hypothesis So far, available published information shows that appropriate observations have been made in eight different bilaterian phyla. All of them present cytasters. This is consistent with the hypothesis of homology and conservation. However, there are several important groups for which there are no currently available data. The hypothesis of homology predicts that cytasters should be present in these groups. 
Increasing the taxonomic sample using modern techniques uniformly will test for evolutionary patterns supporting homology, homoplasy, or secondary loss of

  19. Sequence of cDNAs for mammalian H2A.Z, an evolutionarily diverged but highly conserved basal histone H2A isoprotein species

    Energy Technology Data Exchange (ETDEWEB)

    Hatch, C L; Bonner, W M

    1988-02-11

    The nucleotide sequences of cDNAs for the evolutionarily diverged but highly conserved basal H2A isoprotein, H2A.Z, have been determined for the rat, cow, and human. As a basal histone, H2A.Z is synthesized throughout the cell cycle at a constant rate, unlinked to DNA replication, and at a much lower rate in quiescent cells. Each of the cDNA isolates encodes the entire H2A.Z polypeptide. The human isolate is about 1.0 kilobases long. It contains a coding region of 387 nucleotides flanked by 106 nucleotides of 5'UTR and 376 nucleotides of 3'UTR, which contains a polyadenylation signal followed by a poly A tail. The bovine and rat cDNAs have 97 and 94% nucleotide positional identity to the human cDNA in the coding region and 98% in the proximal 376 nucleotides of the 3'UTR which includes the polyadenylation signal. A potential stem-forming sequence imbedded in a direct repeat is found centered at 261 nucleotides into the 3'UTR. Each of the cDNA clones could be transcribed and translated in vitro to yield H2A.Z protein. The mammalian H2A.Z cDNA coding sequences are approximately 80% similar to those in chicken and 75% to those in sea urchin.

  20. A highly conserved amino acid in VP1 regulates maturation of enterovirus 71.

    Directory of Open Access Journals (Sweden)

    Yong-Xin Zhang

    2017-09-01

    Full Text Available Enterovirus 71 (EV71) is the major causative agent of hand, foot and mouth disease (HFMD) in children, causing severe clinical outcomes and even death. Here, we report an important role of the highly conserved alanine residue at position 107 in the capsid protein VP1 (VP1A107) in the efficient replication of EV71. Substitutional mutations of VP1A107 significantly diminish viral growth kinetics without significant effect on viral entry, expression of viral genes and viral production. The results of mechanistic studies reveal that VP1A107 regulates the efficient cleavage of the VP0 precursor during EV71 assembly, which is required, in the next round of infection, for the transformation of the mature virion (160S) into an intermediate or A-particle (135S), a key step of virus uncoating. Furthermore, the results of molecular dynamic simulations and hydrogen-bond network analysis of VP1A107 suggest that flexibility of the VP1 BC loop or the region surrounding the VP1107 residue directly correlates with viral infectivity. It is possible that sufficient flexibility of the region surrounding the VP1107 residue favors the VP0 conformational change that is required for the efficient cleavage of VP0 as well as subsequent viral uncoating and viral replication. Taken together, our data reveal the structural role of the highly conserved VP1A107 in regulating EV71 maturation. Characterization of this novel determinant of EV71 virulence should promote the study of the pathogenesis of enteroviruses.

  1. CpG methylation differences between neurons and glia are highly conserved from mouse to human.

    Science.gov (United States)

    Kessler, Noah J; Van Baak, Timothy E; Baker, Maria S; Laritsky, Eleonora; Coarfa, Cristian; Waterland, Robert A

    2016-01-15

    Understanding epigenetic differences that distinguish neurons and glia is of fundamental importance to the nascent field of neuroepigenetics. A recent study used genome-wide bisulfite sequencing to survey differences in DNA methylation between these two cell types, in both humans and mice. That study minimized the importance of cell type-specific differences in CpG methylation, claiming these are restricted to localized genomic regions, and instead emphasized that widespread and highly conserved differences in non-CpG methylation distinguish neurons and glia. We reanalyzed the data from that study and came to markedly different conclusions. In particular, we found widespread cell type-specific differences in CpG methylation, with a genome-wide tendency for neuronal CpG-hypermethylation punctuated by regions of glia-specific hypermethylation. Alarmingly, our analysis indicated that the majority of genes identified by the primary study as exhibiting cell type-specific CpG methylation differences were misclassified. To verify the accuracy of our analysis, we isolated neuronal and glial DNA from mouse cortex and performed quantitative bisulfite pyrosequencing at nine loci. The pyrosequencing results corroborated our analysis, without exception. Most interestingly, we found that gene-associated neuron vs. glia CpG methylation differences are highly conserved across human and mouse, and are very likely to be functional. In addition to underscoring the importance of independent verification to confirm the conclusions of genome-wide epigenetic analyses, our data indicate that CpG methylation plays a major role in neuroepigenetics, and that the mouse is likely an excellent model in which to study the role of DNA methylation in human neurodevelopment and disease. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Evolutionary conservation of essential and highly expressed genes in Pseudomonas aeruginosa

    Directory of Open Access Journals (Sweden)

    Scharfe Maren

    2010-04-01

    Full Text Available Abstract Background The constant increase in development and spread of bacterial resistance to antibiotics poses a serious threat to human health. New sequencing technologies are now on the horizon that will yield massive increases in our capacity for DNA sequencing and will revolutionize the drug discovery process. Since essential genes are promising novel antibiotic targets, the prediction of gene essentiality based on genomic information has become a major focus. Results In this study we demonstrate that pooled sequencing is applicable for the analysis of sequence variations of strain collections with more than 10 individual isolates. Pooled sequencing of 36 clinical Pseudomonas aeruginosa isolates revealed that essential and highly expressed proteins evolve at lower rates, whereas extracellular proteins evolve at higher rates. We furthermore refined the list of experimentally essential P. aeruginosa genes, and identified 980 genes that show no sequence variation at all. Among the conserved nonessential genes we found several that are involved in regulation, motility and virulence, indicating that they represent factors of evolutionary importance for the lifestyle of a successful environmental bacterium and opportunistic pathogen. Conclusion The detailed analysis of a comprehensive set of P. aeruginosa genomes in this study clearly disclosed detailed information on the genomic makeup and revealed a large set of highly conserved genes that play an important role for the lifestyle of this microorganism. Sequencing strain collections enables a detailed and extensive identification of sequence variations as potential bacterial adaptation processes, e.g., during the development of antibiotic resistance in the clinical setting, and may thus be the basis for uncovering putative targets for novel treatment strategies.

  3. A guide to calculating habitat-quality metrics to inform conservation of highly mobile species

    Science.gov (United States)

    Bieri, Joanna A.; Sample, Christine; Thogmartin, Wayne E.; Diffendorfer, James E.; Earl, Julia E.; Erickson, Richard A.; Federico, Paula; Flockhart, D. T. Tyler; Nicol, Sam; Semmens, Darius J.; Skraber, T.; Wiederholt, Ruscena; Mattsson, Brady J.

    2018-01-01

    Many metrics exist for quantifying the relative value of habitats and pathways used by highly mobile species. Properly selecting and applying such metrics requires substantial background in mathematics and understanding the relevant management arena. To address this multidimensional challenge, we demonstrate and compare three measurements of habitat quality: graph-, occupancy-, and demographic-based metrics. Each metric provides insights into system dynamics, at the expense of increasing amounts and complexity of data and models. Our descriptions and comparisons of diverse habitat-quality metrics provide means for practitioners to overcome the modeling challenges associated with management or conservation of such highly mobile species. Whereas previous guidance for applying habitat-quality metrics has been scattered in diversified tracks of literature, we have brought this information together into an approachable format including accessible descriptions and a modeling case study for a typical example that conservation professionals can adapt for their own decision contexts and focal populations.
Considerations for Resource Managers: Management objectives, proposed actions, data availability and quality, and model assumptions are all relevant considerations when applying and interpreting habitat-quality metrics.
Graph-based metrics answer questions related to habitat centrality and connectivity, are suitable for populations with any movement pattern, quantify basic spatial and temporal patterns of occupancy and movement, and require the least data.
Occupancy-based metrics answer questions about likelihood of persistence or colonization, are suitable for populations that undergo localized extinctions, quantify spatial and temporal patterns of occupancy and movement, and require a moderate amount of data.
Demographic-based metrics answer questions about relative or absolute population size, are suitable for populations with any movement pattern, quantify demographic
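A minimal, hypothetical example of the least data-hungry family above, a graph-based metric: degree centrality over a network of habitat patches, where edges are observed movement pathways. The patch names and network are invented for illustration:

```python
# Degree centrality for a habitat-patch network: one of the simplest
# graph-based habitat-quality metrics. Edges represent movement pathways
# between patches; patch names below are hypothetical.

def degree_centrality(edges):
    """edges: list of (patch_a, patch_b) pairs; returns degree / (n_nodes - 1)."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    scale = 1.0 / (len(nodes) - 1)
    return {n: d * scale for n, d in deg.items()}

# Stopover network in which 'wetland_B' links every other patch,
# so it scores as the most central patch.
network = [("wetland_A", "wetland_B"), ("wetland_B", "wetland_C"),
           ("wetland_B", "wetland_D"), ("wetland_C", "wetland_D")]
centrality = degree_centrality(network)
```

Occupancy- and demographic-based metrics would layer extinction/colonization probabilities or vital rates on top of such a network, at the cost of more data, as the record notes.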

  4. High-dimensional free-space optical communications based on orbital angular momentum coding

    Science.gov (United States)

    Zou, Li; Gu, Xiaofan; Wang, Le

    2018-03-01

    In this paper, we propose a high-dimensional free-space optical communication scheme using orbital angular momentum (OAM) coding. In the scheme, the transmitter encodes N-bit information by using a spatial light modulator to convert a Gaussian beam into a superposition of N OAM modes and a Gaussian mode; the receiver decodes the information with an OAM mode analyser consisting of an MZ interferometer with a rotating Dove prism, a photoelectric detector, and a computer carrying out the fast Fourier transform. The scheme realizes high-dimensional free-space optical communication and decodes the information quickly and accurately. We have verified the feasibility of the scheme by exploiting 8 (4) OAM modes and a Gaussian mode to implement a 256-ary (16-ary) coded free-space optical communication link to transmit a 256-gray-scale (16-gray-scale) picture. The results show that zero bit error rate performance has been achieved.
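The mode-to-bit mapping underlying the 256-ary scheme can be sketched in a few lines of Python (an illustrative model only; the function names are hypothetical, and the optics, i.e., the spatial light modulator and the OAM mode analyser, are not modeled):

```python
# Sketch of the N-bit OAM coding idea: each of N bits selects whether the
# corresponding OAM mode is present in the transmitted superposition.

def encode_bits_to_modes(bits):
    """Map a bit string to the list of OAM mode indices that are 'on'."""
    return [i + 1 for i, b in enumerate(bits) if b == 1]

def decode_modes_to_bits(modes, n):
    """Recover the bit string from the detected OAM mode indices."""
    present = set(modes)
    return [1 if i + 1 in present else 0 for i in range(n)]

# With N = 8 modes the scheme is 256-ary: one symbol carries one byte.
byte = [1, 0, 1, 1, 0, 0, 1, 0]
modes = encode_bits_to_modes(byte)          # OAM modes present in the beam
assert decode_modes_to_bits(modes, 8) == byte
```

With N modes one transmitted symbol distinguishes 2^N values, which is why 8 modes suffice for a 256-gray-scale picture.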

  5. DCHAIN-SP 2001: High energy particle induced radioactivity calculation code

    Energy Technology Data Exchange (ETDEWEB)

    Kai, Tetsuya; Maekawa, Fujio; Kasugai, Yoshimi; Takada, Hiroshi; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kosako, Kazuaki [Sumitomo Atomic Energy Industries, Ltd., Tokyo (Japan)

    2001-03-01

    To contribute to safety design calculations for induced radioactivities in the JAERI/KEK high-intensity proton accelerator project facilities, DCHAIN-SP, which calculates high energy particle induced radioactivity, has been updated to DCHAIN-SP 2001. The following three items were improved: (1) Fission yield data are included so that the code can be applied to experimental facility design for nuclear transmutation of long-lived radioactive waste, where fissionable materials are treated. (2) Activation cross section data below 20 MeV are revised. In particular, attention is paid to cross section data of materials closely related to the facilities, i.e., mercury, lead and bismuth, and to tritium production cross sections, which are important for the safety of the facilities. (3) The user interface for input/output data has been refined to perform calculations more efficiently than in the previous version. Information needed for use of the code is attached in the Appendices: the DCHAIN-SP 2001 manual, the procedures for installation and execution of DCHAIN-SP, and sample problems. (author)

  6. An Effective Transform Unit Size Decision Method for High Efficiency Video Coding

    Directory of Open Access Journals (Sweden)

    Chou-Chen Wang

    2014-01-01

    Full Text Available High efficiency video coding (HEVC) is the latest video coding standard. HEVC can achieve higher compression performance than previous standards such as MPEG-4, H.263, and H.264/AVC. However, HEVC requires enormous computational complexity in the encoding process due to its quadtree structure. In order to reduce the computational burden of the HEVC encoder, an early transform unit (TU) decision algorithm (ETDA) is adopted to prune the residual quadtree (RQT) at an early stage based on the number of nonzero DCT coefficients (called NNZ-ETDA) to accelerate the encoding process. However, the NNZ-ETDA cannot effectively reduce the computational load for sequences with active motion or rich texture. Therefore, in order to further improve the performance of NNZ-ETDA, we propose an adaptive RQT-depth decision for NNZ-ETDA (called ARD-NNZ-ETDA) by exploiting the characteristics of high temporal-spatial correlation that exist in natural video sequences. Simulation results show that the proposed method can achieve a time improving ratio (TIR) of about 61.26%~81.48% when compared to the HEVC test model 8.1 (HM 8.1), with insignificant loss of image quality. Compared with the NNZ-ETDA, the proposed method can further achieve an average TIR of about 8.29%~17.92%.
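The core of an NNZ-based early TU decision can be illustrated with a small Python sketch (the names and the threshold are hypothetical; the actual NNZ-ETDA and ARD-NNZ-ETDA rules in the paper are more elaborate):

```python
def nnz(block):
    """Number of nonzero quantized transform coefficients in a TU block."""
    return sum(1 for row in block for c in row if c != 0)

def prune_rqt(block, threshold):
    """Early-termination rule in the spirit of NNZ-ETDA: if the current TU
    already has very few nonzero coefficients, skip testing smaller TU sizes."""
    return nnz(block) <= threshold

# A mostly-zero 4x4 quantized residual: splitting further is unlikely to help,
# so the encoder can prune this branch of the residual quadtree early.
residual = [[3, 0, 0, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
assert prune_rqt(residual, threshold=2)
```

The adaptive variant (ARD-NNZ-ETDA) would, in addition, adjust the allowed RQT depth using the temporal-spatial correlation of neighboring blocks.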

  7. Bar Code Medication Administration Technology: Characterization of High-Alert Medication Triggers and Clinician Workarounds.

    Science.gov (United States)

    Miller, Daniel F; Fortier, Christopher R; Garrison, Kelli L

    2011-02-01

    Bar code medication administration (BCMA) technology is gaining acceptance for its ability to prevent medication administration errors. However, studies suggest that improper use of BCMA technology can yield unsatisfactory error prevention and introduce new potential medication errors. The objectives were to evaluate the incidence of high-alert medication BCMA triggers and alert types, and to discuss the types of nursing and pharmacy workarounds occurring with the use of BCMA technology and the electronic medication administration record (eMAR). Medication scanning and override reports from January 1, 2008, through November 30, 2008, for all adult medical/surgical units were retrospectively evaluated for high-alert medication system triggers, alert types, and override reason documentation. An observational study of nursing workarounds on an adult medicine step-down unit was performed, and an analysis of potential pharmacy workarounds affecting BCMA and the eMAR was also conducted. Seventeen percent of scanned medications triggered an error alert, of which 55% were for high-alert medications. Insulin aspart, NPH insulin, hydromorphone, potassium chloride, and morphine were the top 5 high-alert medications that generated alert messages. Clinician override reasons for alerts were documented in only 23% of administrations. Observational studies assessing nursing workarounds revealed a median of 3 clinician workarounds per administration. Specific nursing workarounds included failure to scan medications/patient armbands and scanning the bar code after the dose had been removed from the unit-dose packaging. Analysis of pharmacy order entry process workarounds revealed the potential for missed doses, duplicate doses, and doses being scheduled at the wrong time. BCMA has the potential to prevent high-alert medication errors by alerting clinicians through alert messages. 
Nursing and pharmacy workarounds can limit the recognition of optimal safety outcomes and therefore workflow processes

  8. Habitat Re-Creation (Ecological Restoration) as a Strategy for Conserving Insect Communities in Highly Fragmented Landscapes.

    Science.gov (United States)

    Shuey, John A

    2013-12-05

    Because of their vast diversity, measured both by number of species and by life-history traits, insects defy comprehensive conservation planning. Thus, almost all insect conservation efforts target individual species. However, serious insect conservation requires goals set at the faunal level, and conservation success requires strategies that conserve intact communities. This task is complicated in agricultural landscapes by high levels of habitat fragmentation and isolation. In many regions, once-widespread insect communities are now functionally trapped on islands of ecosystem remnants and subject to a variety of stressors associated with isolation, small population sizes, and artificial population fragmentation. In fragmented landscapes, ecological restoration can be an effective strategy for reducing localized insect extinction rates, but insects are seldom included in restoration design criteria. It is possible to incorporate a few simple conservation criteria into restoration designs that enhance impacts on entire insect communities. Restoration can be used as a strategy to address fragmentation threats to isolated insect communities if insect communities are incorporated at the onset of restoration planning. Fully incorporating insect communities into restoration designs may increase the cost of restoration two- to three-fold, but the benefits to biodiversity conservation and the ecological services provided by intact insect communities justify the cost.

  9. Butterflies of the high altitude Atacama Desert: habitat use and conservation

    Directory of Open Access Journals (Sweden)

    Emma eDespland

    2014-09-01

    Full Text Available The butterfly fauna of the high-altitude desert of Northern Chile, though depauperate, shows high endemism, is poorly known, and is of considerable conservation concern. This study surveys butterflies along the Andean slope between 2400 and 500 m asl (prepuna, puna and Andean steppe habitats) as well as in high- and low-altitude wetlands and in the neoriparian vegetation of agricultural sites. We also include historical sightings from museum records. We compare abundances between altitudes, between natural and impacted sites, and between two sampling years with different precipitation regimes. The results confirm high altitudinal turnover and show greatest similarity between wetland and slope faunas at similar altitudes. Results also underscore vulnerability to weather fluctuations, particularly in the more arid low-altitude sites, where abundances were much lower in the low-precipitation sampling season and several species were not observed at all. Finally, we show that some species have shifted to the neoriparian vegetation of the agricultural landscape, whereas others were observed only in less impacted habitats dominated by native plants. These results suggest that acclimation to novel habitats depends on larval host plant use. The traditional agricultural environment can provide habitat for many, but not all, native butterfly species, but an estimation of the value of these habitats requires better understanding of butterfly life-history strategies and relationships with host plants.

  10. Review of high energy data and model codes for accelerator-based transmutation

    International Nuclear Information System (INIS)

    Koning, A.J.

    1993-01-01

    After reviewing the most important data needs for accelerator-based transmutation, the present status of the collection of experimental data for high energies is investigated by scanning the two databases NSR and EXFOR for measured cross sections. The most important nuclear theories and some of the associated nuclear model codes in use are outlined. Experimental data and simple theories have been used to construct empirical formulae for the prediction of high-energy cross sections, and these parametrizations are listed. A survey is given of the evaluation work done so far, and finally some conclusions and recommendations are presented with respect to the need for compilation of experimental data. (orig.)

  11. High performance reconciliation for continuous-variable quantum key distribution with LDPC code

    Science.gov (United States)

    Lin, Dakai; Huang, Duan; Huang, Peng; Peng, Jinye; Zeng, Guihua

    2015-03-01

    Reconciliation is a significant procedure in a continuous-variable quantum key distribution (CV-QKD) system. It is employed to extract a secure secret key from the string that results from transmission through the quantum channel between two users. However, the efficiency and speed of previous reconciliation algorithms are low. These problems limit the secure communication distance and the secure key rate of CV-QKD systems. In this paper, we propose a high-speed reconciliation algorithm employing a well-structured decoding scheme based on low-density parity-check (LDPC) codes. The complexity of the proposed algorithm is significantly reduced. By using a graphics processing unit (GPU) device, our method can reach a reconciliation speed of 25 Mb/s for a CV-QKD system, which is currently the highest level and paves the way to high-speed CV-QKD.
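The decoding side of LDPC-based reconciliation can be illustrated with a minimal hard-decision bit-flipping decoder in Python (a classic textbook algorithm, not the well-structured GPU decoder proposed in the paper; the tiny parity-check matrix below is for demonstration only, real LDPC matrices are large and sparse):

```python
def bit_flip_decode(H, r, max_iter=20):
    """Hard-decision bit-flipping decoding: repeatedly flip the bit that
    participates in the largest number of unsatisfied parity checks."""
    r = list(r)
    n = len(r)
    for _ in range(max_iter):
        unsat = [i for i, row in enumerate(H)
                 if sum(row[j] * r[j] for j in range(n)) % 2 == 1]
        if not unsat:
            return r                       # all parity checks satisfied
        votes = [sum(1 for i in unsat if H[i][j]) for j in range(n)]
        r[max(range(n), key=votes.__getitem__)] ^= 1
    return r

# Toy parity-check matrix (a Hamming(7,4) code written LDPC-style).
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
received = [0, 0, 0, 1, 0, 0, 0]          # all-zero codeword with bit 3 flipped
assert bit_flip_decode(H, received) == [0] * 7
```

In reconciliation the "received" word is the remote party's correlated bit string; decoding it against the shared parity-check matrix removes the discrepancies introduced by the quantum channel.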

  12. A parallelization study of the general purpose Monte Carlo code MCNP4 on a distributed memory highly parallel computer

    International Nuclear Information System (INIS)

    Yamazaki, Takao; Fujisaki, Masahide; Okuda, Motoi; Takano, Makoto; Masukawa, Fumihiro; Naito, Yoshitaka

    1993-01-01

    The general purpose Monte Carlo code MCNP4 has been implemented on the Fujitsu AP1000 distributed-memory highly parallel computer. The parallelization techniques developed and studied are reported. A shielding analysis function of the MCNP4 code is parallelized in this study. A technique was applied that maps each history to a processor dynamically and maps the control process to a fixed processor. The efficiency of the parallelized code is up to 80% for a typical practical problem with 512 processors. These results demonstrate the advantages of a highly parallel computer over conventional computers in the field of shielding analysis by the Monte Carlo method. (orig.)
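The reported figure can be checked against the standard definition of parallel efficiency (simple illustrative arithmetic, not part of the original study):

```python
def parallel_efficiency(t_serial, t_parallel, n_proc):
    """Parallel efficiency: speedup (t_serial / t_parallel) divided by
    the number of processors."""
    return (t_serial / t_parallel) / n_proc

# 80% efficiency on 512 processors corresponds to a ~410x speedup,
# e.g. a run that takes 4096 s serially finishing in 10 s.
assert abs(parallel_efficiency(t_serial=4096.0, t_parallel=10.0, n_proc=512) - 0.8) < 0.01
```

Monte Carlo transport parallelizes well because histories are independent; the dynamic history-to-processor mapping described above is what keeps all 512 processors busy.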

  13. Cosmetic results in early stage breast cancer patients with high-dose brachytherapy after conservative surgery

    International Nuclear Information System (INIS)

    Torres, Felipe; Pineda, Beatriz E

    2004-01-01

    Purpose: to evaluate cosmetic results in patients at early stages of low-risk breast cancer treated with partial accelerated radiotherapy using high dose rate brachytherapy. Methods and materials: from March 2001 to July 2003, 14 stage I and II breast cancer patients were treated at the Colombian National Cancer Institute in Bogota with conservative surgery and radiotherapy to the tumor bed (partial accelerated radiotherapy), using interstitial implants with iridium-192 (high dose rate brachytherapy) at a dose of 32 Gy over 4 days, in 8 fractions delivered twice a day. Results: with an average follow-up of 17.7 months, good cosmetic results were found in 71.4% of patients and excellent results in 14.3% of patients; furthermore, none of the patients presented local, regional, or distant relapses. Conclusion: among patients with early stage breast cancer, it is possible to apply partial accelerated radiotherapy to the tumor bed with high doses over 4 days, with good to excellent cosmetic results.

  14. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    Science.gov (United States)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available in the Fortran language (from standard 2003) have been exploited in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: in order to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly in the source files (no external documentation files needed); these comments are parsed by the doxygen free software, producing high quality html and latex documentation pages; the distributed versioning system referred to as git
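The finite volume idea at the heart of OFF can be illustrated with a one-dimensional Python sketch (linear advection with a first-order upwind flux and periodic boundaries; OFF itself is Fortran 2003 and solves the compressible Navier-Stokes equations with high-order WENO schemes):

```python
def fv_advect_step(u, a, dx, dt):
    """One conservative finite-volume update, u_i -= dt/dx * (F_{i+1/2} - F_{i-1/2}),
    for linear advection with a first-order upwind flux (a > 0), periodic BCs."""
    n = len(u)
    flux = [a * u[i - 1] for i in range(n)]   # F_{i-1/2}: upwind from the left
    return [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

u = [0.0, 1.0, 2.0, 1.0, 0.0]
u1 = fv_advect_step(u, a=1.0, dx=1.0, dt=0.5)
# Because fluxes telescope across cell faces, the scheme is conservative:
# the total amount of u is unchanged by the update.
assert abs(sum(u1) - sum(u)) < 1e-12
```

This face-flux bookkeeping is exactly the "conservative variables of a finite volume" that OFF encapsulates in its objects; higher-order schemes change only how the face fluxes are reconstructed.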

  15. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Full Text Available Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they code for a protein, they have generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like the one proposed here are likely to become increasingly powerful at detecting such elements.
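The entropy score mentioned above can be illustrated for a single alignment column in Python (a plain Shannon entropy over observed codons; the paper's method additionally computes the posterior distribution of this score under a mixture of codon substitution models):

```python
from math import log2
from collections import Counter

def column_entropy(codons):
    """Shannon entropy (bits) of one codon column in an alignment: low
    entropy signals strong conservation beyond what coding requires."""
    counts = Counter(codons)
    total = len(codons)
    return -sum(c / total * log2(c / total) for c in counts.values())

# A perfectly conserved column has zero entropy...
assert column_entropy(["ATG", "ATG", "ATG", "ATG"]) == 0.0
# ...while a column with four equally frequent codons carries two bits.
assert column_entropy(["ATG", "ATA", "ATT", "ATC"]) == 2.0
```

Columns whose observed entropy sits in the low tail of the null posterior are candidates for non-coding selective pressure overlaid on the protein code.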

  16. Test Particle Simulations of Electron Injection by the Bursty Bulk Flows (BBFs) using High Resolution Lyon-Fedder-Mobarry (LFM) Code

    Science.gov (United States)

    Eshetu, W. W.; Lyon, J.; Wiltberger, M. J.; Hudson, M. K.

    2017-12-01

    Test particle simulations of electron injection by bursty bulk flows (BBFs) have been done using a test particle tracer code [1] and the output fields of the Lyon-Fedder-Mobarry global magnetohydrodynamics (MHD) code [2]. The MHD code was run with high resolution (oct resolution) and with specified solar wind conditions so as to reproduce the observed qualitative picture of the BBFs [3]. Test particles were injected so that they interact with earthward propagating BBFs. The result of the simulation shows that electrons are pushed ahead of the BBFs and accelerated into the inner magnetosphere. Once electrons are in the inner magnetosphere they are further energized by drift resonance with the azimuthal electric field. In addition, pitch angle scattering of electrons resulting in the violation of conservation of the first adiabatic invariant has been observed. The violation of the first adiabatic invariant occurs as electrons cross a weak magnetic field region with a strong gradient of the field perturbed by the BBFs. References: 1. Kress, B. T., Hudson, M. K., Looper, M. D., Albert, J., Lyon, J. G., and Goodrich, C. C. (2007), Global MHD test particle simulations of >10 MeV radiation belt electrons during storm sudden commencement, J. Geophys. Res., 112, A09215, doi:10.1029/2006JA012218. 2. Lyon, J. G., Fedder, J. A., and Mobarry, C. M. (2004), The Lyon-Fedder-Mobarry (LFM) Global MHD Magnetospheric Simulation Code, J. Atm. and Solar-Terrestrial Phys., 66, Issue 15-16, 1333-1350, doi:10.1016/j.jastp. 3. Wiltberger, M., Merkin, V. G., Lyon, J. G., and Ohtani, S. (2015), High-resolution global magnetohydrodynamic simulation of bursty bulk flows, J. Geophys. Res. Space Physics, 120, 4555-4566, doi:10.1002/2015JA021080.
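The conserved quantity at issue is the first adiabatic invariant, mu = m * v_perp^2 / (2B). A short Python check illustrates how the perpendicular velocity must adjust to a changing field when mu is conserved, which is exactly what fails across the sharp gradients behind a BBF front (illustrative numbers only):

```python
def first_invariant(m, v_perp, B):
    """First adiabatic invariant mu = m * v_perp**2 / (2 * B)."""
    return m * v_perp ** 2 / (2 * B)

# Under adiabatic motion, v_perp rises as the electron moves into stronger
# field so that mu stays constant; a field gradient that is sharp on the
# scale of one gyroradius (as behind a BBF front) breaks this behavior.
mu0 = first_invariant(m=1.0, v_perp=2.0, B=4.0)
v_perp_new = (2 * mu0 * 8.0 / 1.0) ** 0.5   # field doubled to B = 8
assert abs(first_invariant(1.0, v_perp_new, 8.0) - mu0) < 1e-12
```

When mu conservation is violated, the particle's pitch angle is scattered instead, which is the mechanism the simulation observes.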

  17. Secure Communications in High Speed Fiber Optical Networks Using Code Division Multiple Access (CDMA) Transmission

    Energy Technology Data Exchange (ETDEWEB)

    Han, I; Bond, S; Welty, R; Du, Y; Yoo, S; Reinhardt, C; Behymer, E; Sperry, V; Kobayashi, N

    2004-02-12

    This project is focused on the development of advanced components and system technologies for secure data transmission on high-speed fiber optic data systems. This work capitalizes on (1) a strong relationship with outstanding faculty at the University of California-Davis who are experts in high speed fiber-optic networks, (2) the realization that code division multiple access (CDMA) is emerging as a bandwidth enhancing technique for fiber optic networks, (3) the realization that CDMA of sufficient complexity forms the basis for almost unbreakable one-time key transmissions, (4) our concepts for superior components for implementing CDMA, (5) our expertise in semiconductor device processing and (6) our Center for Nano and Microtechnology, which is where the majority of the experimental work was done. Here we present a novel device concept, which will push the limits of current technology and simultaneously solve system implementation issues by investigating new state-of-the-art fiber technologies. This will enable the development of secure communication systems for the transmission and reception of messages on deployed commercial fiber optic networks, through the CDMA phase encoding of broad bandwidth pulses. CDMA technology has been developed as a multiplexing technology, much like wavelength division multiplexing (WDM) or time division multiplexing (TDM), to increase the potential number of users on a given communication link. A novel application of the techniques created for CDMA is to generate secure communication through physical layer encoding. Physical layer encoding devices are developed which utilize semiconductor waveguides with fast carrier response times to phase encode spectral components of a secure signal. Current commercial technology, most commonly a spatial light modulator, allows phase codes to be changed at rates of only tens of Hertz (~25 ms response). The use of fast (picosecond to nanosecond) carrier dynamics of semiconductors
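The multiplexing principle behind CDMA can be sketched in Python with direct-sequence spreading codes (a simplified baseband model chosen for clarity; the project above instead encodes phase on the spectral components of optical pulses):

```python
def spread(bit, code):
    """Spread one data bit (+1/-1) across a +/-1 chip sequence."""
    return [bit * c for c in code]

def despread(chips, code):
    """Correlate the channel signal against a user's code; the sign
    recovers that user's bit, while orthogonal codes correlate to ~0."""
    return sum(x * c for x, c in zip(chips, code)) / len(code)

# Two orthogonal 4-chip codes (rows of a Hadamard matrix).
code_a = [+1, +1, -1, -1]
code_b = [+1, -1, +1, -1]

# Both users transmit simultaneously; the channel sums their signals.
channel = [a + b for a, b in zip(spread(+1, code_a), spread(-1, code_b))]
assert despread(channel, code_a) == +1.0   # user A recovers +1
assert despread(channel, code_b) == -1.0   # user B recovers -1
```

The security application follows the same logic: without knowledge of the (sufficiently complex) code, the superposed signal is effectively noise to an eavesdropper.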

  18. A Common Histone Modification Code on C4 Genes in Maize and Its Conservation in Sorghum and Setaria italica

    Science.gov (United States)

    Heimann, Louisa; Horst, Ina; Perduns, Renke; Dreesen, Björn; Offermann, Sascha; Peterhansel, Christoph

    2013-01-01

    C4 photosynthesis evolved more than 60 times independently in different plant lineages. Each time, multiple genes were recruited into C4 metabolism. The corresponding promoters acquired new regulatory features such as high expression, light induction, or cell type-specific expression in mesophyll or bundle sheath cells. We have previously shown that histone modifications contribute to the regulation of the model C4 phosphoenolpyruvate carboxylase (C4-Pepc) promoter in maize (Zea mays). We here tested the light- and cell type-specific responses of three selected histone acetylations and two histone methylations on five additional C4 genes (C4-Ca, C4-Ppdk, C4-Me, C4-Pepck, and C4-RbcS2) in maize. Histone acetylation and nucleosome occupancy assays indicated extended promoter regions with regulatory upstream regions more than 1,000 bp from the transcription initiation site for most of these genes. Despite the absence of any detectable homology among the promoters at the primary sequence level, histone modification patterns were highly coregulated. Specifically, H3K9ac was regulated by illumination, whereas H3K4me3 was regulated in a cell type-specific manner. We further compared histone modifications on the C4-Pepc and C4-Me genes from maize and the homologous genes from sorghum (Sorghum bicolor) and Setaria italica. Whereas sorghum and maize share a common C4 origin, C4 metabolism evolved independently in S. italica. The distribution of histone modifications over the promoters differed between the species, but differential regulation of light-induced histone acetylation and cell type-specific histone methylation was evident in all three species. We propose that a preexisting histone code was recruited into C4 promoter control during the evolution of C4 metabolism. PMID:23564230

  19. Swertia chirayta, a Threatened High-Value Medicinal Herb: Microhabitats and Conservation Challenges in Sikkim Himalaya, India

    Directory of Open Access Journals (Sweden)

    Bharat Kumar Pradhan

    2015-11-01

    Full Text Available Assessing the impact of threats, identifying favorable growing conditions, and predicting future population scenarios are vital for the conservation and management of threatened species. This study investigated the availability, microhabitat characteristics, threat status, and community associations of Swertia chirayta, a highly threatened Himalayan medicinal herb, in 22 populations in Sikkim, India, using the vertical belt transect method. Of the 14 microhabitats identified, open grassy slope emerged as the most favorable and wet grassy slope as the least favorable for S. chirayta. The species was dominant in 8 of the 10 major plant communities identified. Among 9 major types of disturbance identified, human movement and collection of non-timber forest products appeared as the biggest threats to S. chirayta. Disturbances significantly affected the availability of the species. S. chirayta, though under high anthropogenic threat, maintains high microhabitat pliability, which is vital for its conservation and management, provided immediate conservation measures are taken.

  20. Evaluation of highly conserved hsp65-specific nested PCR primers for diagnosing Mycobacterium tuberculosis.

    Science.gov (United States)

    Priyadarshini, P; Tiwari, K; Das, A; Kumar, D; Mishra, M N; Desikan, P; Nath, G

    2017-02-01

    To evaluate the sensitivity and specificity of a new nested set of primers designed for the detection of the Mycobacterium tuberculosis complex targeting a highly conserved heat shock protein gene (hsp65). The nested primers were designed using multiple sequence alignment, with the nucleotide sequence of the M. tuberculosis H37Rv hsp65 gene as the base. Multidrug-resistant Mycobacterium species along with other non-mycobacterial and fungal species were included to evaluate the specificity of the M. tuberculosis hsp65 gene-specific primers. The sensitivity of the primers was determined using serial 10-fold dilutions and was 100%, as shown by the bands in the case of the M. tuberculosis complex. None of the other non-M. tuberculosis complex bacterial and fungal species yielded any band on nested polymerase chain reaction (PCR). The first round of amplification could amplify 0.3 ng of template DNA, while nested PCR could detect 0.3 pg, a thousand-fold improvement in detection limit. The present hsp65-specific primers have been observed to be sensitive, specific and cost-effective, without requiring interpretation of biochemical tests, real-time PCR, sequencing or high-performance liquid chromatography. These primer sets do not have the drawbacks associated with protocols that target insertion sequence 6110, 16S rDNA, rpoB, recA and MPT 64.

  1. Nifty Native Implemented Functions: low-level meets high-level code

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Erlang Native Implemented Functions (NIFs) allow developers to implement functions in C (or C++) rather than Erlang. NIFs are useful for integrating high performance or legacy code in Erlang applications. The talk will cover how to implement NIFs, use cases, and common pitfalls when employing them. Further, we will discuss how and why Erlang applications, such as Riak, use NIFs. About the speaker Ian Plosker is the Technical Lead, International Operations at Basho Technologies, the makers of the open source database Riak. He has been developing software professionally for 10 years and programming since childhood. Prior to working at Basho, he developed everything from CMS to bioinformatics platforms to corporate competitive intelligence management systems. At Basho, he's been helping customers be incredibly successful using Riak.

  2. Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding

    Science.gov (United States)

    Dung, Lan-Rong; Lin, Meng-Chun

    This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory access, especially when local memory resources are limited. Reducing memory access successfully saves the notorious power consumption. The key to reducing memory accesses is a center-biased algorithm that performs the motion vector (MV) search with the minimum of search data. To preserve data reusability, the proposed dual-search-windowing (DSW) approach loads the secondary search window only when the search requires it. By doing so, the loading of search windows can be alleviated, reducing the required external memory bandwidth. The proposed techniques can save up to 81% of external memory bandwidth and require only 135 MBytes/sec, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/sec.
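The center-biased early-termination idea can be sketched in Python (the interface is hypothetical; real encoders evaluate SAD in optimized hardware over far larger candidate sets and search windows):

```python
def sad(a, b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def center_biased_search(cur, ref_block_at, candidates, threshold):
    """Evaluate candidate motion vectors starting at (0, 0); stop as soon
    as the best SAD falls below a threshold. Because most blocks in video
    barely move, the search usually ends near the center, so only a small
    part of the reference window ever needs to be fetched from memory."""
    best_mv, best_cost = None, float("inf")
    for mv in [(0, 0)] + candidates:
        cost = sad(cur, ref_block_at(mv))
        if cost < best_cost:
            best_mv, best_cost = mv, cost
        if best_cost <= threshold:
            break                           # early termination near the center
    return best_mv, best_cost

cur = [[10, 10], [10, 10]]
ref = {(0, 0): [[10, 10], [10, 10]], (1, 0): [[90, 90], [90, 90]]}
mv, cost = center_biased_search(cur, ref.__getitem__, [(1, 0)], threshold=0)
assert mv == (0, 0) and cost == 0
```

The DSW scheme maps onto this directly: the primary window covers the center-biased candidates, and the secondary window is fetched only when early termination fails.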

  3. Development of a code and models for high burnup fuel performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kinoshita, M; Kitajima, S [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1997-08-01

    First, high burnup LWR fuel behavior is discussed and the models necessary for the analysis are reviewed. The aspects of behavior are changes in power history due to higher enrichment, temperature feedback due to fission gas release and the resulting degradation of gap conductance, axial fission gas transport in the fuel free volume, fuel conductivity degradation due to fission products in solution, and modification of the fuel micro-structure. The models developed for these phenomena, the modifications in the code, and benchmark results based mainly on the Risoe fission gas project are presented. Finally, the rim effect, which is observed only around the fuel periphery, is discussed, focusing on fuel conductivity degradation and swelling due to porosity development. (author). 18 refs, 13 figs, 3 tabs.

  4. Random Linear Network Coding is Key to Data Survival in Highly Dynamic Distributed Storage

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

    Distributed storage solutions have become widespread due to their ability to store large amounts of data reliably across a network of unreliable nodes, by employing repair mechanisms to prevent data loss. Conventional systems rely on static designs with a central control entity to oversee and control the repair process. Given the large costs for maintaining and cooling large data centers, our work proposes and studies the feasibility of a fully decentralized system that can store data even on unreliable and, sometimes, unavailable mobile devices. This imposes new challenges on the design, as the number of available nodes varies greatly over time and keeping track of the system's state becomes unfeasible. As a consequence, conventional erasure correction approaches are ill-suited for maintaining data integrity. In this highly dynamic context, random linear network coding (RLNC) provides
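The RLNC mechanism that makes such decentralized repair possible can be sketched over GF(2) in Python (the coefficient vectors are fixed to a full-rank set here for reproducibility; in RLNC each node would draw them at random, and decoding succeeds once enough independent combinations are collected from any subset of nodes):

```python
def xor_combine(coeffs, packets):
    """Form one coded packet: the GF(2) linear combination (XOR) of the
    source packets selected by the coefficient vector."""
    out = [0] * len(packets[0])
    for c, p in zip(coeffs, packets):
        if c:
            out = [a ^ b for a, b in zip(out, p)]
    return out

def gf2_decode(coded):
    """Recover the source packets by Gaussian elimination over GF(2),
    given k linearly independent (coefficient_vector, payload) pairs."""
    k = len(coded[0][0])
    rows = [list(c) + list(p) for c, p in coded]
    for col in range(k):
        pivot = next(r for r in range(col, len(rows)) if rows[r][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[col])]
    return [row[k:] for row in rows[:k]]

# Three source packets and a full-rank set of GF(2) coefficient vectors.
packets = [[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]
coeff_sets = [[1, 1, 0], [0, 1, 1], [1, 1, 1]]
coded = [(c, xor_combine(c, packets)) for c in coeff_sets]
assert gf2_decode(coded) == packets
```

Because any k independent coded packets suffice, no central entity needs to track which node holds which fragment, which is what suits RLNC to churning mobile-device storage.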

  5. Simulation for photon detection in spectrometric system of high purity (HPGe) using MCNPX code

    International Nuclear Information System (INIS)

    Correa, Guilherme Jorge de Souza

    2013-01-01

    The Brazilian National Commission of Nuclear Energy defines parameters for the classification and management of radioactive waste in accordance with the activity of the materials. The efficiency of a detection system is crucial for determining the real activity of a radioactive source. When possible, the system's calibration should be performed using a standard source. Unfortunately, this can be done in only a few cases, given the difficulty of obtaining appropriate standard sources for each type of measurement. Computer simulations can therefore be performed to assist in calculating the efficiency of the system and, consequently, to aid the classification of radioactive waste. This study aims to model a high purity germanium (HPGe) detector with the MCNPX code, bringing the computationally obtained spectral values close to those obtained experimentally for the 137 Cs photopeak. The approach is made through changes in the modeled outer dead layer of the germanium crystal. (author)
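The efficiency calculation that such a simulation supports can be sketched in Python (the standard definition of absolute full-energy-peak efficiency; the numbers below, including the ~0.85 gamma emission probability of 137 Cs at 661.7 keV, are illustrative):

```python
def photopeak_efficiency(net_counts, live_time_s, activity_bq, gamma_yield):
    """Absolute full-energy-peak efficiency: photopeak counts recorded per
    photon emitted by the source during the measurement."""
    emitted = activity_bq * gamma_yield * live_time_s
    return net_counts / emitted

# Hypothetical measurement: a 1 kBq 137Cs source counted for 100 s, with
# ~0.85 gammas emitted per decay and 8500 net counts in the photopeak.
eff = photopeak_efficiency(net_counts=8500, live_time_s=100,
                           activity_bq=1000, gamma_yield=0.85)
assert abs(eff - 0.10) < 1e-9
```

When no standard source is available, the simulated detector response (with the dead layer tuned as described) supplies this efficiency, which then converts measured count rates into the activities needed for waste classification.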

  6. Coded aperture detector for high precision gamma-ray burst source locations

    International Nuclear Information System (INIS)

    Helmken, H.; Gorenstein, P.

    1977-01-01

    Coded aperture collimators in conjunction with position-sensitive detectors are very useful in the study of transient phenomena because they combine broad field of view, high sensitivity, and an ability for precise source location. Since the preceding conference, a series of computer simulations of various detector designs has been carried out with the aid of a CDC 6400. Particular emphasis was placed on the development of a unit consisting of a one-dimensional random or periodic collimator in conjunction with a two-dimensional position-sensitive xenon proportional counter. A configuration involving four of these units has been incorporated into the preliminary design study of the Transient Explorer (ATREX) satellite and is applicable to any SAS- or HEAO-type satellite mission. Results of this study, including detector response, fields of view, and source location precision, will be presented
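    The decoding principle behind such a unit can be sketched in a 1D toy model: the detector records the source convolved with the mask pattern, and correlating the shadowgram with a balanced version of the mask recovers the source position. All sizes and the random mask below are illustrative, not the actual ATREX design:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
mask = rng.integers(0, 2, n).astype(float)   # random open/closed 1D mask
source = np.zeros(n)
source[20] = 100.0                           # point source at bin 20

# The shadowgram is the (circular) convolution of the sky with the mask.
shadow = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(mask)))

# Cross-correlate with the balanced decoding array (+1 open, -1 closed);
# for a point source the reconstruction peaks at the source position.
decode = 2.0*mask - 1.0
recon = np.real(np.fft.ifft(np.fft.fft(shadow) * np.conj(np.fft.fft(decode))))
```

    The peak-to-sidelobe ratio grows with the number of open mask elements, which is why a coded aperture keeps high sensitivity while retaining a wide field of view.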

  7. A highly conserved basidiomycete peptide synthetase produces a trimeric hydroxamate siderophore.

    Science.gov (United States)

    Brandenburger, Eileen; Gressler, Markus; Leonhardt, Robin; Lackner, Gerald; Habel, Andreas; Hertweck, Christian; Brock, Matthias; Hoffmeister, Dirk

    2017-08-25

    The model white-rot basidiomycete Ceriporiopsis (Gelatoporia) subvermispora B encodes putative natural product biosynthesis genes. Among them is the gene for the seven-domain nonribosomal peptide synthetase CsNPS2. It is a member of the as-yet uncharacterized fungal type VI siderophore synthetase family, which is highly conserved and widely distributed among the basidiomycetes. These enzymes include only one adenylation (A) domain, i.e., one complete peptide synthetase module, and two thiolation/condensation (T-C) di-domain partial modules which, together, constitute an A-T1-C1-T2-C2-T3-C3 domain setup. The full-length CsNPS2 enzyme (274.5 kDa) was heterologously produced as a polyhistidine fusion in Aspergillus niger as soluble and active protein. N5-acetyl-N5-hydroxy-L-ornithine (L-AHO) and N5-cis-anhydromevalonyl-N5-hydroxy-L-ornithine (L-AMHO) were accepted as substrates, as assessed in vitro using the substrate-dependent [32P]ATP-pyrophosphate radioisotope exchange assay. Full-length holo-CsNPS2 catalyzed amide bond formation between three L-AHO molecules to release the linear L-AHO trimer, called basidioferrin, as product in vitro, which was verified by LC-HRESIMS. Phylogenetic analyses suggest that type VI family siderophore synthetases are widespread in mushrooms and have evolved in a common ancestor of basidiomycetes. Importance: The basidiomycete nonribosomal peptide synthetase CsNPS2 represents a member of a widely distributed but previously uninvestigated class (type VI) of fungal siderophore synthetases. Genes orthologous to CsNPS2 are highly conserved across various phylogenetic clades of the basidiomycetes. Hence, our work serves as a broadly applicable model for siderophore biosynthesis and iron metabolism in higher fungi. Also, our results on the amino acid substrate preference of CsNPS2 support further understanding of the substrate selectivity of fungal adenylation domains. Methodologically, this report highlights the

  8. An explication of the Graphite Structural Design Code of core components for the High Temperature Engineering Test Reactor

    International Nuclear Information System (INIS)

    Iyoku, Tatsuo; Ishihara, Masahiro; Toyota, Junji; Shiozawa, Shusaku

    1991-05-01

    The integrity evaluation of the core graphite components for the High Temperature Engineering Test Reactor (HTTR) will be carried out based upon the Graphite Structural Design Code for core components. In applying this design code, it is necessary to clarify the basic concepts used to evaluate the integrity of the HTTR core components. Therefore, considering the detailed design of the HTTR core graphite structures, such as the fuel graphite blocks, this report explicates the design code in detail, covering the concepts of stress and fatigue limits, the integrity evaluation method for oxidized graphite components, and the thermal/irradiation stress analysis methods. (author)

  9. Lithographically encoded polymer microtaggant using high-capacity and error-correctable QR code for anti-counterfeiting of drugs.

    Science.gov (United States)

    Han, Sangkwon; Bae, Hyung Jong; Kim, Junhoi; Shin, Sunghwan; Choi, Sung-Eun; Lee, Sung Hoon; Kwon, Sunghoon; Park, Wook

    2012-11-20

    A QR-coded microtaggant for the anti-counterfeiting of drugs is proposed that can provide high capacity and error-correction capability. It is fabricated lithographically in a microfluidic channel with special consideration of the island patterns in the QR Code. The microtaggant is incorporated in the drug capsule ("on-dose authentication") and can be read by a simple smartphone QR Code reader application when removed from the capsule and washed free of drug. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Differential Regulation of Receptor Activation and Agonist Selectivity by Highly Conserved Tryptophans in the Nicotinic Acetylcholine Receptor Binding Site

    OpenAIRE

    Williams, Dustin K.; Stokes, Clare; Horenstein, Nicole A.; Papke, Roger L.

    2009-01-01

    We have shown previously that a highly conserved Tyr in the nicotinic acetylcholine receptor (nAChR) ligand-binding domain (LBD) (α7 Tyr188 or α4 Tyr195) differentially regulates the activity of acetylcholine (ACh) and the α7-selective agonist 3-(4-hydroxy,2-methoxybenzylidene)anabaseine (4OH-GTS-21) in α4β2 and α7 nAChR. In this study, we mutated two highly conserved LBD Trp residues in human α7 and α4β2 and expressed the receptors in Xenopus laevis oocytes. α7 Re...

  11. Four-D propagation code for high-energy laser beams: a user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Morris, J.R.

    1976-08-05

    This manual describes the use and structure of the June 30, 1976 version of the Four-D propagation code for high energy laser beams. It provides selected sample output from a typical run and from several debug runs. The Four-D code now includes the important noncoplanar scenario feature. Many problems that required excessive computer time can now be meaningfully simulated as steady-state noncoplanar problems with short run times.

  12. Acacia shrubs respond positively to high severity wildfire: Implications for conservation and fuel hazard management.

    Science.gov (United States)

    Gordon, Christopher E; Price, Owen F; Tasker, Elizabeth M; Denham, Andrew J

    2017-01-01

    High severity wildfires pose threats to human assets, but are also perceived to impact vegetation communities because a small number of species may become dominant immediately after fire. However, there are considerable gaps in our knowledge about species-specific responses of plants to different fire severities, and how these influence fuel hazard in the short and long term. Here we conduct a floristic survey at sites before and two years after a wildfire of unprecedented size and severity in the Warrumbungle National Park (Australia) to explore relationships between post-fire growth of a fire-responsive shrub genus (Acacia), total mid-story vegetation cover, fire severity and fuel hazard. We then survey 129 plots surrounding the park to assess relationships between mid-story vegetation cover and time since fire. Acacia species richness and cover were 2.3 and 4.3 times greater at plots after the fire than before it. However, the same common dominant species were present throughout the study. Mid-story vegetation cover was 1.5 times greater after the wildfire than before it, and the contribution of Acacia species to mid-story cover increased from 10 to 40%. Acacia species richness was not affected by fire severity; however, strong positive associations were observed between fire severity and both Acacia and total mid-story vegetation cover. Our analysis of mid-story vegetation recovery showed that cover was similarly high between 2 and 30 years post-fire, then decreased until 52 years. Collectively, our results suggest that Acacia species are extremely resilient to high severity wildfire and drive short- to mid-term increases in fuel hazard. Our results are discussed in relation to fire regime management from the twin perspectives of conserving biodiversity and mitigating human losses due to wildfire. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel. We then use this estimate to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. The algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and the results showed that the watermarking LDPC code had better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% of redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.

  14. Uncertainties in calculations of nuclear design code system for the high temperature engineering test reactor (HTTR)

    International Nuclear Information System (INIS)

    Shindo, R.; Yamashita, K.; Murata, I.

    1991-01-01

    The nuclear design code system for the HTTR consists of a one-dimensional cell burnup code developed at JAERI and the TWOTRAN-2 transport code. In order to satisfy the related design criteria, the uncertainty of the calculations was investigated by comparing calculated and experimental results. The experiments were performed with a graphite-moderated critical assembly. It was confirmed that the discrepancies between calculations and experiments were small enough to be allowed for in the nuclear design of the HTTR. 8 refs, 6 figs

  15. Analysis code of three dimensional core dynamics for high temperature gas-cooled reactors, COMIC-2

    International Nuclear Information System (INIS)

    Takano, Makoto

    1987-04-01

    The code has been improved and modified to speed up calculation and to make it more effective since its development in 1985. This report is a user's manual for the latest version of the code (COMIC-2). The speedup is achieved by improving the program flow and by vector programming. The total speedup factor depends on the problem, but is about 10 in the case of a sample problem. (author)

  16. Reduction and resource recycling of high-level radioactive wastes through nuclear transmutation with PHITS code

    International Nuclear Information System (INIS)

    Fujita, Reiko

    2017-01-01

    In the ImPACT program of the Cabinet Office, programs are underway to reduce long-lived fission products (LLFP) contained in high-level radioactive waste through nuclear transmutation, or to recycle/utilize useful nuclear species. This paper outlines this program and describes recent achievements. This program consists of five projects: (1) separation/recovery technology, (2) acquisition of nuclear transmutation data, (3) nuclear reaction theory models and simulation, (4) novel nuclear reaction control and development of elemental technology, and (5) discussions on the process concept. Project (1) develops a technology for dissolving vitrified solid, a technology for recovering LLFP from high-level waste liquid, and a laser-based technology for separating odd- and even-mass isotopes. Project (2) acquires new nuclear reaction data for Pd-107, Zr-93, Se-79, and Cs-135 using RIKEN's RIBF or JAEA's J-PARC. Project (3) improves nuclear reaction theory and structural models using the nuclear reaction data measured in (2), improves/upgrades the nuclear reaction simulation code PHITS, and proposes a promising nuclear transmutation pathway. Project (4) develops an accelerator that realizes the proposed transmutation route and its elemental technology. Project (5) performs the conceptual design of the process to realize (1) to (4), and constructs a scenario for reducing/utilizing high-level radioactive waste to realize this design. (A.O.)

  17. A dominant EV71-specific CD4+ T cell epitope is highly conserved among human enteroviruses.

    Directory of Open Access Journals (Sweden)

    Ruicheng Wei

    Full Text Available CD4+ T cell-mediated immunity plays a central role in determining the immunopathogenesis of viral infections. However, the role of CD4+ T cells in EV71 infection, which causes hand, foot and mouth disease (HFMD, has yet to be elucidated. We applied a sophisticated method to identify promiscuous CD4+ T cell epitopes contained within the sequence of the EV71 polyprotein. Fifteen epitopes were identified, and three of them are dominant ones. The most dominant epitope is highly conserved among enterovirus species, including HFMD-related coxsackieviruses, HFMD-unrelated echoviruses and polioviruses. Furthermore, the CD4+ T cells specific to the epitope indeed cross-reacted with the homolog of poliovirus 3 Sabin. Our findings imply that CD4+ T cell responses to poliovirus following vaccination, or to other enteroviruses to which individuals may be exposed in early childhood, may have a modulating effect on subsequent CD4+ T cell response to EV71 infection or vaccine.

  18. Comparative analyses reveal high levels of conserved colinearity between the finger millet and rice genomes.

    Science.gov (United States)

    Srinivasachary; Dida, Mathews M; Gale, Mike D; Devos, Katrien M

    2007-08-01

    Finger millet is an allotetraploid (2n = 4x = 36) grass that belongs to the Chloridoideae subfamily. A comparative analysis has been carried out to determine the relationship of the finger millet genome with that of rice. Six of the nine finger millet homoeologous groups corresponded to a single rice chromosome each. Each of the remaining three finger millet groups was orthologous to two rice chromosomes, and in all three cases one rice chromosome was inserted into the centromeric region of a second rice chromosome to give the finger millet chromosomal configuration. All observed rearrangements were, among the grasses, unique to finger millet and, possibly, the Chloridoideae subfamily. Gene orders between rice and finger millet were highly conserved, with rearrangements being limited largely to single marker transpositions and small putative inversions encompassing at most three markers. Only some 10% of markers mapped to non-syntenic positions in rice and finger millet, and the majority of these were located in the distal 14% of chromosome arms, supporting a possible correlation between recombination and sequence evolution as has previously been observed in wheat. A comparison of the organization of finger millet, Panicoideae and Pooideae genomes relative to rice allowed us to infer putative ancestral chromosome configurations in the grasses.

  19. A highly conserved metalloprotease effector enhances virulence in the maize anthracnose fungus Colletotrichum graminicola.

    Science.gov (United States)

    Sanz-Martín, José M; Pacheco-Arjona, José Ramón; Bello-Rico, Víctor; Vargas, Walter A; Monod, Michel; Díaz-Mínguez, José M; Thon, Michael R; Sukno, Serenella A

    2016-09-01

    Colletotrichum graminicola causes maize anthracnose, an agronomically important disease with a worldwide distribution. We have identified a fungalysin metalloprotease (Cgfl) with a role in virulence. Transcriptional profiling experiments and live cell imaging show that Cgfl is specifically expressed during the biotrophic stage of infection. To determine whether Cgfl has a role in virulence, we obtained null mutants lacking Cgfl and performed pathogenicity and live microscopy assays. The appressorium morphology of the null mutants is normal, but they exhibit delayed development during the infection process on maize leaves and roots, showing that Cgfl has a role in virulence. In vitro chitinase activity assays of leaves infected with wild-type and null mutant strains show that, in the absence of Cgfl, maize leaves exhibit increased chitinase activity. Phylogenetic analyses show that Cgfl is highly conserved in fungi. Similarity searches, phylogenetic analysis and transcriptional profiling show that C. graminicola encodes two LysM domain-containing homologues of Ecp6, suggesting that this fungus employs both Cgfl-mediated and LysM protein-mediated strategies to control chitin signalling. © 2015 BSPP and John Wiley & Sons Ltd.

  20. Pig epidermal growth factor precursor contains segments that are highly conserved among species

    DEFF Research Database (Denmark)

    Jørgensen, P E; Jensen, L.G.; Sørensen, B S

    1998-01-01

    segment with that of the human, the rat and the mouse EGF precursors, in order to identify highly conserved domains. The examined part of the precursor contains EGF itself and six so-called EGF-like modules. The overall amino acid identity among the four species is 64%. However, the amino acid identity...... differed from around 30% in some segments to around 70% in others. The highest amino acid identity, 71%, was observed for a 345-aa segment that contains three EGF-like modules and which is homologous to a part of the low-density lipoprotein receptor (LDL receptor). The amino acid identities are 64% for EGF...... itself, and 50-67% for the remaining three EGF-like modules. The segment of the LDL receptor that is homologous to a part of the EGF precursor is important for the function of the LDL receptor, and EGF-like modules seem to be involved in protein-protein interactions in a number of proteins. In conclusion...

  1. Current situation of energy conservation in high energy-consuming industries in Taiwan

    International Nuclear Information System (INIS)

    Chan, D.Y.-L.; Yang, K.-H.; Hsu, C.-H.; Chien, M.-H.; Hong, G.-B.

    2007-01-01

    Growing concern has arisen in Taiwan about energy consumption and its adverse environmental impact. The current situation of energy conservation in high energy-consuming industries in Taiwan, including the iron and steel, chemical, cement, pulp and paper, textiles and electric/electrical industries, is presented. Since the energy consumption of the top 100 energy users (T100) comprised over 50% of total industrial energy consumption, focusing energy consumption reduction efforts on T100 energy users can achieve significant results. This study conducted on-site energy audits of 314 firms in Taiwan during 2000-2004, and identified potential electricity savings of 1,022,656 MWh, fuel oil savings of 174,643 kiloliters (KL), steam coal savings of 98,620 tons, and natural gas (NG) savings of 10,430 kilo cubic meters. The total potential energy saving thus was 489,505 KL of crude oil equivalent (KLOE), representing a reduction of 1,447,841 tons in carbon dioxide emissions, equivalent to the annual carbon dioxide absorption capacity of a 39,131-ha plantation forest.

  2. Ontario emissions trading code : emission reduction credit creation, recording and transfer rules, rules for renewable energy projects and conservation projects, and rules for the operation of the Ontario Emissions Trading Registry

    International Nuclear Information System (INIS)

    2001-12-01

    Emissions trading has been an integral part of Ontario's air quality strategy since December 31, 2001. Ontario has adopted the 'cap, credit and trade' type of emissions trading system, a hybrid that takes the best features of pure 'cap-and-trade' and 'baseline-and-credit' type systems. It covers nitric oxide and sulphur dioxide. The Ontario Emissions Trading Code supplements Ontario Regulation 397/01 and sets out rules for renewable energy projects and conservation projects for which applications for emission allowances can be made. This Code describes the rules for the creation and transfer of emission reduction credits (ERCs). It also explains the rules for the operation of the registry that has been established to provide information to the public about the emissions trading program and records decisions about credit creation and credit and allowance retirement. 3 tabs

  3. Fast Binary Coding for the Scene Classification of High-Resolution Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    Fan Hu

    2016-06-01

    Full Text Available Scene classification of high-resolution remote sensing (HRRS imagery is an important task in the intelligent processing of remote sensing images and has attracted much attention in recent years. Although the existing scene classification methods, e.g., the bag-of-words (BOW model and its variants, can achieve acceptable performance, these approaches strongly rely on the extraction of local features and the complicated coding strategy, which are usually time consuming and demand much expert effort. In this paper, we propose a fast binary coding (FBC method, to effectively generate efficient discriminative scene representations of HRRS images. The main idea is inspired by the unsupervised feature learning technique and the binary feature descriptions. More precisely, equipped with the unsupervised feature learning technique, we first learn a set of optimal “filters” from large quantities of randomly-sampled image patches and then obtain feature maps by convolving the image scene with the learned filters. After binarizing the feature maps, we perform a simple hashing step to convert the binary-valued feature map to the integer-valued feature map. Finally, statistical histograms computed on the integer-valued feature map are used as global feature representations of the scenes of HRRS images, similar to the conventional BOW model. The analysis of the algorithm complexity and experiments on HRRS image datasets demonstrate that, in contrast with existing scene classification approaches, the proposed FBC has much faster computational speed and achieves comparable classification performance. In addition, we also propose two extensions to FBC, i.e., the spatial co-occurrence matrix and different visual saliency maps, for further improving its final classification accuracy.
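    The pipeline the abstract describes — convolve the scene with learned filters, binarize the feature maps, hash the binary stack into an integer-valued map, and histogram it as the global descriptor — can be sketched as below. Random filters stand in for the unsupervised-learned ones, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((32, 32))                 # toy "scene"
filters = rng.standard_normal((8, 5, 5))     # stand-in for learned filters

# 1) Convolve the scene with each filter (valid mode, plain loops for clarity).
k, h, w = len(filters), 28, 28               # 32 - 5 + 1 = 28
maps = np.zeros((k, h, w))
for i, f in enumerate(filters):
    for y in range(h):
        for x in range(w):
            maps[i, y, x] = np.sum(image[y:y+5, x:x+5] * f)

# 2) Binarize the feature maps, 3) hash the 8 binary maps into a single
# integer-valued map, 4) histogram it as the global scene descriptor.
bits = (maps > 0).astype(np.uint8)
codes = np.zeros((h, w), dtype=np.uint8)
for i in range(k):
    codes |= (bits[i] << i).astype(np.uint8)  # 8 bits -> integer in [0, 255]

descriptor, _ = np.histogram(codes, bins=256, range=(0, 256))
```

    With 8 filters the hashing step maps each pixel to one of 256 codes, so the descriptor has a fixed length regardless of image size, much like a BOW histogram.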

  4. High precision conformal radiotherapy employing conservative margins in childhood benign and low-grade brain tumours

    International Nuclear Information System (INIS)

    Jalali, Rakesh; Budrukkar, Ashwini; Sarin, Rajiv; Sharma, Dayananda S.

    2005-01-01

    Background and purpose: To report local control and follow-up outcome data of high precision conformal radiotherapy in childhood brain tumours. Materials and methods: Between December 1999 and December 2002, 26 children (17 boys and 9 girls, median age 11.5 years) with incompletely excised or recurrent benign and low-grade brain tumours [13 craniopharyngiomas, 11 low-grade gliomas (LGG) and 2 others] were treated with three-dimensional (3D) conformal radiotherapy (CRT) (12 patients) or stereotactic conformal radiotherapy (SCRT) (14 patients). Gross tumour volume (GTV) included the neuro-imaging based visible tumour and/or resected tumour bed. Clinical target volume (CTV) consisted of the GTV + 5 mm margin, and planning target volume (PTV) consisted of an additional 5 mm margin for CRT and 2 mm for SCRT. Treatment was delivered with 3-9 conformal fixed fields to a median dose of 54 Gy/30 fractions. Results: The actuarial 2 and 3 year disease-free and overall survival rates were 96 and 100%, respectively (median follow-up: 25 months, range 12-47 months). Radiological follow-up, available in 25 patients, revealed complete response in 1, partial regression in 10, stable disease in 13 and progression in 1 patient (within the CTV). In one patient with craniopharyngioma, routine imaging revealed a mild asymptomatic cyst enlargement, which resolved with conservative management. A patient with chiasmatic glioma developed cystic degeneration and hydrocephalus 9 months after SCRT, requiring cyst drainage and placement of a ventriculoperitoneal shunt. Conclusion: High-precision conformal techniques delivering irradiation to a computer-generated target volume employing 7-10 mm 3D margins beyond the visible tumour and/or resected tumour bed appear to be safe in children with incompletely resected or recurrent benign and low-grade brain tumours, based on these data

  5. A novel, highly conserved metallothionein family in basidiomycete fungi and characterization of two representative SlMTa and SlMTb genes in the ectomycorrhizal fungus Suillus luteus.

    Science.gov (United States)

    Nguyen, Hoai; Rineau, François; Vangronsveld, Jaco; Cuypers, Ann; Colpaert, Jan V; Ruytinx, Joske

    2017-07-01

    The basidiomycete Suillus luteus is an important member of the ectomycorrhizal community that thrives in heavy metal polluted soils covered with pioneer pine forests. This study aimed to identify potential heavy metal chelators in S. luteus. Two metallothionein (MT) coding genes, SlMTa and SlMTb, were identified. When heterologously expressed in yeast, both SlMTa and SlMTb can rescue the Cu sensitive mutant from Cu toxicity. In S. luteus, transcription of both SlMTa and SlMTb is induced by Cu but not Cd or Zn. Several putative Cu-sensing and metal-response elements are present in the promoter sequences. These results indicate that SlMTa and SlMTb function as Cu-thioneins. Homologs of the S. luteus MTs are present in 49 species belonging to 10 different orders of the subphylum Agaricomycotina and are remarkably conserved. The length of the proteins, number and distribution of cysteine residues indicate a novel family of fungal MTs. The ubiquitous and highly conserved features of these MTs suggest that they are important for basic cellular functions in species in the subphylum Agaricomycotina. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  6. FPGA-Based Channel Coding Architectures for 5G Wireless Using High-Level Synthesis

    Directory of Open Access Journals (Sweden)

    Swapnil Mhaske

    2017-01-01

    Full Text Available We propose strategies to achieve a high-throughput FPGA architecture for quasi-cyclic low-density parity-check codes based on circulant-1 identity matrix construction. By splitting the node processing operation in the min-sum approximation algorithm, we achieve pipelining in the layered decoding schedule without utilizing additional hardware resources. High-level synthesis compilation is used to design and develop the architecture on the FPGA hardware platform. To validate this architecture, an IEEE 802.11n compliant 608 Mb/s decoder is implemented on the Xilinx Kintex-7 FPGA using the LabVIEW FPGA Compiler in the LabVIEW Communication System Design Suite. Architecture scalability was leveraged to accomplish a 2.48 Gb/s decoder on a single Xilinx Kintex-7 FPGA. Further, we present rapidly prototyped experimentation of an IEEE 802.16 compliant hybrid automatic repeat request system based on the efficient decoder architecture developed. In spite of the mixed nature of data processing—digital signal processing and finite-state machines—LabVIEW FPGA Compiler significantly reduced time to explore the system parameter space and to optimize in terms of error performance and resource utilization. A 4x improvement in the system throughput, relative to a CPU-based implementation, was achieved to measure the error-rate performance of the system over large, realistic data sets using accelerated, in-hardware simulation.
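    The min-sum approximation mentioned above replaces the exact (hyperbolic-tangent) check-node rule with a sign/minimum computation, which is what makes the node processing cheap enough to split and pipeline in hardware. A scalar sketch of one check-node update (an illustration of the algorithm, not the FPGA implementation itself):

```python
import numpy as np

def min_sum_check_update(llrs):
    """Min-sum check-node update in LDPC belief propagation: for each edge,
    the outgoing sign is the product of the other incoming signs, and the
    outgoing magnitude is the minimum of the other incoming magnitudes.
    Assumes all incoming LLRs are nonzero."""
    llrs = np.asarray(llrs, dtype=float)
    signs = np.sign(llrs)
    total_sign = np.prod(signs)
    mags = np.abs(llrs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]   # two smallest magnitudes
    out = np.empty_like(llrs)
    for i in range(llrs.size):
        other_min = min2 if i == order[0] else min1
        # product of the *other* signs = total sign times this edge's sign
        out[i] = (total_sign * signs[i]) * other_min
    return out
```

    Tracking only the two smallest magnitudes and the overall sign is the standard trick that keeps the hardware cost of each check node low.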

  7. Computer code to predict the heat of explosion of high energy materials

    International Nuclear Information System (INIS)

    Muthurajan, H.; Sivabalan, R.; Pon Saravanan, N.; Talawar, M.B.

    2009-01-01

    The computational approach to the thermochemical changes involved in the process of explosion of high energy materials (HEMs) vis-a-vis their molecular structure aids a HEMs chemist/engineer in predicting important thermodynamic parameters such as the heat of explosion of the HEMs. Such computer-aided design will be useful in predicting the performance of a given HEM as well as in conceiving futuristic high energy molecules that have significant potential in the field of explosives and propellants. The software code LOTUSES developed by the authors predicts various characteristics of HEMs such as explosion products including balanced explosion reactions, density of HEMs, velocity of detonation, CJ pressure, etc. The new computational approach described in this paper allows the prediction of the heat of explosion (ΔHe) without any experimental data for different HEMs, and the results are comparable with experimental results reported in the literature. The new algorithm, which does not require any complex input parameters, is incorporated in LOTUSES (version 1.5) and the results are presented in this paper. Linear regression analysis of all data points yields the correlation coefficient R² = 0.9721 with the linear equation y = 0.9262x + 101.45. The correlation coefficient value of 0.9721 reveals that the computed values are in good agreement with experimental values and useful for rapid hazard assessment of energetic materials.
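    The kind of fit reported above (y = 0.9262x + 101.45, R² = 0.9721) can be reproduced in form as follows. The data below are invented for illustration; only the formulas mirror the paper's regression of experimental against computed values:

```python
import numpy as np

# Hypothetical computed heats of explosion (x) and "experimental" values (y)
# generated from the reported line plus some scatter.
x = np.array([900.0, 1100.0, 1300.0, 1500.0, 1700.0])
y = 0.9262*x + 101.45 + np.array([20.0, -15.0, 10.0, -25.0, 12.0])

# Least-squares line and coefficient of determination R^2.
slope, intercept = np.polyfit(x, y, 1)
y_fit = slope*x + intercept
r2 = 1.0 - np.sum((y - y_fit)**2) / np.sum((y - np.mean(y))**2)
```

    An R² close to 1 means the fitted line explains nearly all the variance of the experimental values, which is the basis for the paper's "good agreement" claim.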

  8. Development of 2D particle-in-cell code to simulate high current, low ...

    Indian Academy of Sciences (India)

    Abstract. A code for 2D space-charge-dominated beam dynamics studies in beam transport lines is developed. The code is used for particle-in-cell (PIC) simulation of a z-uniform beam in a channel containing solenoids and drift space. It can also simulate a transport line where quadrupoles are used for focusing the beam.
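    As a minimal illustration of the particle advance such a transport code repeats each step, here is a single particle in a linear focusing channel (standing in for a solenoid/quadrupole lattice), pushed with a symplectic kick-drift-kick integrator. Space charge is omitted and all parameters are invented:

```python
import numpy as np

k = 2.0*np.pi      # focusing wavenumber: one betatron period per unit length
dz = 1.0e-3        # step size along the beamline
steps = 1000       # integrates over exactly one betatron period

x, xp = 1.0e-3, 0.0            # transverse offset and slope of one particle
for _ in range(steps):         # velocity-Verlet (leapfrog) kick-drift-kick
    xp -= 0.5*dz * k**2 * x    # half kick from the focusing field
    x  += dz * xp              # drift
    xp -= 0.5*dz * k**2 * x    # half kick
# After one full period the particle returns to its initial phase-space point.
```

    In the full PIC scheme, each step would additionally deposit the particles' charge on a grid, solve for the space-charge field, and apply its kick alongside the external focusing.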

  9. DELIGHT-B/REDEL, point reactivity burnup code for high-temperature gas-cooled reactor cells

    International Nuclear Information System (INIS)

    Shindo, Ryuiti; Watanabe, Takashi.

    1977-03-01

    The code DELIGHT-2 was previously developed to analyze cell burnup characteristics and to produce few-group constants for core burnup calculation in high-temperature gas-cooled reactors. In the code, the burnup dependency of the burnable poison, boron-10, is treated with a spatially homogeneous model. In actuality, however, the burnable poison is used as homogeneous rods or uniform rods of small granular poison and graphite, to control the reactivity and power distribution. Precise analysis of the burnup characteristics is thus difficult because of the heterogeneity due to the configuration of the poison rods. In cell burnup calculation, DELIGHT-B, which is a modification of DELIGHT-2, takes this heterogeneous effect into consideration. The auxiliary code REDEL, a reduced version of DELIGHT-B used in combination with the three-dimensional diffusion code CITATION, is for core burnup calculation with the macroscopic cross-section model. (auth.)

  10. Development of analytical code `ACCORD` for incore and plant dynamics of High Temperature Gas-cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Takeda, Takeshi; Tachibana, Yukio; Kunitomi, Kazuhiko [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Itakura, Hirofumi

    1996-11-01

    Safety demonstration test of the High Temperature Engineering Test Reactor will be carried out to demonstrate excellent safety features of a next generation High Temperature Gas-cooled Reactor (HTGR). Analytical code for incore and plant dynamics is necessary to assess the results of the safety demonstration test and to perform a design and safety analysis of the next generation HTGR. Existing analytical code for incore and plant dynamics of the HTGR can analyze behavior of plant system for only several thousand seconds after an event occurrence. Simulator on site can analyze only behavior of specific plant system. The `ACCORD` code has been, therefore, developed to analyze the incore and plant dynamics of the HTGR. The following are the major characteristics of this code. (1) Plant system can be analyzed for over several thousand seconds after an event occurrence by modeling the heat capacity of the core. (2) Incore and plant dynamics of any plant system can be analyzed by rearranging packages which simulate plant system components one by one. (3) Thermal hydraulics for each component can be analyzed by separating heat transfer calculation for component from fluid flow calculation for helium and pressurized water systems. The validity of the `ACCORD` code including models for nuclear calculation, heat transfer and fluid flow calculation, control system and safety protection system, was confirmed through cross checks with other available codes. (author)

  11. Development of analytical code 'ACCORD' for incore and plant dynamics of High Temperature Gas-cooled Reactor

    International Nuclear Information System (INIS)

    Takeda, Takeshi; Tachibana, Yukio; Kunitomi, Kazuhiko; Itakura, Hirofumi.

    1996-11-01

    Safety demonstration tests of the High Temperature Engineering Test Reactor will be carried out to demonstrate the excellent safety features of a next-generation High Temperature Gas-cooled Reactor (HTGR). An analytical code for incore and plant dynamics is necessary to assess the results of the safety demonstration tests and to perform design and safety analyses of the next-generation HTGR. The existing analytical code for incore and plant dynamics of the HTGR can analyze the behavior of the plant system for only several thousand seconds after an event occurs, and the on-site simulator can analyze only the behavior of a specific plant system. The 'ACCORD' code has therefore been developed to analyze the incore and plant dynamics of the HTGR. The major characteristics of this code are as follows. (1) The plant system can be analyzed for more than several thousand seconds after an event occurs by modeling the heat capacity of the core. (2) The incore and plant dynamics of any plant system can be analyzed by rearranging packages that simulate the plant system components one by one. (3) The thermal hydraulics of each component can be analyzed by separating the heat transfer calculation for the component from the fluid flow calculation for the helium and pressurized water systems. The validity of the 'ACCORD' code, including its models for nuclear calculation, heat transfer and fluid flow calculation, the control system, and the safety protection system, was confirmed through cross checks with other available codes. (author)

  12. The Highly Conserved Proline at Position 438 in Pseudorabies Virus gH Is Important for Regulation of Membrane Fusion

    OpenAIRE

    Schröter, Christina; Klupp, Barbara G.; Fuchs, Walter; Gerhard, Marika; Backovic, Marija; Rey, Felix A.; Mettenleiter, Thomas C.

    2014-01-01

    Membrane fusion in herpesviruses requires viral glycoproteins (g) gB and gH/gL. While gB is considered the actual fusion protein but is nonfusogenic per se, the function of gH/gL remains enigmatic. Crystal structures for different gH homologs are strikingly similar despite only moderate amino acid sequence conservation. A highly conserved sequence motif comprises the residues serine-proline-cysteine corresponding to positions 437 to 439 in pseudorabies virus (PrV) gH. The PrV-gH structure sho...

  13. High Girth Column-Weight-Two LDPC Codes Based on Distance Graphs

    Directory of Open Access Journals (Sweden)

    Gabofetswe Malema

    2007-01-01

    Full Text Available LDPC codes with a column weight of two are constructed from minimal distance graphs, or cages. Distance graphs are used to represent LDPC code matrices: graph vertices represent matrix rows and graph edges represent matrix columns. Converting a distance graph into matrix form produces an adjacency matrix with a column weight of two and a girth double that of the graph. The number of 1's in each row (the row weight) is equal to the degree of the corresponding vertex. By constructing graphs with different vertex degrees, we can vary the rate of the corresponding LDPC code matrices. Cage graphs are used as examples of distance graphs to design codes with different girths and rates. The performance of the obtained codes depends on the girth and structure of the corresponding distance graphs.
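    The vertex-to-row, edge-to-column mapping described above is easy to sketch. The following example is my own illustration (the paper's graph choices may differ): it maps the Petersen graph, which is the (3,5)-cage, to a parity-check matrix whose columns all have weight two and whose row weights equal the vertex degree 3.

```python
# Column-weight-two LDPC parity-check matrix from a distance graph.
# Vertices become rows, edges become columns; each column has exactly
# two 1's (the edge's endpoints), and the Tanner-graph girth is twice
# the girth of the source graph.

def graph_to_parity_check(num_vertices, edges):
    H = [[0] * len(edges) for _ in range(num_vertices)]
    for col, (u, v) in enumerate(edges):
        H[u][col] = 1
        H[v][col] = 1
    return H

# Petersen graph: the (3,5)-cage -- 3-regular, girth 5, 10 vertices, 15 edges.
petersen_edges = (
    [(i, (i + 1) % 5) for i in range(5)]             # outer 5-cycle
    + [(i, i + 5) for i in range(5)]                 # spokes
    + [(5 + i, 5 + (i + 2) % 5) for i in range(5)]   # inner pentagram
)
H = graph_to_parity_check(10, petersen_edges)

col_weights = [sum(row[c] for row in H) for c in range(len(petersen_edges))]
row_weights = [sum(row) for row in H]
print(col_weights.count(2), row_weights.count(3))  # 15 2's, 10 3's
```

Using a graph with higher vertex degree would raise the row weight and hence the code rate, which is the rate-tuning mechanism the abstract describes.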

  14. Computer codes used in the calculation of high-temperature thermodynamic properties of sodium

    International Nuclear Information System (INIS)

    Fink, J.K.

    1979-12-01

    Three computer codes - SODIPROP, NAVAPOR, and NASUPER - were written in order to calculate a self-consistent set of thermodynamic properties for saturated, subcooled, and superheated sodium. These calculations incorporate new critical parameters (temperature, pressure, and density) and recently derived single equations for enthalpy and vapor pressure. The following thermodynamic properties have been calculated in these codes: enthalpy, heat capacity, entropy, vapor pressure, heat of vaporization, density, volumetric thermal expansion coefficient, compressibility, and thermal pressure coefficient. The code SODIPROP calculates these properties for saturated and subcooled liquid sodium. Thermodynamic properties of saturated sodium vapor are calculated in the code NAVAPOR. The code NASUPER calculates thermodynamic properties of superheated sodium vapor, but only at low temperatures (<1644 K). No calculations were made for the supercritical region
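    Self-consistency between a vapor-pressure equation and a heat-of-vaporization table typically follows from a Clausius-Clapeyron-type relation. The sketch below illustrates the idea numerically; the correlation ln P = A - B/T and its coefficients are generic placeholders, not the fitted sodium equations used in these codes.

```python
import math

# Clausius-Clapeyron sketch: with an ideal-vapor approximation and the
# liquid specific volume neglected, h_fg ~= R * T**2 * d(ln P)/dT.
# The correlation below is a generic ln P = A - B/T placeholder,
# NOT the derived sodium vapor-pressure equation from the report.

R = 8.314             # gas constant, J/(mol K)
A, B = 11.0, 12000.0  # placeholder correlation coefficients

def ln_vapor_pressure(T):
    return A - B / T

def heat_of_vaporization(T, dT=0.01):
    # Central finite difference of ln P, then the Clausius-Clapeyron form.
    dlnP_dT = (ln_vapor_pressure(T + dT) - ln_vapor_pressure(T - dT)) / (2 * dT)
    return R * T**2 * dlnP_dT  # J/mol

# For ln P = A - B/T the analytic derivative is B/T**2, so h_fg = R*B.
h = heat_of_vaporization(1000.0)
print(round(h))
```

Deriving one property from another in this way, rather than fitting each independently, is what keeps a property set "self-consistent".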

  15. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    Directory of Open Access Journals (Sweden)

    Wei Feng

    2016-03-01

    Full Text Available High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture such phenomena at both high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can increase the temporal resolution by several, or even hundreds of, times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera.
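    The "three-element median" step can be illustrated in a few lines. This is a generic median-of-three sketch of my own, not the authors' implementation; it shows why a 3-element median is cheap enough to apply per pixel.

```python
# Median-of-three sketch: the median of three values needs at most
# three comparisons, so it is cheap enough to run once per pixel.

def median_of_three(a, b, c):
    if a > b:
        a, b = b, a          # now a <= b
    if b > c:
        b = c                # b becomes min(b, c)
        if a > b:
            a, b = b, a      # restore a <= b
    return b

# A sliding 3-sample median over a per-pixel stream suppresses
# single-sample outliers while preserving step edges.
samples = [10, 10, 90, 11, 12, 12, 60, 13]
filtered = [median_of_three(samples[i - 1], samples[i], samples[i + 1])
            for i in range(1, len(samples) - 1)]
print(filtered)  # [10, 11, 12, 12, 12, 13]
```

The isolated spikes (90 and 60) are removed without blurring the slow ramp, which a 3-sample mean would not achieve.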

  16. Vocational Preference Inventory High Point Codes Versus Expressed Choices as Predictors of College Major and Career Entry

    Science.gov (United States)

    Gade, Eldon M.; Soliah, David

    1975-01-01

    For 151 male graduates of the University of North Dakota, expressed choices measured by preferences made as high school seniors on the ACT Student Profile Section were significantly more accurate predictors of graduating college major and of career entry occupation than were their Vocational Preference Inventory high point codes. (Author)

  17. Gains of ubiquitylation sites in highly conserved proteins in the human lineage

    Directory of Open Access Journals (Sweden)

    Kim Dong Seon

    2012-11-01

    Full Text Available Abstract Background Post-translational modification of lysine residues of specific proteins by ubiquitin modulates the degradation, localization, and activity of these target proteins. Here, we identified gains of ubiquitylation sites in highly conserved regions of human proteins that occurred during human evolution. Results We analyzed human ubiquitylation site data and multiple alignments of orthologous mammalian proteins including those from humans, primates, other placental mammals, opossum, and platypus. In our analysis, we identified 281 ubiquitylation sites in 252 proteins that first appeared along the human lineage during primate evolution: one protein had four novel sites; four proteins had three sites each; 18 proteins had two sites each; and the remaining 229 proteins had one site each. PML, which is involved in neurodevelopment and neurodegeneration, acquired three sites, two of which have been reported to be involved in the degradation of PML. Thirteen human proteins, including ERCC2 (also known as XPD) and NBR1, gained human-specific ubiquitylated lysines after the human-chimpanzee divergence. ERCC2 has a Lys/Gln polymorphism, the derived (major) allele of which confers enhanced DNA repair capacity and reduced cancer risk compared with the ancestral (minor) allele. NBR1 and eight other proteins that are involved in the human autophagy protein interaction network gained a novel ubiquitylation site. Conclusions The gain of novel ubiquitylation sites could be involved in the evolution of protein degradation and other regulatory networks. Although gains of ubiquitylation sites do not necessarily equate to adaptive evolution, they are useful candidates for molecular functional analyses to identify novel advantageous genetic modifications and innovative phenotypes acquired during human evolution.

  18. High risks of losing genetic diversity in an endemic Mauritian gecko: implications for conservation.

    Directory of Open Access Journals (Sweden)

    Steeves Buckland

    Full Text Available Genetic structure can be a consequence of recent population fragmentation and isolation, or a remnant of historical localised adaptation. This poses a challenge for conservationists since misinterpreting patterns of genetic structure may lead to inappropriate management. Of 17 species of reptile originally found in Mauritius, only five survive on the main island. One of these, Phelsuma guimbeaui (lowland forest day gecko), is now restricted to 30 small isolated subpopulations following severe forest fragmentation and isolation due to human colonisation. We used 20 microsatellites in ten subpopulations and two mitochondrial DNA (mtDNA) markers in 13 subpopulations to: (i) assess genetic diversity, population structure and genetic differentiation of subpopulations; (ii) estimate effective population sizes and migration rates of subpopulations; and (iii) examine the phylogenetic relationships of haplotypes found in different subpopulations. Microsatellite data revealed significant population structure with high levels of genetic diversity and isolation by distance, substantial genetic differentiation and no migration between most subpopulations. MtDNA, however, showed no evidence of population structure, indicating that there was once a genetically panmictic population. Effective population sizes of ten subpopulations, based on microsatellite markers, were small, ranging from 44 to 167. Simulations suggested that the chance of survival and allelic diversity of some subpopulations will decrease dramatically over the next 50 years if no migration occurs. Our DNA-based evidence reveals an urgent need for a management plan for the conservation of P. guimbeaui. We identified 18 threatened and 12 viable subpopulations and discuss a range of management options that include translocation of threatened subpopulations to retain maximum allelic diversity, and habitat restoration and assisted migration to decrease genetic erosion and inbreeding for the viable

  19. The existence of High Conservation Value Forest (HCVF in Perum Perhutani KPH Kendal to support Implementation of FSC Certification

    Directory of Open Access Journals (Sweden)

    Sulistyowati Sri

    2018-01-01

    Full Text Available High Conservation Value Forest (HCVF) identifies High Conservation Values that are important and need to be protected. Under the FSC certification mechanism, HCVF is one of the Principles and Criteria for attaining certification. In this study, we identify the existence of HCVF in Perum Perhutani KPH Kendal to support the implementation process of FSC certification. A qualitative method was applied through observation and secondary data from Perum Perhutani KPH Kendal. Data analysis showed that, through ecolabel certification, Perum Perhutani KPH Kendal has identified an HCVF area covering 2,715.5 hectares, consisting of HCV 1 through 6. Secondary Natural Forest (HAS) Subah and Kaliwungu, the buffer zone for the Ulolanang and Pagerwunung Nature Reserves, qualifies as HCV 1.1; the conservation area for leopard (Panthera pardus melas) and pangolin (Manis javanica) as HCV 1.2; the conservation area for lutung (Trachypithecus auratus), an endemic species listed in CITES Appendix I and Critically Endangered, as HCV 1.3; Goa Kiskendo, a habitat for bat species, as HCV 1.4; regions of interest for deer (Cervus timorensis) and kepodang (Oriolus chinensis) as HCV 2.3; the Germplasm Protection Region (KPPN), an area of high biodiversity, as HCV 3; and river border areas and water springs as HCV 4. Utilization of firewood and of grass for cattle fodder qualifies as HCV 5, and 14 cultural sites as HCV 6. Monitoring and evaluation of the HCVF data showed that the level of diversity of flora and fauna increased over 2011-2015.

  20. The existence of High Conservation Value Forest (HCVF) in Perum Perhutani KPH Kendal to support Implementation of FSC Certification

    Science.gov (United States)

    Sulistyowati, Sri; Hadi, Sudharto P.

    2018-02-01

    High Conservation Value Forest (HCVF) identifies High Conservation Values that are important and need to be protected. Under the FSC certification mechanism, HCVF is one of the Principles and Criteria for attaining certification. In this study, we identify the existence of HCVF in Perum Perhutani KPH Kendal to support the implementation process of FSC certification. A qualitative method was applied through observation and secondary data from Perum Perhutani KPH Kendal. Data analysis showed that, through ecolabel certification, Perum Perhutani KPH Kendal has identified an HCVF area covering 2,715.5 hectares, consisting of HCV 1 through 6. Secondary Natural Forest (HAS) Subah and Kaliwungu, the buffer zone for the Ulolanang and Pagerwunung Nature Reserves, qualifies as HCV 1.1; the conservation area for leopard (Panthera pardus melas) and pangolin (Manis javanica) as HCV 1.2; the conservation area for lutung (Trachypithecus auratus), an endemic species listed in CITES Appendix I and Critically Endangered, as HCV 1.3; Goa Kiskendo, a habitat for bat species, as HCV 1.4; regions of interest for deer (Cervus timorensis) and kepodang (Oriolus chinensis) as HCV 2.3; the Germplasm Protection Region (KPPN), an area of high biodiversity, as HCV 3; and river border areas and water springs as HCV 4. Utilization of firewood and of grass for cattle fodder qualifies as HCV 5, and 14 cultural sites as HCV 6. Monitoring and evaluation of the HCVF data showed that the level of diversity of flora and fauna increased over 2011-2015.

  1. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...
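    The context modeling at the heart of CABAC can be illustrated with a simple adaptive probability estimator. The sketch below is my own and uses a generic shift-based exponential update; the actual H.264/AVC and HEVC standards use finite-state probability tables rather than this arithmetic.

```python
import random

# Context-adaptive probability estimation, the core idea behind CABAC's
# context models. Real H.264/HEVC CABAC uses finite-state probability
# tables; this sketch uses a generic shift-based exponential update.

def estimate(bits, shift=4):
    """Track an estimate of P(bit == 1); return the trajectory of estimates."""
    p = 0.5  # start unbiased
    history = []
    for b in bits:
        history.append(p)
        # Move p a fraction 2**-shift of the way toward the observed bit.
        p += (b - p) / (1 << shift)
    return history

# A heavily biased source: the estimate converges near the true P(1) = 0.9,
# which is what lets an arithmetic coder spend well under 1 bit per symbol.
random.seed(7)
bits = [1 if random.random() < 0.9 else 0 for _ in range(2000)]
trajectory = estimate(bits)
avg_tail = sum(trajectory[-500:]) / 500
print(round(trajectory[0], 2), round(avg_tail, 2))
```

Each syntax element in CABAC selects such a context by its neighborhood, which is exactly the data dependency that makes parallelization hard.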

  2. A structural modification of the two dimensional fuel behaviour analysis code FEMAXI-III with high-speed vectorized operation

    International Nuclear Information System (INIS)

    Yanagisawa, Kazuaki; Ishiguro, Misako; Yamazaki, Takashi; Tokunaga, Yasuo.

    1985-02-01

    Although the two-dimensional fuel behaviour analysis code FEMAXI-III was developed by JAERI as an optimized scalar computer code, the demand for more efficient code usage arising from recent trends such as high burnup and load-follow operation has pushed the code into a further modification stage. The principal aim of the modification is to transform the already implemented scalar subroutines into vectorized forms so that the program structure runs efficiently on high-speed vector computers. This structural modification has been brought to a successful conclusion. The two benchmark tests subsequently performed to examine the effect of the modification lead to the following conclusions: (1) In the first benchmark test, three comparatively high-burnup fuel rods irradiated under HBWR, BWR, and PWR conditions were analyzed. In all cases, the net computing time consumed by the vectorized FEMAXI is approximately 50% less than that consumed by the original code. (2) In the second benchmark test, a total of 26 PWR fuel rods, irradiated to burnups of 13-30 MWd/kgU and subsequently power-ramped in the R2 reactor, Sweden, were analyzed. Here the code was used to construct an envelope of the PCI-failure threshold through 26 code runs. To reach the same conclusion, the vectorized FEMAXI-III consumed a net computing time of 18 min, while the original FEMAXI-III required 36 min. (3) The gains from this structural modification are attributed mainly to the savings in net computing time in the mechanical calculation of the vectorized FEMAXI-III code. (author)

  3. A point kernel shielding code, PKN-HP, for high energy proton incident

    Energy Technology Data Exchange (ETDEWEB)

    Kotegawa, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-06-01

    A point kernel integral technique code, PKN-HP, and the related thick-target neutron yield data have been developed to calculate neutron and secondary gamma-ray dose equivalents in ordinary concrete and iron shields, in a 3-dimensional geometry, for neutrons produced by 100 MeV-10 GeV protons incident on fully stopping-length C, Cu, and U-238 targets. Comparisons among the calculated results of the present code, other calculation techniques, and measured values showed the usefulness of the code. (author)
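    The point-kernel integral technique itself is straightforward to sketch: the dose at a detector point is a sum over point sources of an attenuated, buildup-corrected kernel. In the illustration below, the attenuation coefficient and the linear buildup fit are placeholders of my own, not PKN-HP's data.

```python
import math

# Point-kernel sketch: dose at a detector is summed over point sources,
# each contributing S * B(mu*r) * exp(-mu*r) / (4*pi*r**2).
# MU and the buildup fit are illustrative placeholders, not PKN-HP data.

MU = 0.10  # effective attenuation coefficient (1/cm), placeholder

def buildup(mfp):
    return 1.0 + 0.8 * mfp  # simple linear buildup-factor fit (placeholder)

def dose(detector, sources):
    total = 0.0
    for (x, y, z, strength) in sources:
        dx, dy, dz = detector[0] - x, detector[1] - y, detector[2] - z
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        mfp = MU * r  # path length through the shield, in mean free paths
        total += strength * buildup(mfp) * math.exp(-mfp) / (4 * math.pi * r**2)
    return total

# A short line source approximated by point kernels along the z-axis.
line_source = [(0.0, 0.0, z, 1.0) for z in (-1.0, 0.0, 1.0)]
near = dose((30.0, 0.0, 0.0), line_source)
far = dose((60.0, 0.0, 0.0), line_source)
print(near > far)  # attenuation and 1/r**2 both reduce the far dose
```

Summing analytic kernels in this way is what makes point-kernel codes fast enough for routine 3-D shield design, at the cost of the buildup-factor approximation.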

  4. Performance Modeling and Optimization of a High Energy CollidingBeam Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-06-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems, errors of 29% are higher, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms.
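    A micro-benchmark-driven performance model of this kind usually starts from a latency-bandwidth form T(m) = alpha + beta*m. The sketch below is my own (the timings are synthetic, and real AAPC models add contention terms, as the abstract notes): it fits the two coefficients by least squares and extrapolates to an all-to-all exchange.

```python
# Latency-bandwidth performance model sketch: fit T(m) = alpha + beta*m
# to micro-benchmark timings with ordinary least squares, then use the
# model to estimate the cost of an all-to-all exchange. Timings invented.

def fit_linear(sizes, times):
    n = len(sizes)
    sx, sy = sum(sizes), sum(times)
    sxx = sum(x * x for x in sizes)
    sxy = sum(x * y for x, y in zip(sizes, times))
    beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    alpha = (sy - beta * sx) / n
    return alpha, beta

# Synthetic micro-benchmark: 5 us latency, 1 GB/s effective bandwidth.
sizes = [1e3, 1e4, 1e5, 1e6]               # message sizes (bytes)
times = [5e-6 + m * 1e-9 for m in sizes]   # "measured" times (s)

alpha, beta = fit_linear(sizes, times)

# Naive AAPC estimate for P processes: P-1 rounds of m-byte messages.
# Node-adapter and link contention would add correction terms here.
P, m = 64, 1e5
t_aapc = (P - 1) * (alpha + beta * m)
print(round(alpha * 1e6, 2), round(t_aapc * 1e3, 3))  # recovers 5.0 us; ~6.6 ms
```

The paper's contribution is precisely the correction terms this naive model omits: without a contention model, torus-topology predictions degrade badly.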

  5. Performance Modeling and Optimization of a High Energy Colliding Beam Simulation Code

    International Nuclear Information System (INIS)

    Shan, Hongzhang; Strohmaier, Erich; Qiang, Ji; Bailey, David H.; Yelick, Kathy

    2006-01-01

    An accurate modeling of the beam-beam interaction is essential to maximizing the luminosity in existing and future colliders. BeamBeam3D was the first parallel code that can be used to study this interaction fully self-consistently on high-performance computing platforms. Various all-to-all personalized communication (AAPC) algorithms dominate its communication patterns, for which we developed a sequence of performance models using a series of micro-benchmarks. We find that for SMP-based systems the most important performance constraint is node-adapter contention, while for 3D-torus topologies good performance models are not possible without considering link contention. The best average model prediction error is very low on SMP-based systems, at 3% to 7%. On torus-based systems, errors of 29% are higher, but optimized performance can again be predicted within 8% in some cases. These excellent results across five different systems indicate that this methodology for performance modeling can be applied to a large class of algorithms

  6. Amplitude-to-code converter for photomultipliers operating at high loadings

    International Nuclear Information System (INIS)

    Arkhangel'skij, B.V.; Evgrafov, G.N.; Pishchal'nikov, Yu.M.; Shuvalov, R.S.

    1982-01-01

    An 11-bit amplitude-to-code converter intended for the analysis of photomultiplier pulses under high loadings is described. To reduce the amount of digital electronics in the converter, an analog memory based on capacitors is provided. A well-known bridge circuit using majority-carrier diodes is selected as the gating circuit. The gate is controlled by a switching circuit built on fast-response transistors with a cutoff frequency of 1.2-1.5 GHz. The converter's main characteristics are: maximum output signal amplitude of -1.5 V; minimum pulse selection duration of 10 ns; maximum number of counts of 1400 at Usub(input)=-1.0 V and tsub(selection)=50 ns; integral nonlinearity of +-0.1%; temperature instability of conversion of 0.2%/deg C over the temperature range of +10 to +40 deg C; maximum data storage time of 300 ms; conversion coefficient instability of 0.42 counts; and 12 channels in a single CAMAC unit

  7. Near-fault earthquake ground motion prediction by a high-performance spectral element numerical code

    International Nuclear Information System (INIS)

    Paolucci, Roberto; Stupazzini, Marco

    2008-01-01

    Near-fault effects have been widely recognised to produce specific features of earthquake ground motion, that cannot be reliably predicted by 1D seismic wave propagation modelling, used as a standard in engineering applications. These features may have a relevant impact on the structural response, especially in the nonlinear range, that is hard to predict and to be put in a design format, due to the scarcity of significant earthquake records and of reliable numerical simulations. In this contribution a pilot study is presented for the evaluation of seismic ground-motions in the near-fault region, based on a high-performance numerical code for 3D seismic wave propagation analyses, including the seismic fault, the wave propagation path and the near-surface geological or topographical irregularity. For this purpose, the software package GeoELSE is adopted, based on the spectral element method. The set-up of the numerical benchmark of 3D ground motion simulation in the valley of Grenoble (French Alps) is chosen to study the effect of the complex interaction between basin geometry and radiation mechanism on the variability of earthquake ground motion

  8. Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code

    Science.gov (United States)

    Panettieri, Vanessa; Amor Duch, Maria; Jornet, Núria; Ginjaume, Mercè; Carrasco, Pablo; Badal, Andreu; Ortega, Xavier; Ribas, Montserrat

    2007-01-01

    The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson&Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm2 and a thickness of 0.5 µm which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water™ build-up caps, together with the orientation of the detector have been investigated for the specific application of MOSFET detectors for entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water™ cap and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. All the results have shown that the PENELOPE code system can

  9. Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code.

    Science.gov (United States)

    Panettieri, Vanessa; Duch, Maria Amor; Jornet, Núria; Ginjaume, Mercè; Carrasco, Pablo; Badal, Andreu; Ortega, Xavier; Ribas, Montserrat

    2007-01-07

    The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson&Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm(2) and a thickness of 0.5 microm which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water build-up caps, together with the orientation of the detector have been investigated for the specific application of MOSFET detectors for entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water cap and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. 
All the results have shown that the PENELOPE code system can successfully

  10. Development of Safety Analysis Codes and Experimental Validation for a Very High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang, H. Oh, PhD; Cliff Davis; Richard Moore

    2004-11-01

    The very high temperature gas-cooled reactors (VHTGRs) are those concepts that have average coolant temperatures above 900 degrees C or operational fuel temperatures above 1250 degrees C. These concepts provide the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation and nuclear hydrogen generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as desalination and cogeneration, the VHTGR's higher temperatures are suitable for particular applications such as thermochemical hydrogen production. However, the high temperature operation can be detrimental to safety following a loss-of-coolant accident (LOCA) initiated by pipe breaks caused by seismic or other events. Following the loss of coolant through the break and coolant depressurization, air from the containment will enter the core by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structures and fuel. The oxidation will release heat and accelerate the heatup of the reactor core. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. The Idaho National Engineering and Environmental Laboratory (INEEL) has investigated this event for the past three years for the HTGR. However, neither the computer codes used nor, in fact, any of the world's computer codes have been sufficiently developed and validated to reliably predict this event. New code development, improvement of the existing codes, and experimental validation are imperative to narrow the uncertainty in the predictions of this type of accident. The objectives of this Korean/United States collaboration are to develop advanced computational methods for VHTGR safety analysis codes and to validate these computer codes.

  11. High-frequency ultrasound for intraoperative margin assessments in breast conservation surgery: a feasibility study

    International Nuclear Information System (INIS)

    Doyle, Timothy E; Neumayer, Leigh A; Factor, Rachel E; Ellefson, Christina L; Sorensen, Kristina M; Ambrose, Brady J; Goodrich, Jeffrey B; Hart, Vern P; Jensen, Scott C; Patel, Hemang

    2011-01-01

    In addition to breast imaging, ultrasound offers the potential for characterizing and distinguishing between benign and malignant breast tissues due to their different microstructures and material properties. The aim of this study was to determine if high-frequency ultrasound (20-80 MHz) can provide pathology sensitive measurements for the ex vivo detection of cancer in margins during breast conservation surgery. Ultrasonic tests were performed on resected margins and other tissues obtained from 17 patients, resulting in 34 specimens that were classified into 15 pathology categories. Pulse-echo and through-transmission measurements were acquired from a total of 57 sites on the specimens using two single-element 50-MHz transducers. Ultrasonic attenuation and sound speed were obtained from time-domain waveforms. The waveforms were further processed with fast Fourier transforms to provide ultrasonic spectra and cepstra. The ultrasonic measurements and pathology types were analyzed for correlations. The specimens were additionally re-classified into five pathology types to determine specificity and sensitivity values. The density of peaks in the ultrasonic spectra, a measure of spectral structure, showed significantly higher values for carcinomas and precancerous pathologies such as atypical ductal hyperplasia than for normal tissue. The slopes of the cepstra for non-malignant pathologies displayed significantly greater values that differentiated them from the normal and malignant tissues. The attenuation coefficients were sensitive to fat necrosis, fibroadenoma, and invasive lobular carcinoma. Specificities and sensitivities for differentiating pathologies from normal tissue were 100% and 86% for lobular carcinomas, 100% and 74% for ductal carcinomas, 80% and 82% for benign pathologies, and 80% and 100% for fat necrosis and adenomas. Specificities and sensitivities were also determined for differentiating each pathology type from the other four using a multivariate

  12. High-frequency ultrasound for intraoperative margin assessments in breast conservation surgery: a feasibility study

    Directory of Open Access Journals (Sweden)

    Hart Vern P

    2011-10-01

Full Text Available Abstract Background In addition to breast imaging, ultrasound offers the potential for characterizing and distinguishing between benign and malignant breast tissues due to their different microstructures and material properties. The aim of this study was to determine if high-frequency ultrasound (20-80 MHz) can provide pathology sensitive measurements for the ex vivo detection of cancer in margins during breast conservation surgery. Methods Ultrasonic tests were performed on resected margins and other tissues obtained from 17 patients, resulting in 34 specimens that were classified into 15 pathology categories. Pulse-echo and through-transmission measurements were acquired from a total of 57 sites on the specimens using two single-element 50-MHz transducers. Ultrasonic attenuation and sound speed were obtained from time-domain waveforms. The waveforms were further processed with fast Fourier transforms to provide ultrasonic spectra and cepstra. The ultrasonic measurements and pathology types were analyzed for correlations. The specimens were additionally re-classified into five pathology types to determine specificity and sensitivity values. Results The density of peaks in the ultrasonic spectra, a measure of spectral structure, showed significantly higher values for carcinomas and precancerous pathologies such as atypical ductal hyperplasia than for normal tissue. The slopes of the cepstra for non-malignant pathologies displayed significantly greater values that differentiated them from the normal and malignant tissues. The attenuation coefficients were sensitive to fat necrosis, fibroadenoma, and invasive lobular carcinoma. Specificities and sensitivities for differentiating pathologies from normal tissue were 100% and 86% for lobular carcinomas, 100% and 74% for ductal carcinomas, 80% and 82% for benign pathologies, and 80% and 100% for fat necrosis and adenomas. Specificities and sensitivities were also determined for differentiating each
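The record above describes deriving spectra and cepstra from time-domain waveforms via Fourier transforms. As a rough illustration of that processing chain (not the authors' pipeline — the toy trace and the naive DFT are stand-ins), a delayed echo in a pulse-echo trace shows up as a peak in the real cepstrum at the echo's delay:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(N^2)); fine for a short sketch."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * math.pi * j * k / n) for k in range(n))
            for j in range(n)]

def real_cepstrum(waveform, eps=1e-12):
    """Real cepstrum: inverse DFT of the log magnitude spectrum."""
    spectrum = dft(waveform)
    log_mag = [math.log(abs(c) + eps) for c in spectrum]
    n = len(log_mag)
    # Inverse DFT of the (real) log spectrum; keep the real part.
    return [sum(log_mag[k] * cmath.exp(2j * math.pi * j * k / n)
                for k in range(n)).real / n for j in range(n)]

# Toy "pulse-echo" trace: a pulse plus an echo delayed by 16 samples
# produces periodic ripple in the log spectrum, i.e. a cepstral peak.
n = 64
trace = [0.0] * n
trace[4] = 1.0      # transmitted pulse
trace[20] = 0.6     # echo, 16 samples later

ceps = real_cepstrum(trace)
peak = max(range(1, n // 2), key=lambda q: abs(ceps[q]))
print(peak)   # the echo delay appears as the dominant quefrency
```

Tissue-dependent scattering structure changes the shape of such cepstra, which is the kind of feature (cepstral slope, spectral peak density) the study correlates with pathology.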

  13. Compression and channel-coding algorithms for high-definition television signals

    Science.gov (United States)

    Alparone, Luciano; Benelli, Giuliano; Fabbri, A. F.

    1990-09-01

In this paper, results of investigations into the effects of channel errors on the transmission of images compressed by means of techniques based on the Discrete Cosine Transform (DCT) and Vector Quantization (VQ) are presented. Since compressed images are heavily degraded by noise in the transmission channel (more severely in the case of VQ-coded images), theoretical studies and simulations are presented in order to define and evaluate this degradation. Some channel coding schemes are proposed in order to protect the information during transmission. Hamming codes (7,4), (15,11) and (31,26) have been used for DCT-compressed images, and more powerful codes, such as the Golay (23,12) code, for VQ-compressed images. Performances attainable with soft-decoding techniques are also evaluated; better quality images have been obtained than with classical hard-decoding techniques. All tests have been carried out to simulate the transmission of a digital image from an HDTV signal over an AWGN channel with PSK modulation.
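The Hamming codes named in this record are the standard single-error-correcting block codes. A minimal Hamming(7,4) encoder/decoder sketch (illustrative, not the paper's implementation) shows how one flipped channel bit is located via the syndrome and corrected:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword (even parity)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct any single-bit error and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
codeword = hamming74_encode(word)
codeword[5] ^= 1                      # inject a single channel error
assert hamming74_decode(codeword) == word
```

The larger (15,11) and (31,26) Hamming codes and the Golay (23,12) code work on the same syndrome principle, trading rate for error-correcting power as the abstract describes.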

  14. Adaptive Network Coded Clouds: High Speed Downloads and Cost-Effective Version Control

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Heide, Janus; Roetter, Daniel Enrique Lucani

    2018-01-01

    Although cloud systems provide a reliable and flexible storage solution, the use of a single cloud service constitutes a single point of failure, which can compromise data availability, download speed, and security. To address these challenges, we advocate for the use of multiple cloud storage...... providers simultaneously using network coding as the key enabling technology. Our goal is to study two challenges of network coded storage systems. First, the efficient update of the number of coded fragments per cloud in a system aggregating multiple clouds in order to boost the download speed of files. We...... developed a novel scheme using recoding with limited packets to trade-off storage space, reliability, and data retrieval speed. Implementation and measurements with commercial cloud providers show that up to 9x less network use is needed compared to other network coding schemes, while maintaining similar...

  15. Femtosecond Laser System for Research on High-Speed Optical Transmultiplexing and Coding

    National Research Council Canada - National Science Library

    Weiner, Andrew

    1997-01-01

    .... This would fill an important need in both TDM packet networks and bit-parallel WDM linds. The research also aims at experimental tests of an ultrashort pulse code-division, multiple-access (CDMA...

  16. Ultra high speed optical transmission using subcarrier-multiplexed four-dimensional LDPC-coded modulation.

    Science.gov (United States)

    Batshon, Hussam G; Djordjevic, Ivan; Schmidt, Ted

    2010-09-13

We propose a subcarrier-multiplexed four-dimensional LDPC bit-interleaved coded modulation scheme that is capable of achieving beyond 480 Gb/s single-channel transmission rate over optical channels. The subcarrier-multiplexed four-dimensional LDPC coded modulation scheme outperforms the corresponding dual-polarization schemes by up to 4.6 dB in OSNR at a BER of 10^-8.

  17. High-dynamic range compressive spectral imaging by grayscale coded aperture adaptive filtering

    Directory of Open Access Journals (Sweden)

    Nelson Eduardo Diaz

    2015-09-01

Full Text Available The coded aperture snapshot spectral imaging system (CASSI) is an imaging architecture that senses the three-dimensional information of a scene with two-dimensional (2D) focal plane array (FPA) coded projection measurements. A reconstruction algorithm takes advantage of the sparsity of the compressive measurements to recover the underlying 3D data cube. Traditionally, CASSI uses block-unblock coded apertures (BCA) to spatially modulate the light. In CASSI, the quality of the reconstructed images depends on the design of these coded apertures and the FPA dynamic range. This work presents a new CASSI architecture based on grayscale coded apertures (GCA) which reduce FPA saturation and increase the dynamic range of the reconstructed images. The set of GCA is calculated in a real-time adaptive manner, exploiting the information from the FPA compressive measurements. Extensive simulations show the improvement attained in the quality of the reconstructed images when GCA are employed. In addition, traditional coded apertures and GCA are compared with respect to noise tolerance.
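The saturation argument can be illustrated numerically. In this hypothetical sketch (toy scene intensities and transmittance values, not the CASSI forward model), a grayscale aperture keeps bright pixels below the detector's full-well level where a block-unblock aperture clips them:

```python
SAT = 1.0                                        # FPA saturation level (toy units)
scene = [1.6, 0.6, 1.2, 0.9, 1.5, 0.7, 1.1, 1.4]  # bright toy scene

binary_aperture = [1, 0, 1, 1, 1, 0, 1, 1]                     # block-unblock (BCA)
gray_aperture = [0.5, 0.0, 0.5, 0.75, 0.5, 0.0, 0.75, 0.5]     # grayscale (GCA)

def measure(aperture):
    """Coded projection, clipped by detector saturation."""
    return [min(t * s, SAT) for t, s in zip(aperture, scene)]

def saturated(meas):
    """Count pixels that hit the full-well level (information lost)."""
    return sum(m >= SAT for m in meas)

print(saturated(measure(binary_aperture)), saturated(measure(gray_aperture)))
```

Because the grayscale transmittances are known, the attenuation they introduce can be compensated in reconstruction, whereas clipped binary measurements cannot be recovered; this is the mechanism behind the dynamic-range gain the abstract reports.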

  18. Geometrid moth assemblages reflect high conservation value of naturally regenerated secondary forests in temperate China

    NARCIS (Netherlands)

    Zou, Yi; Sang, Weiguo; Warren-Thomas, Eleanor; Axmacher, Jan Christoph

    2016-01-01

    The widespread destruction of mature forests in China has led to massive ecological degradation, counteracted in recent decades by substantial efforts to promote forest plantations and protect secondary forest ecosystems. The value of the resulting forests for biodiversity conservation is widely

  19. Liberals, Conservatives and Romantic Nationalists in Interwar Education Policy in Greece: "The High Mountains" Episode

    Science.gov (United States)

    Athanasiades, Harris

    2015-01-01

    Greek historiography of interwar education policy unproblematically accepts the assumption that the bone of contention between the "Liberal demoticists" and the "Conservative purists" was the language issue; particularly whether "demotic" or "katharevousa" should be the language of instruction in schooling.…

  20. Gene co-regulation is highly conserved in the evolution of eukaryotes and prokaryotes.

    NARCIS (Netherlands)

    Snel, B.; Noort, V. van; Huynen, M.A.

    2004-01-01

    Differences between species have been suggested to largely reside in the network of connections among the genes. Nevertheless, the rate at which these connections evolve has not been properly quantified. Here, we measure the extent to which co-regulation between pairs of genes is conserved over

  1. Androgen receptor status is highly conserved during tumor progression of breast cancer.

    Science.gov (United States)

    Grogg, André; Trippel, Mafalda; Pfaltz, Katrin; Lädrach, Claudia; Droeser, Raoul A; Cihoric, Nikola; Salhia, Bodour; Zweifel, Martin; Tapia, Coya

    2015-11-09

highly conserved during tumor progression and a change only occurs in a small fraction (4.1 %). Our study supports the notion that targeting AR could be effective for many BC patients and that re-testing of AR status in formerly negative or mixed-type BCs is recommended.

  2. Androgen receptor status is highly conserved during tumor progression of breast cancer

    International Nuclear Information System (INIS)

    Grogg, André; Trippel, Mafalda; Pfaltz, Katrin; Lädrach, Claudia; Droeser, Raoul A.; Cihoric, Nikola; Salhia, Bodour; Zweifel, Martin; Tapia, Coya

    2015-01-01

With the advent of new and more efficient anti-androgen drugs, targeting the androgen receptor (AR) in breast cancer (BC) is becoming an increasingly important area of investigation. This would potentially be most useful in triple negative BC (TNBC), where better therapies are still needed. The assessment of AR status is generally performed on the primary tumor even if the tumor has already metastasized. Very little is known regarding discrepancies of AR status during tumor progression. To determine the prevalence of AR positivity, with emphasis on TNBCs, and to investigate AR status during tumor progression, we evaluated a large series of primary BCs and matching metastases and recurrences. AR status was performed on 356 primary BCs, 135 matching metastases, and 12 recurrences using a next-generation Tissue Microarray (ngTMA). A commercially available AR antibody was used to determine AR status by immunohistochemistry. AR positivity was defined as any nuclear staining in tumor cells ≥1 %. AR expression was correlated with pathological tumor features of the primary tumor. Additionally, the concordance rate of AR expression between the different tumor sites was determined. AR status was positive in: 87 % (307/353) of primary tumors, 86.1 % (105/122) of metastases, and in 66.7 % (8/12) of recurrences. TNBC tested positive in 11.4 % (4/35) of BCs. A discrepant result was seen in 4.3 % (5/117) of primary BC and matching lymph node (LN) metastases. Three AR negative primary BCs were positive in the matching LN metastasis, representing 17.6 % of all negative BCs with lymph node metastases (3/17). Two AR positive primary BCs were negative in the matching LN metastasis, representing 2.0 % of all AR positive BCs with LN metastases (2/100). No discrepancies were seen between primary BC and distant metastases or recurrence (n = 17). Most primary (87 %) and metastasized (86.1 %) BCs are AR positive including a significant fraction of TNBCs (11.4 %). Further, AR status is highly

  3. Conservation hotspots for the turtles on the high seas of the Atlantic Ocean.

    Directory of Open Access Journals (Sweden)

    Hsiang-Wen Huang

Full Text Available Understanding the distribution of bycaught sea turtles could inform conservation strategies and priorities. This research analyses the distribution of turtles caught as longline fisheries bycatch on the high seas of the Atlantic Ocean. This research collected 18,142 bycatch observations and 47.1 million hooks from large-scale Taiwanese longline vessels in the Atlantic Ocean from June 2002 to December 2013. The coverage rates ranged from 0.48% to 17.54% by year. Seven hundred and sixty-seven turtles were caught, and the major species were leatherback (59.8%), olive ridley (27.1%) and loggerhead turtles (8.7%). Most olive ridley (81.7%) and loggerhead (82.1%) turtles were hooked, while the leatherbacks were both hooked (44.0%) and entangled (31.8%). Depending on the species, 21.4% to 57.7% were dead when brought onboard. Most of the turtles were caught in tropical areas, especially in the Gulf of Guinea (15°N-10°S, 30°W-10°E), but loggerheads were caught in the south Atlantic Ocean (25°S-35°S, 40°W-10°E and 30°S-40°S, 55°W-45°W). The bycatch rate was highest at 0.030 per 1000 hooks for leatherbacks in the tropical area. The bycatch rates of olive ridley ranged from 0 to 0.010 per thousand hooks. The loggerhead bycatch rates were higher in the northern and southern Atlantic Ocean and ranged from 0.0128 to 0.0239 per thousand hooks. Due to the characteristics of the Taiwanese deep-set longline fleet, bycatch rates were lower than those of coastal longline fisheries, but mortality rates were higher because of the long hours of operation. Gear and bait modification should be considered to reduce sea turtle bycatch and increase survival rates, while reducing the use of shallow hooks would also be helpful.

  4. Conservation hotspots for the turtles on the high seas of the Atlantic Ocean.

    Science.gov (United States)

    Huang, Hsiang-Wen

    2015-01-01

Understanding the distribution of bycaught sea turtles could inform conservation strategies and priorities. This research analyses the distribution of turtles caught as longline fisheries bycatch on the high seas of the Atlantic Ocean. This research collected 18,142 bycatch observations and 47.1 million hooks from large-scale Taiwanese longline vessels in the Atlantic Ocean from June 2002 to December 2013. The coverage rates ranged from 0.48% to 17.54% by year. Seven hundred and sixty-seven turtles were caught, and the major species were leatherback (59.8%), olive ridley (27.1%) and loggerhead turtles (8.7%). Most olive ridley (81.7%) and loggerhead (82.1%) turtles were hooked, while the leatherbacks were both hooked (44.0%) and entangled (31.8%). Depending on the species, 21.4% to 57.7% were dead when brought onboard. Most of the turtles were caught in tropical areas, especially in the Gulf of Guinea (15°N-10°S, 30°W-10°E), but loggerheads were caught in the south Atlantic Ocean (25°S-35°S, 40°W-10°E and 30°S-40°S, 55°W-45°W). The bycatch rate was highest at 0.030 per 1000 hooks for leatherbacks in the tropical area. The bycatch rates of olive ridley ranged from 0 to 0.010 per thousand hooks. The loggerhead bycatch rates were higher in the northern and southern Atlantic Ocean and ranged from 0.0128 to 0.0239 per thousand hooks. Due to the characteristics of the Taiwanese deep-set longline fleet, bycatch rates were lower than those of coastal longline fisheries, but mortality rates were higher because of the long hours of operation. Gear and bait modification should be considered to reduce sea turtle bycatch and increase survival rates, while reducing the use of shallow hooks would also be helpful.
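The per-thousand-hooks rates quoted in this record are simple ratios. A worked example using the fleet-wide totals from the abstract (the species- and area-specific rates, such as 0.030 for tropical leatherbacks, are computed the same way within each stratum):

```python
# Fleet-wide bycatch rate from the abstract's totals.
turtles = 767          # turtles observed caught
hooks = 47.1e6         # hooks observed

rate_per_1000_hooks = turtles / hooks * 1000
print(round(rate_per_1000_hooks, 4))   # ≈ 0.0163 turtles per 1000 hooks
```

As expected, the aggregate rate sits between the abstract's olive ridley range (0 to 0.010) and the loggerhead range (0.0128 to 0.0239).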

  5. Linkage disequilibrium of evolutionarily conserved regions in the human genome

    Directory of Open Access Journals (Sweden)

    Johnson Todd A

    2006-12-01

Full Text Available Abstract Background The strong linkage disequilibrium (LD) recently found in genic or exonic regions of the human genome demonstrated that LD can be increased by evolutionary mechanisms that select for functionally important loci. This suggests that LD might be stronger in regions conserved among species than in non-conserved regions, since regions exposed to natural selection tend to be conserved. To assess this hypothesis, we used genome-wide polymorphism data from the HapMap project and investigated LD within DNA sequences conserved between the human and mouse genomes. Results Unexpectedly, we observed that LD was significantly weaker in conserved regions than in non-conserved regions. To investigate why, we examined sequence features that may distort the relationship between LD and conserved regions. We found that interspersed repeats, and not other sequence features, were associated with the weak LD tendency in conserved regions. To appropriately understand the relationship between LD and conserved regions, we removed the effect of repetitive elements and found that the high degree of sequence conservation was strongly associated with strong LD in coding regions but not with that in non-coding regions. Conclusion Our work demonstrates that the degree of sequence conservation does not simply increase LD as predicted by the hypothesis. Rather, it implies that purifying selection changes the polymorphic patterns of coding sequences but has little influence on the patterns of functional units such as regulatory elements present in non-coding regions, since the former are generally restricted by the constraint of maintaining a functional protein product across multiple exons while the latter may exist more as individually isolated units.

  6. Development of a 3D FEL code for the simulation of a high-gain harmonic generation experiment

    International Nuclear Information System (INIS)

    Biedron, S. G.

    1999-01-01

Over the last few years, there has been a growing interest in self-amplified spontaneous emission (SASE) free-electron lasers (FELs) as a means for achieving a fourth-generation light source. In order to correctly and easily simulate the many configurations that have been suggested, such as multi-segmented wigglers and the method of high-gain harmonic generation, we have developed a robust three-dimensional code. The specifics of the code, the comparison to the linear theory, as well as future plans will be presented.

  7. The modified high-energy transport code, HETC, and design calculations for the SSC [Superconducting Super Collider

    International Nuclear Information System (INIS)

    Alsmiller, R.G. Jr.; Alsmiller, F.S.; Gabriel, T.A.; Hermann, O.W.; Bishop, B.L.

    1988-01-01

    The proposed Superconducting Super Collider (SSC) will have two circulating proton beams, each with an energy of 20 TeV. In order to perform detector and shield design calculations at these higher energies that are as accurate as possible, it is necessary to incorporate in the calculations the best available information on differential particle production from hadron-nucleus collisions. In this paper, the manner in which this has been done in the High-Energy Transport Code HETC will be described and calculated results obtained with the modified code will be compared with experimental data. 10 refs., 1 fig

  8. THE END OF TRANSGENIC FOOD LABELING AND THE RIGHT TO INFORMATION CONSERVED BY THE CONSUMER DEFENSE CODE IN THE LIGHT OF THE FEDERAL CONSTITUTION OF 1988

    Directory of Open Access Journals (Sweden)

    Ingrid Lima Barbosa

    2018-03-01

Full Text Available Bill No. 4.148/08 intends to eliminate the requirement for the “T” symbol on the packaging of products containing more than one percent of GMOs in their composition, due to the negative connotation it allegedly carries, going against what is advocated by the Biosafety Law, the Consumer Defense Code, as well as the Federal Constitution. Thus, the objective is to analyze the consequences for consumers if the bill is eventually sanctioned, as well as whether there is an affront to the fundamental precepts listed in the Magna Carta and other legal diplomas, through the inductive method of research supported by the available bibliographic collection. It was concluded that, in addition to confronting the provisions of the Consumer Defense Code, there is also a violation of the Cartagena Protocol on Biosafety, as well as the material unconstitutionality of Bill No. 4.148/08, resulting from the affront to articles 5, XIV and XXXII, and 170, V of the Constitution.

  9. High Resolution Mapping of Soils and Landforms for the Desert Renewable Energy Conservation Plan (DRECP)

    Science.gov (United States)

    Potter, Christopher S.; Li, Shuang

    2014-01-01

    The Desert Renewable Energy Conservation Plan (DRECP), a major component of California's renewable energy planning efforts, is intended to provide effective protection and conservation of desert ecosystems, while allowing for the sensible development of renewable energy projects. This NASA mapping report was developed to support the DRECP and the Bureau of Land Management (BLM). We outline in this document remote sensing image processing methods to deliver new maps of biological soils crusts, sand dune movements, desert pavements, and sub-surface water sources across the DRECP area. We focused data processing first on the largely unmapped areas most likely to be used for energy developments, such as those within Renewable Energy Study Areas (RESA) and Solar Energy Zones (SEZs). We used imagery (multispectral and radar) mainly from the years 2009-2011.

  10. High-Quality 3d Models and Their Use in a Cultural Heritage Conservation Project

    Science.gov (United States)

    Tucci, G.; Bonora, V.; Conti, A.; Fiorini, L.

    2017-08-01

    Cultural heritage digitization and 3D modelling processes are mainly based on laser scanning and digital photogrammetry techniques to produce complete, detailed and photorealistic three-dimensional surveys: geometric as well as chromatic aspects, in turn testimony of materials, work techniques, state of preservation, etc., are documented using digitization processes. The paper explores the topic of 3D documentation for conservation purposes; it analyses how geomatics contributes in different steps of a restoration process and it presents an overview of different uses of 3D models for the conservation and enhancement of the cultural heritage. The paper reports on the project to digitize the earthenware frieze of the Ospedale del Ceppo in Pistoia (Italy) for 3D documentation, restoration work support, and digital and physical reconstruction and integration purposes. The intent to design an exhibition area suggests new ways to take advantage of 3D data originally acquired for documentation and scientific purposes.

  11. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    CERN Document Server

    Böhlen, T T; Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Mairani, A; Sala, P R; Smirnov, G; Vlachoudis, V

    2014-01-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  12. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Science.gov (United States)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  13. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    Directory of Open Access Journals (Sweden)

    DeTar Carleton

    2018-01-01

    Full Text Available With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  14. A High-Accuracy Linear Conservative Difference Scheme for Rosenau-RLW Equation

    Directory of Open Access Journals (Sweden)

    Jinsong Hu

    2013-01-01

Full Text Available We study the initial-boundary value problem for the Rosenau-RLW equation. We propose a three-level linear finite difference scheme, which has a theoretical accuracy of O(τ² + h⁴). The scheme simulates two conservative properties of the original problem well. The existence and uniqueness of the difference solution, and a priori estimates in the infinity norm, are obtained. Furthermore, we analyze the convergence and stability of the scheme by the energy method. Finally, numerical experiments demonstrate the theoretical results.

  15. Hearing sensitivity in context: Conservation implications for a highly vocal endangered species

    OpenAIRE

    Owen, Megan A.; Keating, Jennifer L.; Denes, Samuel K.; Hawk, Kathy; Fiore, Angela; Thatcher, Julie; Becerra, Jennifer; Hall, Suzanne; Swaisgood, Ronald R.

    2016-01-01

    Hearing sensitivity is a fundamental determinant of a species’ vulnerability to anthropogenic noise, however little is known about the hearing capacities of most conservation dependent species. When audiometric data are integrated with other aspects of species’ acoustic ecology, life history, and characteristic habitat topography and soundscape, predictions can be made regarding probable vulnerability to the negative impacts of different types of anthropogenic noise. Here we used an adaptive ...

  16. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    Science.gov (United States)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

3D-High Efficiency Video Coding has introduced tools to obtain higher efficiency in 3-D video coding, most of which are related to depth map coding. Among these tools, depth modeling mode-1 (DMM-1) focuses on better encoding the edge regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in the DMM-1 hardware design of both encoder and decoder, since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements, and a hardware design targeting the most efficient among these algorithms, are presented. Experimental results demonstrate that the proposed solutions surpass related works, reducing up to 78.8% of the wedgelet memory without degrading the encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces almost 75% of the power dissipation when compared to the standard approach.
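The storage bottleneck can be made concrete with a toy enumeration. In this simplified model (a generic line-separable wedgelet, not the exact 3D-HEVC pattern set or block sizes), each wedgelet splits an N×N block along a line between two border positions, and one binary pattern of N×N bits must be stored per distinct wedgelet:

```python
from itertools import combinations

def border_positions(n):
    """Positions on the boundary of an n-by-n block."""
    return [(x, y) for x in range(n) for y in range(n)
            if x in (0, n - 1) or y in (0, n - 1)]

def wedgelet(n, p0, p1):
    """Binary pattern: which side of the p0->p1 line each pixel lies on."""
    (x0, y0), (x1, y1) = p0, p1
    return tuple((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) >= 0
                 for x in range(n) for y in range(n))

n = 8
candidates = list(combinations(border_positions(n), 2))
patterns = {wedgelet(n, a, b) for a, b in candidates}
bits = len(patterns) * n * n        # naive storage: one bit per pixel per pattern
print(len(patterns), bits)
```

Even this toy 8×8 case yields hundreds of candidate line pairs, and the standard supports multiple block sizes, which is why lossless storage-reduction strategies for the pattern memory matter in hardware.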

  17. High-resolution coded-aperture design for compressive X-ray tomography using low resolution detectors

    Science.gov (United States)

    Mojica, Edson; Pertuz, Said; Arguello, Henry

    2017-12-01

    One of the main challenges in Computed Tomography (CT) is obtaining accurate reconstructions of the imaged object while keeping a low radiation dose in the acquisition process. In order to solve this problem, several researchers have proposed the use of compressed sensing for reducing the amount of measurements required to perform CT. This paper tackles the problem of designing high-resolution coded apertures for compressed sensing computed tomography. In contrast to previous approaches, we aim at designing apertures to be used with low-resolution detectors in order to achieve super-resolution. The proposed method iteratively improves random coded apertures using a gradient descent algorithm subject to constraints in the coherence and homogeneity of the compressive sensing matrix induced by the coded aperture. Experiments with different test sets show consistent results for different transmittances, number of shots and super-resolution factors.
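The coherence constraint mentioned in this record can be made concrete: the mutual coherence of a sensing matrix is the largest normalized inner product between distinct columns, and aperture designs that lower it tend to improve compressed-sensing recovery. A minimal sketch with toy matrices (not actual CT system matrices):

```python
import math

def coherence(A):
    """Mutual coherence: max |<a_i, a_j>| / (|a_i| |a_j|) over distinct columns."""
    cols = list(zip(*A))                      # columns of the row-major matrix

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    best = 0.0
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            best = max(best, abs(dot(cols[i], cols[j]))
                       / (math.sqrt(dot(cols[i], cols[i]))
                          * math.sqrt(dot(cols[j], cols[j]))))
    return best

orthogonal = [[1, 0], [0, 1]]    # ideal: fully incoherent columns
correlated = [[1, 1], [0, 1]]    # overlapping columns raise coherence
print(coherence(orthogonal), coherence(correlated))
```

A gradient-descent design like the one described would evaluate a criterion of this kind (together with a homogeneity term) on the aperture-induced sensing matrix at each iteration and move the transmittances to reduce it.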

  18. High-dimensional structured light coding/decoding for free-space optical communications free of obstructions.

    Science.gov (United States)

    Du, Jing; Wang, Jian

    2015-11-01

Bessel beams carrying orbital angular momentum (OAM) with helical phase fronts exp(ilφ) (l = 0, ±1, ±2, …), where φ is the azimuthal angle and l corresponds to the topological number, are orthogonal to each other. This feature of Bessel beams provides a new dimension to code/decode data information on the OAM state of light, and the theoretical infinity of the topological number enables possible high-dimensional structured light coding/decoding for free-space optical communications. Moreover, Bessel beams are nondiffracting beams with the ability to recover by themselves after obstructions, which is important for free-space optical communications relying on line-of-sight operation. By utilizing the OAM and nondiffracting characteristics of Bessel beams, we experimentally demonstrate obstruction-free optical m-ary coding/decoding over a 12 m distance using visible Bessel beams in a free-space optical communication system. We also study the bit error rate (BER) performance of hexadecimal and 32-ary coding/decoding based on Bessel beams with different topological numbers. After receiving 500 symbols at the receiver side, a zero BER for hexadecimal coding/decoding is observed when the obstruction is placed along the propagation path of the light.
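The hexadecimal coding in this record amounts to mapping each 4-bit symbol onto one of 16 OAM topological charges. A minimal sketch with a hypothetical charge table (the record does not give the authors' actual symbol mapping):

```python
# Hypothetical symbol table: 16 usable topological charges, one per nibble.
CHARGES = list(range(-8, 0)) + list(range(1, 9))

def encode(data: bytes):
    """Map each 4-bit nibble of the payload to an OAM topological charge."""
    symbols = []
    for byte in data:
        symbols.append(CHARGES[byte >> 4])    # high nibble
        symbols.append(CHARGES[byte & 0xF])   # low nibble
    return symbols

def decode(symbols):
    """Recover the payload from the detected sequence of charges."""
    nibbles = [CHARGES.index(l) for l in symbols]
    return bytes((hi << 4) | lo for hi, lo in zip(nibbles[::2], nibbles[1::2]))

msg = b"OAM"
assert decode(encode(msg)) == msg
assert len(encode(msg)) == 2 * len(msg)   # one charge per 4 bits
```

32-ary coding extends the same idea to 32 charges (5 bits per symbol); the physical-layer advantage reported in the record is that the nondiffracting Bessel carrier self-heals after an obstruction, so the charge sequence survives.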

  19. Innovative Surgical Management of the Synovial Chondromatosis of Temporo-Mandibular Joints: Highly Conservative Surgical Technique.

    Science.gov (United States)

    Ionna, Franco; Amantea, Massimiliano; Mastrangelo, Filiberto; Ballini, Andrea; Maglione, Maria Grazia; Aversa, Corrado; De Cecio, Rossella; Russo, Daniela; Marrelli, Massimo; Tatullo, Marco

    2016-07-01

Synovial chondromatosis (SC) is an uncommon disease characterized by a benign nodular cartilaginous proliferation arising from the joint synovium, bursae, or tendon sheaths. Although the temporomandibular joint (TMJ) is rarely affected by neoplastic lesions, SC is the most common neoplastic lesion of this joint. The treatment of this disease consists of extraoral surgery with wide removal of the lesion; in this study, the authors describe a more conservative intraoral surgical approach. Patients with SC of the temporomandibular joint typically report limited mouth opening, together with a persistent, non-physiological mandibular protrusion and the appearance of a neoformation in the right preauricular region: the authors report one illustrative patient. After biopsy of the neoformation confirmed the synovial chondromatosis, the patient thus underwent surgical excision of the tumor via the authors' conservative transoral approach, which facilitates enucleation of the neoformation. The mass fully involved the pterygo-maxillary fossa, with involvement of the parotid lodge and of the right TMJ: this multifocal extension suggested a transoral surgical procedure, in light of the suspicion of a possible malignant nature of the neoplasm. Our intraoral conservative approach to surgery aims to reduce unaesthetic scars in the preauricular and facial regions, with surgical results comparable to those of the much more aggressive traditional surgical techniques. Our technique could be a valid, alternative, and safe approach to treating this rare and complex kind of oncological disease.

  20. Spallation integral experiment analysis by high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi; Meigo, Shin-ichiro; Sasa, Toshinobu; Fukahori, Tokio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Yoshizawa, Nobuaki; Furihata, Shiori; Belyakov-Bodin, V.I.; Krupny, G.I.; Titarenko, Y.E.

    1997-03-01

    Reaction rate distributions were measured with various activation detectors on the cylindrical surface of a thick tungsten target, 20 cm in diameter and 60 cm in length, bombarded with 0.895 and 1.21 GeV protons. The experimental results were analyzed with the Monte Carlo simulation code systems NMTC/JAERI-MCNP-4A, LAHET, and HERMES. It is confirmed that these code systems can reproduce the reaction rate distributions with C/E ratios of 0.6 to 1.4 at positions up to 30 cm from the beam incident surface. (author)

  1. Absorbed dose determination in high energy photon beams using new IAEA TRS - 398 Code of Practice

    International Nuclear Information System (INIS)

    Suriyapee, S.; Srimanoroath, S.; Jumpangern, C.

    2002-01-01

    The absorbed dose calibration of 6 and 10 MV X-ray beams from a Varian Clinac 1800 at King Chulalongkorn Memorial Hospital, Bangkok, Thailand was performed using a 0.6 cc NE2571 cylindrical chamber (serial no. 1633) with graphite wall and Delrin build-up cap, together with an Ionex Dosemaster NE2590 electrometer (serial no. 223). The absorbed dose determination followed the IAEA code of practice TRS-277. The new IAEA code of practice TRS-398 was then studied in order to compare its results with those of TRS-277
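
    The TRS-398 code of practice determines absorbed dose to water from a corrected chamber reading, a calibration coefficient, and a beam-quality correction factor. The sketch below illustrates that formalism; all numeric values are hypothetical placeholders, not the calibration data of this study.

```python
# Illustrative sketch of the IAEA TRS-398 absorbed-dose-to-water formalism:
# D_w,Q = M_Q * N_D,w * k_Q, where M_Q is the fully corrected chamber
# reading, N_D,w the calibration coefficient, and k_Q the beam-quality
# correction factor.  All numeric values below are hypothetical.

def absorbed_dose_to_water(m_raw, k_tp, k_elec, k_pol, k_s, n_dw, k_q):
    """Return absorbed dose to water (Gy) from an ionisation-chamber reading."""
    m_q = m_raw * k_tp * k_elec * k_pol * k_s   # fully corrected reading (nC)
    return m_q * n_dw * k_q                     # Gy

dose = absorbed_dose_to_water(
    m_raw=20.0,    # raw reading, nC (hypothetical)
    k_tp=1.005,    # temperature-pressure correction
    k_elec=1.000,  # electrometer calibration factor
    k_pol=1.001,   # polarity correction
    k_s=1.002,     # ion recombination correction
    n_dw=0.0453,   # calibration coefficient, Gy/nC (hypothetical)
    k_q=0.992,     # beam-quality correction (hypothetical for 6 MV)
)
print(round(dose, 4))
```

    In practice each correction factor is measured or taken from the TRS-398 tables for the specific chamber and beam quality.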

  2. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta

    2011-01-01

    The paper addresses the problem of distribution of high-definition video over fiber-wireless networks. The physical layer architecture with a low-complexity envelope detection solution is investigated. We present both experimental studies and simulation of high-quality, high-definition compressed video transmission over a 60 GHz fiber-wireless link. Using advanced video coding, we satisfy low-complexity and low-delay constraints while preserving superb video quality over a significantly extended wireless distance. © 2011 Optical Society of America.

  3. Verification of SIGACE code for generating ACE format cross-section files with continuous energy at high temperature

    International Nuclear Information System (INIS)

    Li Zhifeng; Yu Tao; Xie Jinsen; Qin Mian

    2012-01-01

    Based on the recently released ENDF/B-VII.1 library, high temperature neutron cross-section files are generated with the SIGACE code from low temperature ACE format files. To verify the ACE files processed by SIGACE, benchmark calculations are performed in this paper. The calculated results for the selected ICT, standard CANDU assembly, LWR Doppler coefficient, and SEFOR benchmarks conform well with reference values, which indicates that high temperature ACE files processed by SIGACE can be used in related neutronics calculations. (authors)

  4. Characterization of high-power RF structures using time-domain field codes

    International Nuclear Information System (INIS)

    Shang, C.C.; DeFord, J.F.; Swatloski, T.L.

    1992-01-01

    We have modeled gyrotron windows and gyrotron amplifier sever structures for TE modes in the 100-150 GHz range and have computed the reflection and transmission characteristics from the field data. Good agreement with frequency-domain codes and analytic analysis has been obtained for some simple geometries. We present results for realistic structures with lossy coatings and describe the implementation of microwave diagnostics.

  5. The importance of incorporating functional habitats into conservation planning for highly mobile species in dynamic systems.

    Science.gov (United States)

    Webb, Matthew H; Terauds, Aleks; Tulloch, Ayesha; Bell, Phil; Stojanovic, Dejan; Heinsohn, Robert

    2017-10-01

    The distribution of mobile species in dynamic systems can vary greatly over time and space. Estimating their population size and geographic range can be problematic and can affect the accuracy of conservation assessments. Scarce data on mobile species and the resources they need can also limit the type of analytical approaches available to derive such estimates. We quantified change in the availability and use of key ecological resources required for breeding by a critically endangered nomadic habitat specialist, the Swift Parrot (Lathamus discolor). We compared estimates of occupied habitat derived from dynamic presence-background (i.e., presence-only data) climatic models with estimates derived from dynamic occupancy models that included a direct measure of food availability. We then compared estimates that incorporate fine-resolution spatial data on the availability of key ecological resources (i.e., functional habitats) with more common approaches that focus on broader climatic suitability or vegetation cover (used in the absence of fine-resolution data). The occupancy models produced significantly different estimates, and an increase or decrease in the area of one functional habitat (foraging or nesting) did not necessarily correspond to an increase or decrease in the other. Thus, an increase in the extent of occupied area may not equate to improved habitat quality or function. We argue these patterns are typical for mobile resource specialists but often go unnoticed because of limited data over relevant spatial and temporal scales and a lack of spatial data on the availability of key resources. Understanding changes in the relative availability of functional habitats is crucial to informing conservation planning and accurately assessing extinction risk for mobile resource specialists. © 2017 Society for Conservation Biology.

  6. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    International Nuclear Information System (INIS)

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-01-01

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory-intensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
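
    The core of any multislice code is the alternation between transmission through thin phase gratings and Fresnel propagation between slices. The sketch below illustrates that loop in a few lines of NumPy; the grid, potentials, and parameters are hypothetical and this is not STEMsalabim's actual API or parallel implementation.

```python
import numpy as np

# Minimal sketch of the multislice algorithm underlying STEM image
# simulation: the electron wave is alternately transmitted through thin
# phase gratings and Fresnel-propagated between slices via FFT.
# Grid size, wavelength, and potentials below are illustrative only.

n, dx = 128, 0.2e-10              # grid points, pixel size (m)
wavelength = 2.51e-12             # ~200 keV electron wavelength (m)
dz = 2.0e-10                      # slice thickness (m)

kx = np.fft.fftfreq(n, d=dx)
k2 = kx[:, None] ** 2 + kx[None, :] ** 2
propagator = np.exp(-1j * np.pi * wavelength * dz * k2)  # Fresnel kernel

rng = np.random.default_rng(0)
psi = np.ones((n, n), dtype=complex)           # incident plane wave
for _ in range(10):                            # 10 slices
    sigma_vz = rng.normal(0.0, 0.05, (n, n))   # projected potential * sigma
    psi *= np.exp(1j * sigma_vz)               # phase-grating transmission
    psi = np.fft.ifft2(np.fft.fft2(psi) * propagator)  # propagate by dz

intensity = np.abs(psi) ** 2
print(round(intensity.mean(), 6))   # ~1.0: a pure phase object conserves flux
```

    A frozen-lattice calculation repeats this loop over many thermally displaced atomic configurations and over every probe position, which is what makes parallelization across cluster nodes worthwhile.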

  7. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens

    Energy Technology Data Exchange (ETDEWEB)

    Oelerich, Jan Oliver, E-mail: jan.oliver.oelerich@physik.uni-marburg.de; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D.; Volz, Kerstin

    2017-06-15

    Highlights: • We present STEMsalabim, a modern implementation of the multislice algorithm for simulation of STEM images. • Our package is highly parallelizable on high-performance computing clusters, combining shared and distributed memory architectures. • With STEMsalabim, computationally and memory-intensive STEM image simulations can be carried out within reasonable time. - Abstract: We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  8. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    Science.gov (United States)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core rod antennas. This quasi-static analysis is applicable at low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation of finite thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code can predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the complex permeability corresponding to the desired core magnetic flux density. To use this linear modeling code to model the effects of nonlinear core materials, it is necessary to use the correct complex permeability for a specific core magnetic flux density. To test the modeling code, we demonstrated that it can accurately predict changes in the electrical parameters associated with variations in rod length and core thickness for antennas made of low-carbon steel wire. These tests demonstrate that the modeling code successfully predicted the changes in rod antenna characteristics under high-current nonlinear conditions due to changes in the physical dimensions of the rod, provided that the flux density in the core was held constant to keep the complex permeability from changing.
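
    The paper's quasi-static algorithm is not reproduced here, but a standard textbook approximation for a finite permeable rod captures the key effect it models: the demagnetizing field of the finite core caps the apparent ("rod") permeability, and hence the achievable magnetic moment. The sketch below uses the prolate-spheroid demagnetizing factor; all dimensions and material values are illustrative assumptions.

```python
import math

# Hedged sketch (not the paper's algorithm): a finite permeable rod antenna
# is often approximated as a prolate spheroid.  The axial demagnetizing
# factor D limits the apparent permeability to mu_rod = mu_r/(1 + D(mu_r-1)),
# and the magnetic moment is m = mu_rod * N_turns * I * A.
# All dimensions and material values below are illustrative.

def demag_factor(aspect):
    """Axial demagnetizing factor of a prolate spheroid (aspect = length/diameter)."""
    s = math.sqrt(aspect**2 - 1.0)
    return (aspect / s * math.log(aspect + s) - 1.0) / (aspect**2 - 1.0)

def rod_moment(mu_r, aspect, n_turns, current, radius):
    d = demag_factor(aspect)
    mu_rod = mu_r / (1.0 + d * (mu_r - 1.0))   # apparent (rod) permeability
    area = math.pi * radius**2
    return mu_rod * n_turns * current * area   # magnetic moment, A*m^2

# For a long rod the gain saturates: even mu_r = 10000 yields mu_rod ~ 1/D.
m = rod_moment(mu_r=10000.0, aspect=20.0, n_turns=500, current=2.0, radius=0.01)
print(round(m, 2))
```

    This saturation of the apparent permeability is one reason the rod's physical dimensions, rather than the raw material permeability, dominate the achievable moment.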

  9. Using clinical data to predict high-cost performance coding issues associated with pressure ulcers: a multilevel cohort model.

    Science.gov (United States)

    Padula, William V; Gibbons, Robert D; Pronovost, Peter J; Hedeker, Donald; Mishra, Manish K; Makic, Mary Beth F; Bridges, John Fp; Wald, Heidi L; Valuck, Robert J; Ginensky, Adam J; Ursitti, Anthony; Venable, Laura Ruth; Epstein, Ziv; Meltzer, David O

    2017-04-01

    Hospital-acquired pressure ulcers (HAPUs) have a mortality rate of 11.6%, are costly to treat, and result in Medicare reimbursement penalties. Medicare codes HAPUs according to Agency for Healthcare Research and Quality Patient-Safety Indicator 3 (PSI-03), but they are sometimes inappropriately coded. The objective is to use electronic health records to predict pressure ulcers and to identify coding issues leading to penalties. We evaluated all hospitalized-patient electronic medical records in an academic medical center data repository between 2011 and 2014. These data contained patient encounter-level demographic variables, diagnoses, prescription drugs, and provider orders. HAPUs were defined by PSI-03: stage III, IV, or unstageable pressure ulcers not present on admission as a secondary diagnosis, excluding cases of paralysis. Random forests reduced data dimensionality. Multilevel logistic regression of patient encounters evaluated associations between covariates and HAPU incidence. The approach produced a sample population of 21 153 patients with 1549 PSI-03 cases. The greatest odds ratio (OR) of HAPU incidence was among patients diagnosed with spinal cord injury (ICD-9 907.2: OR = 14.3), cases at times coded without paralysis and thus leading to a PSI-03 flag. Other high ORs included bed confinement (ICD-9 V49.84: OR = 3.1), likewise coded without paralysis and leading to PSI-03 flags. The resulting statistical model can be tested to predict HAPUs during hospitalization. Inappropriate coding of conditions leads to poor hospital performance measures and Medicare reimbursement penalties. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
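
    The study's multilevel logistic regression is not reproduced here, but the odds-ratio figures it reports (e.g. OR = 14.3) come from the standard cross-product calculation. The sketch below shows that calculation with Woolf's log-scale confidence interval; the 2x2 counts are hypothetical, not the study's data.

```python
import math

# Minimal sketch of the odds-ratio computation behind figures like
# "OR = 14.3" in the abstract.  The 2x2 counts below are hypothetical;
# the study itself estimated ORs with multilevel logistic regression.

def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a,b = cases/non-cases among exposed; c,d among unexposed."""
    return (a * d) / (b * c)

def ci95_log(or_value, a, b, c, d):
    """95% confidence interval on the log-odds scale (Woolf's method)."""
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(math.log(or_value) - 1.96 * se),
            math.exp(math.log(or_value) + 1.96 * se))

a, b, c, d = 30, 70, 300, 10000   # hypothetical HAPU counts by exposure
or_ = odds_ratio(a, b, c, d)
lo, hi = ci95_log(or_, a, b, c, d)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

    A regression-based OR additionally adjusts for covariates and, in the multilevel case, for clustering of encounters within patients and units.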

  10. Status of the development of a fully integrated code system for the simulation of high temperature reactor cores

    Energy Technology Data Exchange (ETDEWEB)

    Kasselmann, Stefan, E-mail: s.kasselmann@fz-juelich.de [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Druska, Claudia [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Herber, Stefan [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany); Jühe, Stephan [Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany); Keller, Florian; Lambertz, Daniela; Li, Jingjing; Scholthaus, Sarah; Shi, Dunfu [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Xhonneux, Andre; Allelein, Hans-Josef [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany)

    2014-05-01

    The HTR code package (HCP) is a new code system, which couples a variety of stand-alone codes for the simulation of different aspects of HTR. HCP will allow the steady-state and transient operating conditions of a 3D reactor core to be simulated including new features such as spatially resolved fission product release calculations or production and transport of graphite dust. For this code the latest programming techniques and standards are applied. As a first step an object-oriented data model was developed which features a high level of readability because it is based on problem-specific data types like Nuclide, Reaction, ReactionHandler, CrossSectionSet, etc. Those classes help to encapsulate and therefore hide specific implementations, which are not relevant with respect to physics. HCP will make use of one consistent data library for which an automatic generation tool was developed. The new data library consists of decay information, cross sections, fission yields, scattering matrices etc. for all available nuclides (e.g. ENDF/B-VII.1). The data can be stored in different formats such as binary, ASCII or XML. The new burn up code TNT (Topological Nuclide Transmutation) applies graph theory to represent nuclide chains and to minimize the calculation effort when solving the burn up equations. New features are the use of energy-dependent fission yields or the calculation of thermal power for decay, fission and capture reactions. With STACY (source term analysis code system) the fission product release for steady state as well as accident scenarios can be simulated for each fuel batch. For a full-core release calculation several thousand fuel elements are tracked while passing through the core. This models the stochastic behavior of a pebble bed in a realistic manner. In this paper we report on the current status of the HCP and present first results, which prove the applicability of the selected approach.
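
    TNT's internals are not public, but the abstract's idea of applying graph theory to nuclide chains can be illustrated simply: nuclides are nodes, decay channels are weighted edges, and the coupled burnup (Bateman) equations are integrated along the graph. The two-nuclide chain, decay constants, and explicit-Euler integrator below are hypothetical simplifications, not TNT's solver.

```python
import math

# Hedged sketch of a graph-based burnup solver in the spirit of TNT:
# nuclides are nodes, decay/transmutation channels are weighted edges,
# and dN_i/dt = -lambda_i*N_i + sum_j lambda_ji*N_j is integrated.
# The chain, decay constants, and Euler stepping below are illustrative.

# Directed graph: parent -> list of (daughter, decay constant in 1/s)
chain = {
    "A": [("B", 1.0e-3)],   # A decays to B
    "B": [("C", 5.0e-4)],   # B decays to stable C
    "C": [],
}

def burnup_step(n, dt):
    """One explicit-Euler step of the burnup equations over the graph."""
    dn = {k: 0.0 for k in n}
    for parent, edges in chain.items():
        for daughter, lam in edges:
            loss = lam * n[parent]
            dn[parent] -= loss       # parent depletion
            dn[daughter] += loss     # daughter build-up
    return {k: n[k] + dt * dn[k] for k in n}

n = {"A": 1.0e6, "B": 0.0, "C": 0.0}
t, dt = 0.0, 1.0
while t < 2000.0:
    n = burnup_step(n, dt)
    t += dt

exact_a = 1.0e6 * math.exp(-1.0e-3 * 2000.0)  # analytic decay of nuclide A
print(round(n["A"]), round(exact_a))
```

    A production solver would traverse the graph to detect cycles and linear sub-chains and apply analytic Bateman solutions or a stiff integrator instead of explicit Euler; the graph representation is what keeps the computational effort proportional to the reachable part of the chain.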

  11. Highly selective BSA imprinted polyacrylamide hydrogels facilitated by a metal-coding MIP approach.

    Science.gov (United States)

    El-Sharif, H F; Yapati, H; Kalluru, S; Reddy, S M

    2015-12-01

    We report the fabrication of metal-coded molecularly imprinted polymers (MIPs) using hydrogel-based protein imprinting techniques. A Co(II) complex was prepared using (E)-2-((2 hydrazide-(4-vinylbenzyl)hydrazono)methyl)phenol; along with iron(III) chloroprotoporphyrin (Hemin), vinylferrocene (VFc), zinc(II) protoporphyrin (ZnPP), and protoporphyrin (PP), these complexes were introduced into the MIPs as co-monomers for metal-coding of non-metalloprotein imprints. Results indicate a 66% enhancement of bovine serum albumin (BSA) protein binding capacities (Q, mg/g) via metal-ion/ligand exchange properties within the metal-coded MIPs. Specifically, Co(II)-complex-based MIPs exhibited 92 ± 1% specific binding, with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The selectivity of our Co(II)-coded BSA MIPs was also tested using bovine haemoglobin (BHb), lysozyme (Lyz), and trypsin (Tryp). By evaluating imprinting factors, each of the latter proteins was found to have a lower affinity than the cognate BSA template. The hydrogels were further characterised by thermal analysis and differential scanning calorimetry (DSC) to assess optimum polymer composition. The development of hydrogel-based molecularly imprinted polymer (HydroMIP) technology for the memory imprinting of proteins and for protein biosensor development presents many possibilities, including uses in bio-sample clean-up or selective extraction, replacement of biological antibodies in immunoassays, and biosensors for medicine and the environment. Biosensors for proteins and viruses are currently expensive to develop because they require the use of expensive antibodies. Because of their biomimicry capabilities (and their potential to act as synthetic antibodies), HydroMIPs potentially offer a route to the development of new low-cost biosensors. Herein, a metal ion-mediated imprinting approach was employed to metal-code our hydrogel-based MIPs.

  12. NFAT5 regulates HIV-1 in primary monocytes via a highly conserved long terminal repeat site.

    Directory of Open Access Journals (Sweden)

    Shahin Ranjbar

    2006-12-01

    Full Text Available To replicate, HIV-1 capitalizes on endogenous cellular activation pathways resulting in recruitment of key host transcription factors to its viral enhancer. RNA interference has been a powerful tool for blocking key checkpoints in HIV-1 entry into cells. Here we apply RNA interference to HIV-1 transcription in primary macrophages, a major reservoir of the virus, and specifically target the transcription factor NFAT5 (nuclear factor of activated T cells 5, which is the most evolutionarily divergent NFAT protein. By molecularly cloning and sequencing isolates from multiple viral subtypes, and performing DNase I footprinting, electrophoretic mobility shift, and promoter mutagenesis transfection assays, we demonstrate that NFAT5 functionally interacts with a specific enhancer binding site conserved in HIV-1, HIV-2, and multiple simian immunodeficiency viruses. Using small interfering RNA to ablate expression of endogenous NFAT5 protein, we show that the replication of three major HIV-1 viral subtypes (B, C, and E is dependent upon NFAT5 in human primary differentiated macrophages. Our results define a novel host factor-viral enhancer interaction that reveals a new regulatory role for NFAT5 and defines a functional DNA motif conserved across HIV-1 subtypes and representative simian immunodeficiency viruses. Inhibition of the NFAT5-LTR interaction may thus present a novel therapeutic target to suppress HIV-1 replication and progression of AIDS.

  13. Proteomic Analysis of Pathogenic Fungi Reveals Highly Expressed Conserved Cell Wall Proteins

    Directory of Open Access Journals (Sweden)

    Jackson Champer

    2016-01-01

    Full Text Available We are presenting a quantitative proteomics tally of the most commonly expressed conserved fungal proteins of the cytosol, the cell wall, and the secretome. It was our goal to identify fungi-typical proteins that do not share significant homology with human proteins. Such fungal proteins are of interest to the development of vaccines or drug targets. Protein samples were derived from 13 fungal species, cultured in rich or in minimal media; these included clinical isolates of Aspergillus, Candida, Mucor, Cryptococcus, and Coccidioides species. Proteomes were analyzed by quantitative MSE (Mass Spectrometry—Elevated Collision Energy). Several thousand proteins were identified and quantified in total across all fractions and culture conditions. The 42 most abundant proteins identified in fungal cell walls or supernatants shared no to very little homology with human proteins. In contrast, all but five of the 50 most abundant cytosolic proteins had human homologs, with sequence identity averaging 59%. Proteomic comparisons of the secreted or surface-localized fungal proteins highlighted conserved homologs of the Aspergillus fumigatus proteins 1,3-β-glucanosyltransferases (Bgt1, Gel1-4), Crf1, Ecm33, EglC, and others. The fact that Crf1 and Gel1 were previously shown to be promising vaccine candidates underlines the value of the proteomics data presented here.

  14. The conservation value of South East Asia's highly degraded forests: evidence from leaf-litter ants

    Science.gov (United States)

    Woodcock, Paul; Edwards, David P.; Fayle, Tom M.; Newton, Rob J.; Khen, Chey Vun; Bottrell, Simon H.; Hamer, Keith C.

    2011-01-01

    South East Asia is widely regarded as a centre of threatened biodiversity owing to extensive logging and forest conversion to agriculture. In particular, forests degraded by repeated rounds of intensive logging are viewed as having little conservation value and are afforded meagre protection from conversion to oil palm. Here, we determine the biological value of such heavily degraded forests by comparing leaf-litter ant communities in unlogged (natural) and twice-logged forests in Sabah, Borneo. We accounted for impacts of logging on habitat heterogeneity by comparing species richness and composition at four nested spatial scales, and examining how species richness was partitioned across the landscape in each habitat. We found that twice-logged forest had fewer species occurrences, lower species richness at small spatial scales and altered species composition compared with natural forests. However, over 80 per cent of species found in unlogged forest were detected within twice-logged forest. Moreover, greater species turnover among sites in twice-logged forest resulted in identical species richness between habitats at the largest spatial scale. While two intensive logging cycles have negative impacts on ant communities, these degraded forests clearly provide important habitat for numerous species and preventing their conversion to oil palm and other crops should be a conservation priority. PMID:22006966
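
    The abstract's key result, identical landscape-scale richness despite lower within-site richness and greater turnover in logged forest, is an instance of additive diversity partitioning: gamma = mean(alpha) + beta. The sketch below illustrates that arithmetic with hypothetical site-by-species records, not the study's data.

```python
# Minimal sketch of additive diversity partitioning of the kind used to
# compare species richness across spatial scales: gamma = mean(alpha) + beta,
# so beta (turnover among sites) is landscape richness minus the mean
# within-site richness.  The ant records below are hypothetical.

sites = {
    "unlogged_1": {"antA", "antB", "antC", "antD"},
    "unlogged_2": {"antA", "antB", "antC", "antE"},
    "logged_1":   {"antA", "antB", "antF"},
    "logged_2":   {"antA", "antC", "antG"},
}

def partition(site_names):
    assemblages = [sites[s] for s in site_names]
    gamma = len(set().union(*assemblages))              # landscape richness
    mean_alpha = sum(len(a) for a in assemblages) / len(assemblages)
    beta = gamma - mean_alpha                           # among-site turnover
    return gamma, mean_alpha, beta

print(partition(["unlogged_1", "unlogged_2"]))
print(partition(["logged_1", "logged_2"]))
```

    In this toy example the logged sites have lower alpha but higher beta, so both habitats reach the same gamma, mirroring the pattern the study reports at its largest spatial scale.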

  15. Characterization of high-power RF structures using time-domain field codes

    International Nuclear Information System (INIS)

    Shang, C.C.; DeFord, J.F.; Swatloski, T.L.

    1992-01-01

    We have modeled gyrotron windows and gyrotron amplifier sever structures for TE modes in the 100-150 GHz range and have computed the reflection and transmission characteristics from the field data. Good agreement with frequency-domain codes and analytic analysis has been obtained for some simple geometries. We present results for realistic structures with lossy coatings and describe the implementation of microwave diagnostics. (Author) 5 figs., 7 refs

  16. THR-TH: a high-temperature gas-cooled nuclear reactor core thermal hydraulics code

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1984-07-01

    The ORNL version of PEBBLE, the (RZ) pebble bed thermal hydraulics code, has been extended for application to a prismatic gas-cooled reactor core. The supplemental treatment is of one-dimensional coolant flow in up to a three-dimensional core description. Power density data from a neutronics and exposure calculation are used as the basic information for the thermal hydraulics calculation of heat removal. Two-dimensional neutronics results may be expanded for a three-dimensional hydraulics calculation. The geometric description for the hydraulics problem is the same as that used by the neutronics code. A two-dimensional thermal cell model is used to predict temperatures in the fuel channel. The capability is available in the local BOLD VENTURE computation system for reactor core analysis, with the ability to account for the effect of temperature feedback by nuclear cross section correlation. Some enhancements have also been added to the original code to add pebble bed modeling flexibility and to generate useful auxiliary results. For example, an estimate is made of the distribution of fuel temperatures based on average and extreme conditions regularly calculated at a number of locations.
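
    The one-dimensional coolant treatment such a code applies per channel is an axial energy balance: the coolant temperature rise across each node is the nodal power divided by the flow rate times the specific heat. The sketch below marches that balance up a single channel; the power shape, flow rate, and helium properties are illustrative assumptions, not THR-TH input.

```python
import math

# Hedged sketch of the 1-D coolant energy balance a core thermal-hydraulics
# code solves per channel:  m_dot * c_p * dT/dz = q'(z).
# Channel power, flow rate, and gas properties below are illustrative.

n_nodes = 50
height = 6.0                      # channel height (m)
m_dot = 0.02                      # coolant mass flow rate (kg/s)
cp = 5195.0                       # helium specific heat (J/kg/K)
t_inlet = 523.0                   # inlet temperature (K)
q_total = 50.0e3                  # total channel power (W)

# chopped-cosine axial power shape, normalised so the nodes sum to q_total
shape = [math.sin(math.pi * (i + 0.5) / n_nodes) for i in range(n_nodes)]
norm = q_total / sum(shape)
q = [norm * s for s in shape]     # W deposited in each axial node

t = [t_inlet]
for qi in q:                      # march node by node up the channel
    t.append(t[-1] + qi / (m_dot * cp))

print(round(t[-1], 1))            # outlet temperature (K)
```

    The axial shape changes where along the channel the temperature rises fastest, but the outlet temperature depends only on the total power and the flow rate.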

  17. Code structure for U-Mo fuel performance analysis in high performance research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Gwan Yoon; Cho, Tae Won; Lee, Chul Min; Sohn, Dong Seong [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Kyu Hong; Park, Jong Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A performance analysis model applicable to research reactor fuel is being developed from available models describing fuel performance phenomena observed in in-pile tests. We established the calculation algorithm and scheme to best predict fuel performance, using a radio-thermo-mechanically coupled system to consider fuel swelling, interaction layer growth, pore formation in the fuel meat, creep-induced fuel deformation, mass relocation, etc. In this paper, we present the general structure of the performance analysis code for typical research reactor fuel, together with advanced features such as a model to predict fuel failure induced by a combination of breakaway swelling and pore growth in the fuel meat. A thermo-mechanical code dedicated to the modeling of U-Mo dispersion fuel plates is under development in Korea to satisfy the demand for advanced performance analysis and safety assessment of the plates. The major physical phenomena during irradiation are considered in the code, such as interaction layer formation by fuel-matrix interdiffusion, fission-induced swelling of the fuel particles, mass relocation by fission-induced stress, and pore formation at the interface between the reaction product and the Al matrix.
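
    Interaction-layer growth by interdiffusion is commonly modeled with a parabolic (diffusion-limited) law, y² = k·t, with an Arrhenius-type rate constant. The sketch below illustrates that form; the pre-exponential factor and activation energy are hypothetical placeholders, not the code's fitted correlation.

```python
import math

# Hedged sketch of a parabolic interaction-layer (IL) growth law of the
# kind used in U-Mo dispersion fuel performance models:
#   y^2 = k * t,  with  k = A * exp(-Q / (R * T)).
# The constants A and Q below are hypothetical, not the code's values.

R = 8.314                      # gas constant, J/mol/K

def il_thickness(t_seconds, temp_k, a=1.0e-4, q=120.0e3):
    """Interaction-layer thickness (m) after time t at temperature T."""
    k = a * math.exp(-q / (R * temp_k))   # rate constant, m^2/s
    return math.sqrt(k * t_seconds)

# Diffusion-limited growth: doubling the time scales thickness by sqrt(2).
y1 = il_thickness(1.0e6, 500.0)
y2 = il_thickness(2.0e6, 500.0)
print(round(y2 / y1, 3))
```

    In-pile correlations typically replace time with fission rate or fluence and add fission-enhanced diffusion terms, but they retain this square-root growth character.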

  18. THR-TH: a high-temperature gas-cooled nuclear reactor core thermal hydraulics code

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1984-07-01

    The ORNL version of PEBBLE, the (RZ) pebble bed thermal hydraulics code, has been extended for application to a prismatic gas-cooled reactor core. The supplemental treatment is of one-dimensional coolant flow in up to a three-dimensional core description. Power density data from a neutronics and exposure calculation are used as the basic information for the thermal hydraulics calculation of heat removal. Two-dimensional neutronics results may be expanded for a three-dimensional hydraulics calculation. The geometric description for the hydraulics problem is the same as that used by the neutronics code. A two-dimensional thermal cell model is used to predict temperatures in the fuel channel. The capability is available in the local BOLD VENTURE computation system for reactor core analysis, with the ability to account for the effect of temperature feedback by nuclear cross section correlation. Some enhancements have also been added to the original code to add pebble bed modeling flexibility and to generate useful auxiliary results. For example, an estimate is made of the distribution of fuel temperatures based on average and extreme conditions regularly calculated at a number of locations.

  19. LIDAR pulse coding for high resolution range imaging at improved refresh rate.

    Science.gov (United States)

    Kim, Gunzung; Park, Yongwan

    2016-10-17

    In this study, a light detection and ranging (LIDAR) system was designed that codes pixel location information into its laser pulses using the direct-sequence optical code division multiple access (DS-OCDMA) method in conjunction with a scanning microelectromechanical system (MEMS) mirror. This LIDAR can measure distance continuously, without idle listening time for the return of reflected waves, because its laser pulses include pixel location information encoded with DS-OCDMA; it therefore emits in each bearing direction without waiting for the previous reflected wave to return. The MEMS mirror is used to deflect and steer the coded laser pulses in the desired bearing direction. The receiver digitizes the received reflected pulses using a low-temperature-grown (LTG) indium gallium arsenide (InGaAs) based photoconductive antenna (PCA) and a time-to-digital converter (TDC), and demodulates them using the DS-OCDMA. When all of the reflected waves corresponding to the pixels forming a range image have been received, the proposed LIDAR generates a point cloud based on the time-of-flight (ToF) of each reflected wave. The results of simulations of the proposed LIDAR are compared with simulations of existing LIDARs.
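
    The direct-sequence idea in the abstract can be illustrated in a few lines: the pixel coordinates are serialized to bits, each bit is spread with a pseudo-noise (PN) chip sequence, and the receiver recovers the bits by correlating the returned chip stream against the same sequence. The PN code, word widths, and majority-decision receiver below are illustrative assumptions, not the paper's actual code design.

```python
# Hedged sketch of the DS-OCDMA encoding described in the abstract:
# pixel location -> bits -> spreading with a PN chip sequence, and
# recovery by correlating each chip group against the same sequence.
# The 8-chip code and bit widths below are illustrative only.

PN = [1, 0, 1, 1, 0, 0, 1, 0]          # spreading code (hypothetical)

def spread(bits):
    """XOR each data bit with every chip of the PN sequence."""
    return [b ^ c for b in bits for c in PN]

def despread(chips):
    """Correlate each chip group against PN and decide each bit by majority."""
    bits = []
    for i in range(0, len(chips), len(PN)):
        group = chips[i:i + len(PN)]
        agree = sum(g == c for g, c in zip(group, PN))
        bits.append(0 if agree > len(PN) // 2 else 1)
    return bits

def pixel_to_bits(x, y, width=8):
    """Serialize pixel coordinates to a bit list, LSB first."""
    return [(x >> i) & 1 for i in range(width)] + [(y >> i) & 1 for i in range(width)]

tx_bits = pixel_to_bits(42, 107)
rx_bits = despread(spread(tx_bits))
print(rx_bits == tx_bits)   # pixel location survives the round trip
```

    Because each return carries its own pixel code, the receiver can attribute a reflection to its emission direction even when several pulses are in flight at once, which is what removes the idle listening time.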

  20. MOSRA-Light; high speed three-dimensional nodal diffusion code for vector computers

    Energy Technology Data Exchange (ETDEWEB)

    Okumura, Keisuke [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-10-01

    MOSRA-Light is a three-dimensional neutron diffusion calculation code for X-Y-Z geometry. It is based on the 4th-order polynomial nodal expansion method (NEM). As the 4th-order NEM is not sensitive to mesh size, accurate calculation is possible with coarse meshes of about 20 cm. The drastic decrease in the number of unknowns in a 3-dimensional problem results in very fast computation. Furthermore, it employs a newly developed computation algorithm, the 'boundary separated checkerboard sweep method', appropriate to vector computers. This method is very efficient because the speedup factor from vectorization increases as the scale of the problem becomes larger. The speedup factor compared to scalar calculation is from 20 to 40 in the case of a PWR core calculation. Considering both the effect of vectorization and that of the coarse-mesh method, the total speedup factor is more than 1000 compared with a conventional scalar code using the finite difference method. MOSRA-Light is available on most vector or scalar computers running UNIX or similar operating systems (e.g. free software like Linux). Users can easily install it with the help of the conversational-style installer. This report contains the general theory of NEM, the fast computation algorithm, benchmark calculation results, and detailed information on the usage of this code, including input data instructions and sample input data. (author)

  1. MOSRA-Light; high speed three-dimensional nodal diffusion code for vector computers

    International Nuclear Information System (INIS)

    Okumura, Keisuke

    1998-10-01

    MOSRA-Light is a three-dimensional neutron diffusion calculation code for X-Y-Z geometry. It is based on the 4th-order polynomial nodal expansion method (NEM). As the 4th-order NEM is not sensitive to mesh size, accurate calculation is possible with coarse meshes of about 20 cm. The drastic decrease in the number of unknowns in a 3-dimensional problem results in very fast computation. Furthermore, it employs a newly developed computation algorithm, the 'boundary separated checkerboard sweep method', appropriate to vector computers. This method is very efficient because the speedup factor from vectorization increases as the scale of the problem becomes larger. The speedup factor compared to scalar calculation is from 20 to 40 in the case of a PWR core calculation. Considering both the effect of vectorization and that of the coarse-mesh method, the total speedup factor is more than 1000 compared with a conventional scalar code using the finite difference method. MOSRA-Light is available on most vector or scalar computers running UNIX or similar operating systems (e.g. free software like Linux). Users can easily install it with the help of the conversational-style installer. This report contains the general theory of NEM, the fast computation algorithm, benchmark calculation results, and detailed information on the usage of this code, including input data instructions and sample input data. (author)
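
    The details of MOSRA-Light's boundary-separated checkerboard sweep are not given in the abstract, but the vectorization idea it exploits is the classic checkerboard (red-black) ordering: points of one colour depend only on points of the other colour, so each half-sweep becomes one long independent array operation instead of a serial loop. The sketch below demonstrates this with red-black Gauss-Seidel on a small 2D Poisson problem in NumPy; it is an illustration of the ordering, not MOSRA-Light's algorithm.

```python
import numpy as np

# Hedged sketch of why a checkerboard ("red-black") sweep vectorizes:
# red points depend only on black neighbours and vice versa, so each
# half-sweep updates all same-colour points in one array operation.
# Classic red-black Gauss-Seidel for -laplacian(u) = f on the unit square
# with u = 0 on the boundary (not MOSRA-Light's actual nodal algorithm).

n = 32
u = np.zeros((n, n))
f = np.ones((n, n))                       # uniform source
h2 = (1.0 / (n - 1)) ** 2                 # mesh spacing squared

ii, jj = np.indices((n, n))
red = ((ii + jj) % 2 == 0)                # checkerboard colouring
interior = np.zeros((n, n), dtype=bool)
interior[1:-1, 1:-1] = True

def half_sweep(u, colour):
    """Update every interior point of one colour simultaneously."""
    upd = colour & interior
    avg = np.zeros_like(u)
    avg[1:-1, 1:-1] = (u[:-2, 1:-1] + u[2:, 1:-1] +
                       u[1:-1, :-2] + u[1:-1, 2:]) / 4.0
    u[upd] = avg[upd] + h2 * f[upd] / 4.0
    return u

for _ in range(500):
    u = half_sweep(u, red)                # all red points at once
    u = half_sweep(u, ~red)               # then all black points

print(round(u[n // 2, n // 2], 4))        # near-converged centre value
```

    On a vector (or SIMD) machine the two masked updates stream through memory with no data dependence inside a half-sweep, which is why the speedup grows with problem size, as the abstract notes.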

  2. Impacts of Tropical Forest Disturbance Upon Avifauna on a Small Island with High Endemism: Implications for Conservation

    Directory of Open Access Journals (Sweden)

    Martin Thomas

    2010-01-01

    Full Text Available Tropical forests are rapidly being lost across Southeast Asia and this is predicted to have severe implications for many of the region's bird species. However, relationships between forest disturbance and avifaunal assemblages remain poorly understood, particularly on small island ecosystems such as those found in the biodiversity 'hotspot' of Wallacea. This study examines how avifaunal richness varies across a disturbance gradient in a forest reserve on Buton Island, southeast Sulawesi. Particular emphasis is placed upon examining responses in endemic and red-listed species with high conservation importance. Results indicate that overall avian richness increases between primary and 30-year-old regenerating secondary forest and then decreases through disturbed secondary forest, but is highest in cleared farmland. However, high species richness in farmland does not signify high species distinctiveness; bird community composition here differs significantly from that found in forest sites, and is poor in supporting forest specialists and endemic species. Certain large-bodied endemics such as the Knobbed Hornbill (Rhyticeros cassidix) appear to be sensitive to moderate disturbance, with populations occurring at greatest density within primary forest. However, overall endemic species richness, as well as that of endemic frugivores and insectivores, is similar in primary and secondary forest types. Results indicate that well-established secondary forest in particular has an important role in supporting species with high conservation importance, possessing a community composition similar to that found in primary forest and supporting an equally high richness of endemic species.

  3. Development of three-dimensional neoclassical transport simulation code with high performance Fortran on a vector-parallel computer

    International Nuclear Information System (INIS)

    Satake, Shinsuke; Okamoto, Masao; Nakajima, Noriyoshi; Takamaru, Hisanori

    2005-11-01

    A neoclassical transport simulation code (FORTEC-3D) applicable to three-dimensional configurations has been developed using High Performance Fortran (HPF). The adoption of parallelization techniques and of a hybrid simulation model for the δf Monte Carlo transport simulation, including non-local transport effects in three-dimensional configurations, makes it possible to simulate the dynamics of global, non-local transport phenomena with a self-consistent radial electric field within a reasonable computation time. In this paper, the development of the transport code using HPF is reported. Optimization techniques for achieving both high vectorization and parallelization efficiency, the adoption of a parallel random number generator, and benchmark results are presented. (author)

  4. Noise exposure and hearing conservation practices in an industry with high incidence of workers' compensation claims for hearing loss.

    Science.gov (United States)

    Daniell, William E; Swan, Susan S; McDaniel, Mary M; Stebbins, John G; Seixas, Noah S; Morgan, Michael S

    2002-10-01

    Washington State has experienced a striking increase in workers' compensation claims for hearing loss. This cross-sectional study examined noise exposures and hearing conservation practices in one industry with a high rate of hearing loss claims. We evaluated 10 representative foundries with personal noise dosimetry, management interviews, employee interviews, and existing audiometry. Noise levels routinely exceeded 85 dBA. All companies were out of compliance with hearing conservation regulations. Most employees with important findings on audiograms were not aware of their findings. There was a significant positive correlation between management-interview scores and worksite-average employee-interview scores (r = 0.70, P = 0.02). Companies where more effort is put into hearing conservation program activities can achieve a greater positive impact on employee awareness. However, there were broad deficiencies even in the better programs in this sample, suggesting that workers in this industry probably face a continuing substantial risk of occupational hearing loss. Copyright 2002 Wiley-Liss, Inc.

  5. Universal antibodies against the highly conserved influenza fusion peptide cross-neutralize several subtypes of influenza A virus

    Energy Technology Data Exchange (ETDEWEB)

    Hashem, Anwar M. [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Department of Microbiology, Faculty of Medicine, King Abdulaziz University, Jeddah (Saudi Arabia); Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON (Canada); Van Domselaar, Gary [National Microbiology Laboratory, Public Health Agency of Canada, Winnipeg, MB (Canada); Li, Changgui; Wang, Junzhi [National Institute for the Control of Pharmaceutical and Biological Products, Beijing (China); She, Yi-Min; Cyr, Terry D. [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Sui, Jianhua [Department of Cancer Immunology and AIDS, Dana-Farber Cancer Institute, Department of Medicine, Harvard Medical School, 44 Binney Street, Boston, MA 02115 (United States); He, Runtao [National Microbiology Laboratory, Public Health Agency of Canada, Winnipeg, MB (Canada); Marasco, Wayne A. [Department of Cancer Immunology and AIDS, Dana-Farber Cancer Institute, Department of Medicine, Harvard Medical School, 44 Binney Street, Boston, MA 02115 (United States); Li, Xuguang, E-mail: Sean.Li@hc-sc.gc.ca [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON (Canada)

    2010-12-10

    Research highlights: → The fusion peptide is the only universally conserved epitope in all influenza viral hemagglutinins. → Anti-fusion peptide antibodies are universal antibodies that cross-react with all influenza HA subtypes. → The universal antibodies cross-neutralize different influenza A subtypes. → The universal antibodies inhibit the fusion process between the viruses and the target cells. -- Abstract: The fusion peptide of influenza viral hemagglutinin plays a critical role in virus entry by facilitating membrane fusion between the virus and target cells. As the fusion peptide is the only universally conserved epitope in all influenza A and B viruses, it could be an attractive target for vaccine-induced immune responses. We previously reported that antibodies targeting the first 14 amino acids of the N-terminus of the fusion peptide could bind to virtually all influenza virus strains and quantify hemagglutinins in vaccines produced in embryonated eggs. Here we demonstrate that these universal antibodies bind to the viral hemagglutinins in native conformation presented in infected mammalian cell cultures and neutralize multiple subtypes of virus by inhibiting the pH-dependent fusion of viral and cellular membranes. These results suggest that this unique, highly conserved linear sequence in viral hemagglutinin is exposed sufficiently to be attacked by the antibodies during the course of infection and merits further investigation because of its potential importance in the protection against diverse strains of influenza viruses.

  6. Universal antibodies against the highly conserved influenza fusion peptide cross-neutralize several subtypes of influenza A virus

    International Nuclear Information System (INIS)

    Hashem, Anwar M.; Van Domselaar, Gary; Li, Changgui; Wang, Junzhi; She, Yi-Min; Cyr, Terry D.; Sui, Jianhua; He, Runtao; Marasco, Wayne A.; Li, Xuguang

    2010-01-01

    Research highlights: → The fusion peptide is the only universally conserved epitope in all influenza viral hemagglutinins. → Anti-fusion peptide antibodies are universal antibodies that cross-react with all influenza HA subtypes. → The universal antibodies cross-neutralize different influenza A subtypes. → The universal antibodies inhibit the fusion process between the viruses and the target cells. -- Abstract: The fusion peptide of influenza viral hemagglutinin plays a critical role in virus entry by facilitating membrane fusion between the virus and target cells. As the fusion peptide is the only universally conserved epitope in all influenza A and B viruses, it could be an attractive target for vaccine-induced immune responses. We previously reported that antibodies targeting the first 14 amino acids of the N-terminus of the fusion peptide could bind to virtually all influenza virus strains and quantify hemagglutinins in vaccines produced in embryonated eggs. Here we demonstrate that these universal antibodies bind to the viral hemagglutinins in native conformation presented in infected mammalian cell cultures and neutralize multiple subtypes of virus by inhibiting the pH-dependent fusion of viral and cellular membranes. These results suggest that this unique, highly conserved linear sequence in viral hemagglutinin is exposed sufficiently to be attacked by the antibodies during the course of infection and merits further investigation because of its potential importance in the protection against diverse strains of influenza viruses.

  7. A high-order relaxation method with projective integration for solving nonlinear systems of hyperbolic conservation laws

    Science.gov (United States)

    Lafitte, Pauline; Melis, Ward; Samaey, Giovanni

    2017-07-01

    We present a general, high-order, fully explicit relaxation scheme which can be applied to any system of nonlinear hyperbolic conservation laws in multiple dimensions. The scheme consists of two steps. In a first (relaxation) step, the nonlinear hyperbolic conservation law is approximated by a kinetic equation with stiff BGK source term. Then, this kinetic equation is integrated in time using a projective integration method. After taking a few small (inner) steps with a simple, explicit method (such as direct forward Euler) to damp out the stiff components of the solution, the time derivative is estimated and used in an (outer) Runge-Kutta method of arbitrary order. We show that, with an appropriate choice of inner step size, the time step restriction on the outer time step is similar to the CFL condition for the hyperbolic conservation law. Moreover, the number of inner time steps is also independent of the stiffness of the BGK source term. We discuss stability and consistency, and illustrate with numerical results (linear advection, Burgers' equation and the shallow water and Euler equations) in one and two spatial dimensions.
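    The inner/outer time-stepping described above can be illustrated with a toy projective forward Euler integrator applied to a stiff linear ODE. This is a minimal sketch of the general projective-integration idea only; the function names, parameter values, and test problem are invented for illustration and are not taken from the paper.

```python
import numpy as np

def projective_forward_euler(f, u0, t_end, dt_outer, dt_inner, k_inner):
    """Projective forward Euler (illustrative sketch).

    Each outer step: take k_inner small inner Euler steps of size dt_inner
    to damp the stiff components, estimate du/dt from the last two inner
    iterates, then extrapolate over the remainder of the outer interval.
    """
    u = np.array(u0, dtype=float)
    n_steps = int(round(t_end / dt_outer))
    for _ in range(n_steps):
        v_prev = u.copy()
        for _ in range(k_inner):                   # inner damping steps
            v_prev, u = u, u + dt_inner * f(u)
        dudt = (u - v_prev) / dt_inner             # time-derivative estimate
        u = u + (dt_outer - k_inner * dt_inner) * dudt   # outer (projective) step
    return u

# Stiff linear test problem: the fast mode (lambda = -1000) sets the
# stiffness, the slow mode (lambda = -1) carries the dynamics of interest.
A = np.diag([-1000.0, -1.0])
f = lambda u: A @ u
u = projective_forward_euler(f, [1.0, 1.0], t_end=1.0,
                             dt_outer=0.05, dt_inner=1e-3, k_inner=3)
# The outer step (0.05) respects only the slow mode, mirroring the claim
# that the outer step is restricted by a CFL-like condition rather than
# by the stiffness of the source term.
```

    With dt_inner chosen so the stiff mode is damped within a few inner steps, the slow component u[1] tracks exp(-t) while the fast component u[0] decays to zero.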

  8. Efficient random access high resolution region-of-interest (ROI) image retrieval using backward coding of wavelet trees (BCWT)

    Science.gov (United States)

    Corona, Enrique; Nutter, Brian; Mitra, Sunanda; Guo, Jiangling; Karp, Tanja

    2008-03-01

    Efficient retrieval of high quality Regions-Of-Interest (ROI) from high resolution medical images is essential for reliable interpretation and accurate diagnosis. Random access to high quality ROI from codestreams is becoming an essential feature in many still image compression applications, particularly in viewing diseased areas from large medical images. This feature is easier to implement in block-based codecs because of the inherent spatial independence of the code blocks. This independence implies that the decoding order of the blocks is unimportant as long as the position of each is properly identified. In contrast, wavelet-tree based codecs naturally use some interdependency that exploits the decaying spectrum model of the wavelet coefficients, so one must keep track of the decoding order from level to level with such codecs. We have developed an innovative multi-rate image subband coding scheme using "Backward Coding of Wavelet Trees (BCWT)" which is fast, memory efficient, and resolution scalable. It offers far lower complexity than many other existing codecs, including both wavelet-tree and block-based algorithms. The ROI feature in BCWT is implemented through a transcoder stage that generates a new BCWT codestream containing only the information associated with the user-defined ROI. This paper presents an efficient technique that locates a particular ROI within the BCWT coded domain and decodes it back to the spatial domain. This technique allows better access and proper identification of pathologies in high resolution images, since only a small fraction of the codestream needs to be transmitted and analyzed.

  9. A new world monkey microsatellite (ap74) highly conserved in primates

    International Nuclear Information System (INIS)

    Oklander, Luciana Ines; Steinberg, Eliana Ruth; Dolores Mudry, Marta

    2012-01-01

    Given their great variability, microsatellites or STRs became the most commonly used genetic markers over the last 15 years. The analysis of these markers requires minimal quantities of DNA, allowing the use of noninvasive samples such as feces or hair. We amplified the microsatellite ap74 in blood and hair samples in order to analyze the level of genomic conservation among a wide range of primates, including Lemur catta, Alouatta caraya, Ateles belzebuth, Ateles chamek, Pan troglodytes, Papio sp., and Homo sapiens. In all cases we obtained amplification products of similar size in both monkeys and humans (ranging between 126 and 176 bp), except in the lemur, where the detected fragment was approximately 1000 bp. The analysis of the nucleotide sequences permitted the evaluation of the molecular modifications experienced during the evolutionary process in primates.

  10. Application of Chimera Navier-Stokes Code for High Speed Flows

    Science.gov (United States)

    Ajmani, Kumud

    1997-01-01

    The primary task for this year was performed in support of the "Trailblazer" project. The purpose of the task was to perform an extensive CFD study of the shock boundary-layer interaction between the engine diverters and the primary body surfaces of the Trailblazer vehicle. Information gathered from this study would be used to determine the effectiveness of the diverters in preventing the boundary layer coming off of the vehicle forebody from entering the main engines. The PEGSUS code was used to define the "holes" and "boundaries" for each grid. Two sets of CFD calculations were performed, and extensive post-processing of the results was carried out.

  11. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  12. Thick target benchmark test for the code used in the design of high intensity proton accelerator project

    International Nuclear Information System (INIS)

    Meigo, Shin-ichiro; Harada, Masatoshi

    2003-01-01

    In the neutronics design of the JAERI and KEK Joint Project high-intensity accelerator facilities, the transport codes NMTC/JAM, MCNPX and MARS are used. In order to confirm the predictive ability of these codes, it is important to compare their results with experiment. For validation of the neutron source term, the calculations are compared with experimental spectra of neutrons produced from thick targets, measured at LANL and KEK. For the low-incident-energy case, the calculations are compared with an experiment carried out at LANL, in which targets of C, Al, Fe, and 238U were irradiated with 256-MeV protons. The comparison shows that both NMTC/JAM and MCNPX agree with the experiment within a factor of 2. MARS shows good agreement for the C and Al targets; however, it rather underestimates the yield for all targets at neutron energies above 30 MeV. For the high-incident-energy case, the codes are compared with an experiment carried out at KEK, in which W and Pb targets were bombarded with 0.5- and 1.5-GeV protons. Although slight disagreements exist, NMTC/JAM, MCNPX and MARS agree with the experiment within a factor of 2. (author)

  13. Environmental remediation of high-level nuclear waste in geological repository. Modified computer code creates ultimate benchmark in natural systems

    International Nuclear Information System (INIS)

    Peter, Geoffrey J.

    2011-01-01

    Isolation of high-level nuclear waste in permanent geological repositories has been a major concern for over 30 years, owing to the possibility of dissolved radionuclides migrating to the water table (within the 10,000-year compliance period) as water moves through the repository and the surrounding area. Repository designs are based on mathematical models that allow for long-term geological phenomena and involve many approximations; experimental verification of such long-term processes is impossible. Each country must determine whether geological disposal is adequate for permanent storage. Many countries have extensively studied different aspects of safely confining highly radioactive waste in an underground repository, based on the unique geological composition of their selected repository locations. This paper discusses two computer codes developed by various countries to study the coupled thermal, mechanical, and chemical processes in these environments and the migration of radionuclides. Further, this paper presents the results of a case study in which the Magma-Hydrothermal (MH) computer code, modified by the author, was applied to nuclear waste repository analysis. The MH code was verified by simulating natural systems, thus creating the ultimate benchmark: an approach based on processes, currently occurring in natural systems, similar to those expected near waste repositories. (author)

  14. Parallelization of MCNP 4, a Monte Carlo neutron and photon transport code system, in highly parallel distributed memory type computer

    International Nuclear Information System (INIS)

    Masukawa, Fumihiro; Takano, Makoto; Naito, Yoshitaka; Yamazaki, Takao; Fujisaki, Masahide; Suzuki, Koichiro; Okuda, Motoi.

    1993-11-01

    In order to improve the accuracy and speed of shielding analyses, MCNP 4, a Monte Carlo neutron and photon transport code system, has been parallelized and its efficiency measured on the highly parallel distributed-memory computer AP1000. The code was analyzed statically and dynamically, and a suitable parallelization algorithm was determined for the shielding analysis functions of MCNP 4. This includes a strategy in which a new history is assigned dynamically to an idling processor element during execution. Furthermore, to avoid congestion in communication, a batch concept, processing multiple histories as a unit, has been introduced. In an analysis of a sample cask problem with 2,000,000 histories on the AP1000 with 512 processor elements, a parallelization efficiency of 82% was achieved, and the calculation speed was estimated to be around 50 times that of a FACOM M-780. (author)
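    The dynamic history assignment and batching strategy described above can be sketched in a few lines. This is a hypothetical toy in Python rather than Fortran/MCNP: a hit-or-miss Monte Carlo estimate of π stands in for particle transport, and all names, seeds, and batch sizes are invented for illustration.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_batch(args):
    """Track one batch of 'histories'; a toy hit-or-miss estimate of pi
    stands in for neutron/photon transport."""
    seed, n_histories = args
    rng = random.Random(seed)            # independent random stream per batch
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_histories))

def parallel_mc(n_batches=200, histories_per_batch=1000, workers=4):
    # Batches are handed out to whichever worker is idle (dynamic
    # assignment), and each batch groups many histories so the
    # scheduling/communication cost is paid once per batch rather than
    # once per history -- the same idea as the batch concept above.
    jobs = [(seed, histories_per_batch) for seed in range(n_batches)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(run_batch, jobs))
    return 4.0 * hits / (n_batches * histories_per_batch)

print(parallel_mc())  # ≈ 3.14 with 200,000 samples
```

    Because each batch carries its own seed, the result is reproducible regardless of which worker processes which batch, mirroring the requirement that dynamic scheduling not change the tallied answer.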

  15. Hearing sensitivity in context: Conservation implications for a highly vocal endangered species

    Directory of Open Access Journals (Sweden)

    Megan A. Owen

    2016-04-01

    Full Text Available Hearing sensitivity is a fundamental determinant of a species’ vulnerability to anthropogenic noise, however little is known about the hearing capacities of most conservation-dependent species. When audiometric data are integrated with other aspects of species’ acoustic ecology, life history, and characteristic habitat topography and soundscape, predictions can be made regarding probable vulnerability to the negative impacts of different types of anthropogenic noise. Here we used an adaptive psychoacoustic technique to measure hearing thresholds in the endangered giant panda, a species that uses acoustic communication to coordinate reproduction. Our results suggest that giant pandas have functional hearing into the ultrasonic range, with good sensitivity between 10.0 and 16.0 kHz, and best sensitivity measured at 12.5–14.0 kHz. We estimated the lower and upper limits of functional hearing as 0.10 and 70.0 kHz respectively. While these results suggest that panda hearing is similar to that of some other terrestrial carnivores, panda hearing thresholds above 14.0 kHz were significantly lower (i.e., more sensitive) than those of the polar bear, the only other bear species for which data are available. We discuss the implications of this divergence, as well as the relationship between hearing sensitivity and the spectral parameters of panda vocalizations. We suggest that these data, placed in context, can be used towards the development of a sensory-based model of noise disturbance for the species.

  16. Effect of ultra high temperature ceramics as fuel cladding materials on the nuclear reactor performance by SERPENT Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Korkut, Turgay; Kara, Ayhan; Korkut, Hatun [Sinop Univ. (Turkey). Dept. of Nuclear Energy Engineering

    2016-12-15

    Ultra High Temperature Ceramics (UHTCs) have low density and high melting points, which makes them useful materials in the nuclear industry, especially in reactor core design. Three UHTCs (silicon carbide, vanadium carbide, and zirconium carbide) were evaluated as nuclear fuel cladding materials. The SERPENT Monte Carlo code was used to model CANDU, PWR, and VVER type reactor cores and to calculate burnup parameters. Changes were observed in the burnup and neutronic parameters (keff, neutron flux, absorption rate, fission rate, and depletion of U-238, Xe-135, and Sm-149) with the use of these UHTCs. Results were compared to those for the conventional cladding material Zircaloy.

  17. A new phase coding method using a slice selection gradient for high speed flow velocity measurements in NMR tomography

    International Nuclear Information System (INIS)

    Oh, C.H.; Cho, Z.H.; California Univ., Irvine

    1986-01-01

    A new phase coding method using a slice selection gradient for high speed NMR flow velocity measurements is introduced and discussed. To establish the phase-velocity relationship of flow under the slice selection gradient and spin-echo RF pulse, the Bloch equation was solved numerically under the assumption that flow exists in only one direction, namely the direction of slice selection. Details of the numerical solution of the Bloch equation and techniques related to the numerical computations are also given. Finally, using the numerical calculation, high speed flow velocity measurement was attempted and found to be in good agreement with other complementary controlled measurements. (author)
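    As a much-simplified illustration of the phase-velocity relationship discussed above (not the authors' Bloch-equation solution): for spins moving along a gradient, the accumulated precession phase can be obtained by numerically integrating φ = γ∫G(t)z(t)dt. The sketch below assumes a generic bipolar flow-encoding gradient, with invented amplitude and timing values.

```python
import numpy as np

GAMMA = 2 * np.pi * 42.577e6   # proton gyromagnetic ratio, rad/s/T

def accrued_phase(v, G=10e-3, tau=2e-3, dt=1e-6):
    """Numerically accumulate the precession phase of a spin moving with
    velocity v (m/s) through a bipolar gradient: +G for tau seconds, then
    -G for tau seconds. Position z(t) = v*t (z0 = 0, so static spins
    acquire no net phase from the bipolar pair)."""
    t = np.arange(0.0, 2 * tau, dt)
    grad = np.where(t < tau, G, -G)               # bipolar gradient waveform
    z = v * t                                     # spin position over time
    return float(np.sum(GAMMA * grad * z * dt))   # phi = gamma * sum G(t) z(t) dt

# Phase grows linearly with velocity: phi = -gamma * G * v * tau**2,
# which is what lets velocity be read off from the measured phase.
vs = np.array([0.0, 0.1, 0.2, 0.4])               # velocities, m/s
phis = np.array([accrued_phase(v) for v in vs])
```

    Static spins (v = 0) end with zero net phase, while moving spins acquire a phase proportional to v, so a phase map directly encodes a velocity map.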

  18. 77 FR 11785 - Energy Conservation Program: Public Meeting and Availability of the Framework Document for High...

    Science.gov (United States)

    2012-02-28

    ... standards for high-intensity discharge (HID) lamps. Accordingly, DOE will hold a public meeting to discuss..._standards/commercial/high_intensity_discharge_lamps.html . DATES: The Department will hold a public meeting... Technologies Program, Mailstop EE-2J, Framework Document for High-Intensity Discharge Lamps, EERE-2010-BT-STD...

  19. Surgical versus conservative treatment for high-risk stress fractures of the lower leg (anterior tibial cortex, navicular and fifth metatarsal base): a systematic review

    NARCIS (Netherlands)

    Mallee, Wouter H.; Weel, Hanneke; van Dijk, C. Niek; van Tulder, Maurits W.; Kerkhoffs, Gino M.; Lin, Chung-Wei Christine

    2015-01-01

    To compare surgical and conservative treatment for high-risk stress fractures of the anterior tibial cortex, navicular and proximal fifth metatarsal. Systematic searches of CENTRAL, MEDLINE, EMBASE, CINAHL, SPORTDiscus and PEDro were performed to identify relevant prospective and retrospective studies.

  20. The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy

    CERN Document Server

    Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F

    2010-01-01

    Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH), Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron-Emission-Tomography / Computed-Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...

  1. Low Complexity Approach for High Throughput Belief-Propagation based Decoding of LDPC Codes

    Directory of Open Access Journals (Sweden)

    BOT, A.

    2013-11-01

    Full Text Available The paper proposes a low-complexity belief propagation (BP) based decoding algorithm for LDPC codes. In spite of the iterative nature of the decoding process, the proposed algorithm provides both reduced complexity and improved BER performance compared with the classic min-sum (MS) algorithm generally used for hardware implementations. Linear approximations of the check-node update function are used in order to reduce the complexity of the BP algorithm. Based on this decoding approach, an FPGA based hardware architecture is proposed for implementing the decoding algorithm, aiming to increase decoder throughput. FPGA technology was chosen for the LDPC decoder implementation due to its parallel computation and reconfiguration capabilities. The obtained results show improvements in decoding throughput and BER performance compared with state-of-the-art approaches.
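    For context, the check-node update that such approximations target can be sketched as follows: the exact BP update, the min-sum approximation, and a normalized variant with a linear scaling term. This is a generic illustration only; the specific piecewise-linear approximation used in the paper is not reproduced here, and the LLR values are arbitrary.

```python
import math

def check_node_exact(llrs):
    """Exact BP check-node update: 2 * atanh( prod_i tanh(L_i / 2) )."""
    prod = 1.0
    for l in llrs:
        prod *= math.tanh(l / 2.0)
    prod = max(min(prod, 1.0 - 1e-12), -1.0 + 1e-12)  # keep atanh finite
    return 2.0 * math.atanh(prod)

def check_node_min_sum(llrs, scale=1.0):
    """Min-sum approximation: sign product times minimum magnitude.
    scale < 1 gives the 'normalized min-sum' linear correction, since
    plain min-sum always overestimates the exact magnitude."""
    sign = 1.0
    for l in llrs:
        sign = -sign if l < 0 else sign
    return scale * sign * min(abs(l) for l in llrs)

llrs = [1.2, -0.8, 2.5]
print(check_node_exact(llrs))            # ≈ -0.35
print(check_node_min_sum(llrs))          # -0.8 (overestimated magnitude)
print(check_node_min_sum(llrs, 0.75))    # ≈ -0.6, closer to the exact value
```

    Min-sum avoids the hyperbolic functions entirely, which is why it suits hardware; the scaling (or a piecewise-linear map, as in the paper) recovers part of the lost accuracy at negligible extra cost.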

  2. Modeling of High pressure core spray system (HPCS) with TRAC-BF1 code

    International Nuclear Information System (INIS)

    Angel M, E. Del.

    1993-01-01

    In this work we present a model of the HPCS system of the Laguna Verde Nuclear Power Plant (CNLV), which consists of a condensate storage tank (TAC), a vertical surge pipe (TOS) implemented by the Comision Federal de Electricidad (CFE), and the suppression pool (PSP), as well as the associated piping and valves for the HPCS pump suction, in order to study the system under transient conditions. The analysis results show that the implemented surge pipe allows a normal HPCS pump start without inadvertent automatic transfer of the HPCS pump suction from the condensate storage tank to the suppression pool. We briefly mention some problems found during the simulation and their solution, and we point out some deficiencies of the code for this type of study. The present model can be used to simulate other transients with only minor modifications. (Author)

  3. Curli Fibers Are Highly Conserved between Salmonella typhimurium and Escherichia coli with Respect to Operon Structure and Regulation

    Science.gov (United States)

    Römling, Ute; Bian, Zhao; Hammar, Mårten; Sierralta, Walter D.; Normark, Staffan

    1998-01-01

    Mouse-virulent Salmonella typhimurium strains SR-11 and ATCC 14028-1s express curli fibers, thin aggregative fibers, at ambient temperature on plates as judged by Western blot analysis and electron microscopy. Concomitantly with curli expression, cells develop a rough and dry colony morphology and bind the dye Congo red (called the rdar morphotype). Cloning and characterization of the two divergently transcribed operons required for curli biogenesis, csgBA(C) and csgDEFG, from S. typhimurium SR-11 revealed the same gene order and flanking genes as in Escherichia coli. The divergence of the curli region between S. typhimurium and E. coli at the nucleotide level is above average (22.4%). However, a high level of conservation at the protein level, which ranged from 86% amino acid homology for the fiber subunit CsgA to 99% homology for the lipoprotein CsgG, implies functional constraints on the gene products. Consequently, S. typhimurium genes on low-copy-number plasmids were able to complement respective E. coli mutants, although not always to wild-type levels. rpoS and ompR are required for transcriptional activation of (at least) the csgD promoter. The high degree of conservation at the protein level and the identical regulation patterns in E. coli and S. typhimurium suggest similar roles of curli fibers in the same ecological niche in the two species. PMID:9457880

  4. Conserved host response to highly pathogenic avian influenza virus infection in human cell culture, mouse and macaque model systems

    Directory of Open Access Journals (Sweden)

    McDermott Jason E

    2011-11-01

    Full Text Available Abstract Background Understanding host response to influenza virus infection will facilitate development of better diagnoses and therapeutic interventions. Several different experimental models have been used as a proxy for human infection, including cell cultures derived from human cells, mice, and non-human primates. Each of these systems has been studied extensively in isolation, but little effort has been directed toward systematically characterizing the conservation of host response on a global level beyond known immune signaling cascades. Results In the present study, we employed a multivariate modeling approach to characterize and compare the transcriptional regulatory networks between these three model systems after infection with a highly pathogenic avian influenza virus of the H5N1 subtype. Using this approach we identified functions and pathways that display similar behavior and/or regulation including the well-studied impact on the interferon response and the inflammasome. Our results also suggest a primary response role for airway epithelial cells in initiating hypercytokinemia, which is thought to contribute to the pathogenesis of H5N1 viruses. We further demonstrate that we can use a transcriptional regulatory model from the human cell culture data to make highly accurate predictions about the behavior of important components of the innate immune system in tissues from whole organisms. Conclusions This is the first demonstration of a global regulatory network modeling conserved host response between in vitro and in vivo models.

  5. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system

    International Nuclear Information System (INIS)

    Yang, W.S.; Lee, C.H.

    2008-01-01

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step toward coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one-group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ∼0.25% Δρ, except for a few small LANL fast assemblies. Relative to the MCNP solution, the MC²-2/TWODANT
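    The group-collapse step at the heart of multigroup cross section generation can be sketched as a flux-weighted average over each coarse group. The smooth toy cross section and the 1/E weighting spectrum below are illustrative assumptions, not ETOE-2/MC²-2 data or algorithms.

```python
import numpy as np

def trap(y, x):
    """Trapezoidal integral of samples y over grid x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Fine-group data: a smooth toy cross section (barns) on a fine energy grid (eV),
# weighted by the standard 1/E slowing-down spectrum. All values are illustrative.
E = np.logspace(0.0, 7.0, 701)        # 1 eV .. 10 MeV
sigma = 5.0 + 2.0 / np.sqrt(E)        # toy cross section, no resonances
phi = 1.0 / E                         # 1/E weighting spectrum

# Collapse to 21 coarse groups: sigma_g = integral(sigma*phi dE) / integral(phi dE).
bounds = np.logspace(0.0, 7.0, 22)
sigma_g = np.empty(21)
for g in range(21):
    m = (E >= bounds[g]) & (E <= bounds[g + 1])
    sigma_g[g] = trap(sigma[m] * phi[m], E[m]) / trap(phi[m], E[m])

print(sigma_g[0], sigma_g[-1])        # highest and lowest group averages (about 6.7 and 5.0 barns)
```

    Real processing codes additionally handle resonance self-shielding and unresolved resonances; this sketch shows only the final weighting step.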

  6. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system.

    Energy Technology Data Exchange (ETDEWEB)

    Yang, W. S.; Lee, C. H. (Nuclear Engineering Division)

    2008-05-16

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review results of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step toward coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with FORTRAN 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one-group infinite dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ∼0.25% Δρ, except for a few small LANL fast assemblies.

  7. Validation and verification of MCNP6 against intermediate and high-energy experimental data and results by other codes

    International Nuclear Information System (INIS)

    Mashnik, Stepan G.

    2011-01-01

    MCNP6, the latest and most advanced LANL transport code, representing a recent merger of MCNP5 and MCNPX, has been validated and verified (V&V) against a variety of intermediate- and high-energy experimental data and against results by different versions of MCNPX and other codes. In the present work, we validate and verify MCNP6 using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.02 and LAQGSM03.03. We found that MCNP6 describes reasonably well various reactions induced by particles and nuclei at incident energies from 18 MeV to about 1 TeV per nucleon measured on thin and thick targets, and agrees very well with similar results obtained with MCNPX and with calculations by CEM03.02, LAQGSM03.01 (03.03), INCL4 + ABLA, Bertini INC + Dresner evaporation, EPAX, ABRABLA, HIPSE, and AMD used as stand-alone codes. Most of the computational bugs and the more serious physics problems observed in MCNP6/X during our V&V have been fixed; we continue our work to solve all known problems before MCNP6 is distributed to the public. (author)

  8. ZORNOC: a 1 1/2-D tokamak data analysis code for studying noncircular high beta plasmas

    International Nuclear Information System (INIS)

    Zurro, B.; Wieland, R.M.; Murakami, M.; Swain, D.W.

    1980-03-01

    A new tokamak data analysis code, ZORNOC, was developed to study noncircular, high beta plasmas in the Impurity Study Experiment (ISX-B). These plasmas exhibit significant flux surface shifts and elongation in both ohmically heated and beam-heated discharges. The MHD equilibrium flux surface geometry is determined by solving the Grad-Shafranov equation based on: (1) the shape of the outermost flux surface, deduced from the magnetic loop probes; (2) a pressure profile, deduced by means of Thomson scattering data (electrons), charge exchange data (ions), and a Fokker-Planck model (fast ions); and (3) a safety factor profile, determined from the experimental data using a simple model (Z/sub eff/ = const) that is self-consistently altered while the plasma equilibrium is iterated. For beam-heated discharges the beam deposition profile is determined by means of a Monte Carlo scheme and the slowing down of the fast ions by means of an analytical solution of the Fokker-Planck equation. The code also carries out an electron power balance and calculates various confinement parameters. The code is described and examples of its operation are given
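    For reference, the MHD equilibrium flux surface geometry mentioned above is governed by the Grad-Shafranov equation for the poloidal flux ψ(R, Z). This is the standard axisymmetric form, not ZORNOC's specific discretization:

```latex
\[
R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right)
+ \frac{\partial^{2}\psi}{\partial Z^{2}}
= -\mu_{0} R^{2}\,\frac{dp}{d\psi} - F\,\frac{dF}{d\psi}
\]
```

    where p(ψ) is the pressure profile and F(ψ) = R·B_φ is the poloidal current function; the safety factor profile then follows from the equilibrium fields reconstructed from ψ.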

  9. Recommendations for codes and standards to be used for design and fabrication of high level waste canister

    International Nuclear Information System (INIS)

    Bermingham, A.J.; Booker, R.J.; Booth, H.R.; Ruehle, W.G.; Shevekov, S.; Silvester, A.G.; Tagart, S.W.; Thomas, J.A.; West, R.G.

    1978-01-01

    This study identifies codes, standards, and regulatory requirements for developing design criteria for high-level waste (HLW) canisters for commercial operation. It has been determined that the canister should be designed as a pressure vessel without provision for any overpressure protection type devices. It is recommended that the HLW canister be designed and fabricated to the requirements of the ASME Section III Code, Division 1 rules, for Code Class 3 components. Identification of other applicable industry and regulatory guides and standards is provided in this report. Requirements for the Design Specification are found in the ASME Section III Code. It is recommended that design verification be conducted principally with prototype testing which will encompass normal and accident service conditions during all phases of the canister life. Adequacy of existing quality assurance and licensing standards for the canister was investigated. One of the recommendations derived from this study is a requirement that the canister be N stamped. In addition, acceptance standards for the HLW should be established and the waste qualified to those standards before the canister is sealed. A preliminary investigation of the use of an overpack for the canister has been made, and it is concluded that the use of an overpack, as an integral part of overall canister design, is undesirable from both a design and an economics standpoint. However, use of shipping cask liners and overpack type containers at the Federal repository may make the canister and HLW management safer and more cost effective. There are several possible concepts for canister closure design. These concepts can be adapted to the canister with or without an overpack. A remote seal weld closure is considered to be one of the most suitable closure methods; however, mechanical seals should also be investigated

  10. High performance 3D neutron transport on peta scale and hybrid architectures within APOLLO3 code

    International Nuclear Information System (INIS)

    Jamelot, E.; Dubois, J.; Lautard, J-J.; Calvin, C.; Baudron, A-M.

    2011-01-01

    APOLLO3 code is a common project of CEA, AREVA and EDF for the development of a new generation system for core physics analysis. We present here the parallelization of two deterministic transport solvers of APOLLO3: MINOS, a simplified 3D transport solver on structured Cartesian and hexagonal grids, and MINARET, a transport solver based on triangular meshes in 2D and prismatic ones in 3D. We used two different techniques to accelerate MINOS: a domain decomposition method, combined with an accelerated algorithm using GPU. The domain decomposition is based on the Schwarz iterative algorithm, with Robin boundary conditions to exchange information. The Robin parameters influence the convergence and we detail how we optimized the choice of these parameters. MINARET parallelization is based on angular directions calculation using explicit message passing. Fine grain parallelization is also available for each angular direction using shared memory multithreaded acceleration. Many performance results are presented on massively parallel architectures using more than 10³ cores and on hybrid architectures using some tens of GPUs. This work contributes to the HPC development in reactor physics at the CEA Nuclear Energy Division. (author)

  11. Modelling of high burnup structure in UO2 fuel with the RTOP code

    International Nuclear Information System (INIS)

    Likhanskii, V.; Zborovskii, V.; Evdokimov, I.; Kanyukova, V.; Sorokin, A.

    2008-01-01

    The present work deals with a self-consistent physical approach aimed at deriving the criterion of fuel restructuring without recourse to empirical correlations. The approach is based on a study of the formation of large over-pressurized bubbles on dislocations, at grain boundaries and in the grain volume. First, the stage of formation of bubbles that cannot be destroyed by fission fragments is examined using consistent modelling of point defects and fission gas behavior near dislocations and in the grain volume. Then, the evolution of the formed large non-destroyable bubbles is considered, using the results of the previous step as initial values. Finally, the condition of dislocation loops being punched out by sufficiently large over-pressurized bubbles is regarded as the criterion of fuel restructuring onset. In the present work, consideration of the evolution of large over-pressurized bubbles is applied to modelling of the restructuring threshold depending on temperature, burnup and grain size. The effect of grain size predicted by the model is in qualitative agreement with experimental observations. A restructuring threshold criterion as an analytical function of local burnup and fuel temperature is derived and compared with HBRP project data. To predict the rim-layer width depending on fuel burnup and irradiation conditions, the model is implemented into the mechanistic fuel performance code RTOP. Calculated dependencies give an upper estimate for the width of the restructured region. Calculations show that, in order to model rim-structure formation, one needs to consider the temperature distribution within the pellet, which depends on the irradiation history

  12. Catchment-scale conservation units identified for the threatened Yarra pygmy perch (Nannoperca obscura) in highly modified river systems.

    Science.gov (United States)

    Brauer, Chris J; Unmack, Peter J; Hammer, Michael P; Adams, Mark; Beheregaray, Luciano B

    2013-01-01

    Habitat fragmentation caused by human activities alters metapopulation dynamics and decreases biological connectivity through reduced migration and gene flow, leading to lowered levels of population genetic diversity and to local extinctions. The threatened Yarra pygmy perch, Nannoperca obscura, is a poor disperser found in small, isolated populations in wetlands and streams of southeastern Australia. Modifications to natural flow regimes in anthropogenically-impacted river systems have recently reduced the amount of habitat for this species and likely further limited its opportunity to disperse. We employed highly resolving microsatellite DNA markers to assess genetic variation, population structure and the spatial scale at which dispersal takes place across the distribution of this freshwater fish, and used this information to identify conservation units for management. The levels of genetic variation found for N. obscura are amongst the lowest reported for a fish species (mean heterozygosity of 0.318 and mean allelic richness of 1.92). We identified very strong population genetic structure, nil to little evidence of recent migration among demes and a minimum of 11 units for conservation management, hierarchically nested within four major genetic lineages. A combination of spatial analytical methods revealed hierarchical genetic structure corresponding with catchment boundaries and also demonstrated significant isolation by riverine distance. Our findings have implications for the national recovery plan of this species by demonstrating that N. obscura populations should be managed at a catchment level and highlighting the need to restore habitat and avoid further alteration of the natural hydrology.
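    The genetic diversity statistics quoted above can be illustrated with the textbook estimator of expected heterozygosity at a locus, He = 1 − Σ pᵢ². The allele frequencies in this sketch are invented for illustration and are not data from the N. obscura study.

```python
import numpy as np

# Expected heterozygosity per locus: He = 1 - sum_i(p_i**2) over allele
# frequencies p_i. The frequencies below are hypothetical examples.
loci = [
    np.array([0.80, 0.20]),          # nearly fixed locus -> low He
    np.array([0.50, 0.30, 0.20]),    # more balanced locus -> higher He
]
he = [float(1.0 - np.sum(p ** 2)) for p in loci]
print([round(h, 3) for h in he])     # [0.32, 0.62]
print(round(sum(he) / len(he), 3))   # 0.47 (mean He across loci)
```

    Averaging He over many microsatellite loci gives the kind of mean heterozygosity figure (0.318) reported in the abstract.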

  13. High Elevation Refugia for Bombus terricola (Hymenoptera: Apidae) Conservation and Wild Bees of the White Mountain National Forest.

    Science.gov (United States)

    Tucker, Erika M; Rehan, Sandra M

    2017-01-01

    Many wild bee species are in global decline, yet much is still unknown about their diversity and contemporary distributions. National parks and forests offer unique areas of refuge important for the conservation of rare and declining species populations. Here we present the results of the first biodiversity survey of the bee fauna in the White Mountain National Forest (WMNF). More than a thousand specimens were collected from pan and sweep samples representing 137 species. Three species were recorded for the first time in New England and an additional seven species were documented for the first time in the state of New Hampshire. Four introduced species were also observed in the specimens collected. A checklist of the species found in the WMNF, as well as those found previously in Strafford County, NH, is included with new state records and introduced species noted as well as a map of collecting locations. Of particular interest was the relatively high abundance of Bombus terricola Kirby 1837 found in many of the higher elevation collection sites and the single specimen documented of Bombus fervidus (Fabricius 1798). Both of these bumble bee species are known to have declining populations in the northeast and are categorized as vulnerable on the International Union for Conservation of Nature's Red List. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America.

  14. Characterization of STIP, a multi-domain nuclear protein, highly conserved in metazoans, and essential for embryogenesis in Caenorhabditis elegans

    International Nuclear Information System (INIS)

    Ji Qiongmei; Huang, C.-H.; Peng Jianbin; Hashmi, Sarwar; Ye Tianzhang; Chen Ying

    2007-01-01

    We report here the identification and characterization of STIP, a multi-domain nuclear protein that contains a G-patch, a coiled-coil, and several short tryptophan-tryptophan repeats highly conserved in metazoan species. To analyze its functional role in vivo, we cloned nematode stip-1 genes and determined the spatiotemporal pattern of Caenorhabditis elegans STIP-1 protein. RNA analyses and Western blots revealed that stip-1 mRNA was produced via trans-splicing and translated as a 95-kDa protein. Using reporter constructs, we found STIP-1 to be expressed at all developmental stages and in many tissue/cell types including worm oocyte nuclei. We found that STIP-1 is targeted to the nucleus and forms large polymers with a rod-like shape when expressed in mammalian cells. Using deletion mutants, we mapped the regions of STIP-1 involved in nuclear import and polymer assembly. We further showed that knockdown of C. elegans stip-1 by RNA interference arrested development and resulted in morphologic abnormalities around the 16-cell stage followed by 100% lethality, suggesting its essential role in worm embryogenesis. Importantly, the embryonic lethal phenotype could be faithfully rescued with Drosophila and human genes via transgenic expression. Our data provide the first direct evidence that STIP has a conserved essential nuclear function across metazoans from worms to humans.

  15. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); Xu, Guang-Hua [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); State Key Laboratory for Manufacturing Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)

    2015-03-10

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing subject fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the SSVEP response to high-frequency stimulation (beyond 25 Hz), with the paradigm presented on an LED. The SNR (signal-to-noise ratio) of the response beyond 40 Hz is very low and cannot be distinguished by the traditional analysis method. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces nⁿ targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and a fixed sifting (iteration) count of 10 are used to overcome the shortcomings of the end effect and the stopping criterion, and generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, improving the information transfer rate (ITR) and the stability of the BCI system. What is more, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally fatigue subjects and prevent safety hazards linked to photo-induced epileptic seizures, ensuring that the system is both efficient and safe. This study tests three subjects in order to verify the feasibility of the proposed method.
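    The combinatorial payoff of frequency combination coding (nⁿ targets from n stimulation frequencies) can be sketched directly. The frequency values follow the three used in the study; the encoding itself is a schematic assumption, not the authors' exact stimulation protocol.

```python
from itertools import product

# Hypothetical stimulation frequencies (Hz), matching the three in the abstract.
freqs = [25.0, 33.33, 40.0]
n = len(freqs)

# Each target is encoded as a length-n sequence of frequencies flashed in turn,
# giving n**n distinct codes: 27 targets from only 3 frequencies.
codes = list(product(freqs, repeat=n))
print(len(codes))       # 27
print(codes[0])         # (25.0, 25.0, 25.0)
```

    Decoding then reduces to identifying which frequency dominates in each time slot of the recorded SSVEP, which is where the time-frequency (IHHT) feature extraction comes in.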

  16. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    International Nuclear Information System (INIS)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing; Xu, Guang-Hua

    2015-01-01

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing subject fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the SSVEP response to high-frequency stimulation (beyond 25 Hz), with the paradigm presented on an LED. The SNR (signal-to-noise ratio) of the response beyond 40 Hz is very low and cannot be distinguished by the traditional analysis method. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces nⁿ targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and a fixed sifting (iteration) count of 10 are used to overcome the shortcomings of the end effect and the stopping criterion, and generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, improving the information transfer rate (ITR) and the stability of the BCI system. What is more, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally fatigue subjects and prevent safety hazards linked to photo-induced epileptic seizures, ensuring that the system is both efficient and safe. This study tests three subjects in order to verify the feasibility of the proposed method.

  17. High Re-Operation Rates Using Conserve Metal-On-Metal Total Hip Articulations

    DEFF Research Database (Denmark)

    Mogensen, S L; Jakobsen, Thomas; Christoffersen, Hardy

    2016-01-01

    INTRODUCTION: Metal-on-metal hip articulations have been intensely debated after reports of adverse reactions and high failure rates. The aim of this study was to retrospectively evaluate the implant of a metal-on-metal total hip articulation (MOM THA) from a single manufacturer in a two-center study...

  18. High spatial resolution mapping of land cover types in a priority area for conservation in the Brazilian savanna

    Science.gov (United States)

    Ribeiro, F.; Roberts, D. A.; Hess, L. L.; Davis, F. W.; Caylor, K. K.; Nackoney, J.; Antunes Daldegan, G.

    2017-12-01

    Savannas are heterogeneous landscapes consisting of highly mixed land cover types that lack clear distinct boundaries. The Brazilian Cerrado is a Neotropical savanna considered a biodiversity hotspot for conservation due to its biodiversity richness and the rapid transformation of its landscape by crop and pasture activities. The Cerrado is one of the most threatened Brazilian biomes and only 2.2% of its original extent is strictly protected. Accurate mapping and monitoring of its ecosystems and adjacent land use are important to select areas for conservation and to improve our understanding of the dynamics in this biome. Land cover mapping of savannas is difficult due to spectral similarity between land cover types resulting from similar vegetation structure, floristically similar components, generalization of land cover classes, and heterogeneity usually expressed as small patch sizes within the natural landscape. These factors are the major contributors to misclassification and low map accuracies among remote sensing studies in savannas. Specific challenges in mapping the Cerrado's land cover types are related to the spectral similarity between classes of land use and natural vegetation, such as natural grassland vs. cultivated pasture, and forest ecosystem vs. crops. This study seeks to classify and evaluate the land cover patterns across an area ranked as having extremely high priority for future conservation in the Cerrado. The main objective of this study is to identify the representativeness of each vegetation type across the landscape using high- to moderate-spatial-resolution imagery and an automated scheme. A combination of pixel-based and object-based approaches was tested using RapidEye 3A imagery (5 m spatial resolution) to classify the Cerrado's major land cover types. The random forest classifier was used to map the major ecosystems present across the area, and achieved an overall accuracy of 68%. Post

  19. Identification of a highly conserved valine-glycine-phenylalanine amino acid triplet required for HIV-1 Nef function

    Directory of Open Access Journals (Sweden)

    Meuwissen Pieter J

    2012-04-01

    Full Text Available Abstract Background The Nef protein of HIV facilitates virus replication and disease progression in infected patients. This role as pathogenesis factor depends on several genetically separable Nef functions that are mediated by interactions of highly conserved protein-protein interaction motifs with different host cell proteins. By studying the functionality of a series of nef alleles from clinical isolates, we identified a dysfunctional HIV group O Nef in which a highly conserved valine-glycine-phenylalanine (VGF) region, which links a preceding acidic cluster with the following proline-rich motif into an amphipathic surface, was deleted. In this study, we aimed to study the functional importance of this VGF region. Results The dysfunctional HIV group O8 nef allele was restored to the consensus sequence, and mutants of canonical (NL4.3, NA-7, SF2) and non-canonical (B2 and C1422) HIV-1 group M nef alleles were generated in which the amino acids of the VGF region were changed into alanines (VGF→AAA) and tested for their capacity to interfere with surface receptor trafficking, signal transduction and enhancement of viral replication and infectivity. We found the VGF motif, and each individual amino acid of this motif, to be critical for downregulation of MHC-I and CXCR4. Moreover, Nef's association with the cellular p21-activated kinase 2 (PAK2), the resulting deregulation of cofilin and inhibition of host cell actin remodeling, and targeting of Lck kinase to the trans-Golgi network (TGN) were affected as well. Of particular interest, VGF integrity was essential for Nef-mediated enhancement of HIV virion infectivity and HIV replication in peripheral blood lymphocytes. For targeting of Lck kinase to the TGN and viral infectivity, especially the phenylalanine of the triplet was essential. At the molecular level, the VGF motif was required for the physical interaction of the adjacent proline-rich motif with Hck. Conclusion Based on these findings, we

  20. Ancient Exaptation of a CORE-SINE Retroposon into a Highly Conserved Mammalian Neuronal Enhancer of the Proopiomelanocortin Gene

    Science.gov (United States)

    Bumaschny, Viviana F; Low, Malcolm J; Rubinstein, Marcelo

    2007-01-01

    The proopiomelanocortin gene (POMC) is expressed in the pituitary gland and the ventral hypothalamus of all jawed vertebrates, producing several bioactive peptides that function as peripheral hormones or central neuropeptides, respectively. We have recently determined that mouse and human POMC expression in the hypothalamus is conferred by the action of two 5′ distal and unrelated enhancers, nPE1 and nPE2. To investigate the evolutionary origin of the neuronal enhancer nPE2, we searched available vertebrate genome databases and determined that nPE2 is a highly conserved element in placentals, marsupials, and monotremes, whereas it is absent in nonmammalian vertebrates. Following an in silico paleogenomic strategy based on genome-wide searches for paralog sequences, we discovered that opossum and wallaby nPE2 sequences are highly similar to members of the superfamily of CORE-short interspersed nucleotide element (SINE) retroposons, in particular to MAR1 retroposons that are widely present in marsupial genomes. Thus, the neuronal enhancer nPE2 originated from the exaptation of a CORE-SINE retroposon in the lineage leading to mammals and remained under purifying selection in all mammalian orders for the last 170 million years. Expression studies performed in transgenic mice showed that two nonadjacent nPE2 subregions are essential to drive reporter gene expression into POMC hypothalamic neurons, providing the first functional example of an exapted enhancer derived from an ancient CORE-SINE retroposon. In addition, we found that this CORE-SINE family of retroposons is likely to still be active in American and Australian marsupial genomes and that several highly conserved exonic, intronic and intergenic sequences in the human genome originated from the exaptation of CORE-SINE retroposons. Together, our results provide clear evidence of the functional novelties that transposed elements contributed to their host genomes throughout evolution. PMID:17922573

  1. Ancient exaptation of a CORE-SINE retroposon into a highly conserved mammalian neuronal enhancer of the proopiomelanocortin gene.

    Directory of Open Access Journals (Sweden)

    Andrea M Santangelo

    2007-10-01

    Full Text Available The proopiomelanocortin gene (POMC) is expressed in the pituitary gland and the ventral hypothalamus of all jawed vertebrates, producing several bioactive peptides that function as peripheral hormones or central neuropeptides, respectively. We have recently determined that mouse and human POMC expression in the hypothalamus is conferred by the action of two 5' distal and unrelated enhancers, nPE1 and nPE2. To investigate the evolutionary origin of the neuronal enhancer nPE2, we searched available vertebrate genome databases and determined that nPE2 is a highly conserved element in placentals, marsupials, and monotremes, whereas it is absent in nonmammalian vertebrates. Following an in silico paleogenomic strategy based on genome-wide searches for paralog sequences, we discovered that opossum and wallaby nPE2 sequences are highly similar to members of the superfamily of CORE-short interspersed nucleotide element (SINE) retroposons, in particular to MAR1 retroposons that are widely present in marsupial genomes. Thus, the neuronal enhancer nPE2 originated from the exaptation of a CORE-SINE retroposon in the lineage leading to mammals and remained under purifying selection in all mammalian orders for the last 170 million years. Expression studies performed in transgenic mice showed that two nonadjacent nPE2 subregions are essential to drive reporter gene expression into POMC hypothalamic neurons, providing the first functional example of an exapted enhancer derived from an ancient CORE-SINE retroposon. In addition, we found that this CORE-SINE family of retroposons is likely to still be active in American and Australian marsupial genomes and that several highly conserved exonic, intronic and intergenic sequences in the human genome originated from the exaptation of CORE-SINE retroposons. Together, our results provide clear evidence of the functional novelties that transposed elements contributed to their host genomes throughout evolution.

  2. Identification of immunogenic HLA-B7 "Achilles' heel" epitopes within highly conserved regions of HIV

    DEFF Research Database (Denmark)

    De Groot, Anne S; Rivera, Daniel S; McMurry, Julie A

    2008-01-01

    Genetic polymorphisms in class I human leukocyte antigen molecules (HLA) have been shown to determine susceptibility to HIV infection as well as the rate of progression to AIDS. In particular, the HLA-B7 supertype has been shown to be associated with high viral loads and rapid progression to dise...

  3. High performance fiber reinforced concrete : Progress in knowledge and design codes

    NARCIS (Netherlands)

    Walraven, J.C.

    2009-01-01

    High performance fiber reinforced concrete is quickly developing into a modern structural material with high potential. As testified, for instance, by the recent symposium on HPFRC in Kassel, Germany (April 2008), the number of structural applications is increasing. At this moment studies are carried out

  4. Reduced-order LPV model of flexible wind turbines from high fidelity aeroelastic codes

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher; Sønderby, Ivan Bergquist; Hansen, Morten Hartvig

    2013-01-01

    of high-order linear time invariant (LTI) models. Firstly, the high-order LTI models are locally approximated using modal and balanced truncation and residualization. Then, an appropriate coordinate transformation is applied to allow interpolation of the model matrices between points on the parameter...

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance, but it also brings extremely high computational complexity. This paper presents work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for fast HEVC coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...
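The fast-CU idea above can be illustrated with a toy quadtree early-termination heuristic: stop splitting a CU when its content is "flat" relative to a QP-dependent threshold. This is not the paper's actual mechanism; the variance measure and the threshold form are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's exact algorithm): early termination of
# the HEVC coding-tree (CU quadtree) recursion for flat blocks. The
# QP-dependent threshold below is an assumed toy formula.

def block_variance(block):
    """Variance of a square 2D block given as a list of rows of pixel values."""
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def split_cu(block, depth, qp, max_depth=3):
    """Return the list of (depth, size) leaf CUs chosen by the heuristic."""
    size = len(block)
    threshold = 2.0 * (qp / 22.0) ** 2   # assumed QP-dependent flatness threshold
    if depth == max_depth or size <= 8 or block_variance(block) < threshold:
        return [(depth, size)]           # encode this CU as-is, stop recursing
    half = size // 2
    leaves = []
    for r0 in (0, half):                 # recurse into the four quadrants
        for c0 in (0, half):
            sub = [row[c0:c0 + half] for row in block[r0:r0 + half]]
            leaves.extend(split_cu(sub, depth + 1, qp, max_depth))
    return leaves
```

A uniform block terminates at depth 0 (one large CU), while a textured block is recursively split, which is the source of the encoding-time savings on smooth content.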

  6. Optimization and parallelization of the thermal–hydraulic subchannel code CTF for high-fidelity multi-physics applications

    International Nuclear Information System (INIS)

    Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.

    2015-01-01

    Highlights: • COBRA-TF was adopted by the Consortium for Advanced Simulation of LWRs. • We have improved code performance to support running large-scale LWR simulations. • Code optimization has led to reductions in execution time and memory usage. • An MPI parallelization has reduced full-core simulation time from days to minutes. - Abstract: This paper describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department Of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high fidelity multi-physics simulation tools for nuclear energy design and analysis. A set of serial code optimizations—including fixing computational inefficiencies, optimizing the numerical approach, and making smarter data storage choices—are first described and shown to reduce both execution time and memory usage by about a factor of ten. Next, a “single program multiple data” parallelization strategy targeting distributed memory “multiple instruction multiple data” platforms utilizing domain decomposition is presented. In this approach, data communication between processors is accomplished by inserting standard Message-Passing Interface (MPI) calls at strategic points in the code. The domain decomposition approach implemented assigns one MPI process to each fuel assembly, with each domain being represented by its own CTF input file. The creation of CTF input files, both for serial and parallel runs, is also fully automated through use of a pressurized water reactor (PWR) pre-processor utility that uses a greatly simplified set of user input compared with the traditional CTF input. To run CTF in
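The assembly-per-rank domain decomposition described above can be sketched as follows. The helper names and the input-deck naming are hypothetical, not CTF's actual conventions; the sketch only maps each fuel assembly to its own rank and lists which rank pairs would exchange crossflow boundary data via MPI calls.

```python
# Hedged sketch of the decomposition strategy described above: one MPI rank
# per fuel assembly, each rank with its own input deck. Names are hypothetical.

def decompose_core(core_map):
    """core_map: 2D list of assembly IDs, with None for reflector positions.
    Returns (rank_of, exchanges): the rank assigned to each assembly and the
    sorted rank pairs that must exchange crossflow boundary data (i.e. pairs
    of face-adjacent assemblies)."""
    rank_of = {}
    for row in core_map:
        for aid in row:
            if aid is not None and aid not in rank_of:
                rank_of[aid] = len(rank_of)        # one MPI rank per assembly
    exchanges = set()
    for r, row in enumerate(core_map):
        for c, aid in enumerate(row):
            if aid is None:
                continue
            for dr, dc in ((0, 1), (1, 0)):        # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < len(core_map) and cc < len(core_map[rr]):
                    nb = core_map[rr][cc]
                    if nb is not None:
                        exchanges.add(tuple(sorted((rank_of[aid], rank_of[nb]))))
    return rank_of, sorted(exchanges)
```

In the real code the exchange list would drive MPI send/receive calls at each iteration; here it simply makes the communication pattern of the decomposition explicit.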

  7. A highly conserved Poc1 protein characterized in embryos of the hydrozoan Clytia hemisphaerica: localization and functional studies.

    Directory of Open Access Journals (Sweden)

    Cécile Fourrage

    Full Text Available Poc1 (Protein of Centriole 1) proteins are highly conserved WD40 domain-containing centriole components, well characterized in the alga Chlamydomonas, the ciliated protozoan Tetrahymena, the insect Drosophila and in vertebrate cells including Xenopus and zebrafish embryos. Functions and localizations related to the centriole and ciliary axoneme have been demonstrated for Poc1 in a range of species. The vertebrate Poc1 protein has also been reported to show an additional association with mitochondria, including enrichment in the specialized "germ plasm" region of Xenopus oocytes. We have identified and characterized a highly conserved Poc1 protein in the cnidarian Clytia hemisphaerica. Clytia Poc1 mRNA was found to be strongly expressed in eggs and early embryos, showing a punctate perinuclear localization in young oocytes. Fluorescence-tagged Poc1 proteins expressed in developing embryos showed strong localization to centrioles, including basal bodies. Anti-human Poc1 antibodies decorated mitochondria in Clytia, as reported in human cells, but failed to recognise endogenous or fluorescent-tagged Clytia Poc1. Injection of specific morpholino oligonucleotides into Clytia eggs prior to fertilization to repress Poc1 mRNA translation interfered with cell division from the blastula stage, likely corresponding to when neosynthesis normally takes over from maternally supplied protein. Cell cycle lengthening and arrest were observed, phenotypes consistent with an impaired centriolar biogenesis or function. The specificity of the defects could be demonstrated by injection of synthetic Poc1 mRNA, which restored normal development. We conclude that in Clytia embryos, Poc1 has an essentially centriolar localization and function.

  8. Computation and Analysis of High Rocky Slope Safety in a Water Conservancy Project

    Directory of Open Access Journals (Sweden)

    Meng Yang

    2015-01-01

    Full Text Available An integrated method, covering actual monitoring analysis, a practical geological model, and a theoretical mathematical simulation model, is systematically proposed and successfully applied. The deformation characteristics of a unique high rocky slope were first analyzed from multiple angles and multiple layers at varying elevations and distances. The arrangement of monitoring points was listed and monitoring equipment was designed to comprise a complete monitoring system. The present large displacement was attributed to larger displacement at the bottom caused by water erosion and larger displacement in the middle formed by seepage. A study of the temporal and spatial displacement rules, covering multiple-point linkage effects and the water factor, supported this conclusion. To better extract useful information and analyze the underlying rules in the practical monitoring data, a geological model of the slope was constructed and rock mechanics parameters were researched. Finally, a unique three-dimensional finite element model was applied to approximate the structural character using numerical simulations. The corresponding strength criterion was used to determine the safety coefficient for a selected typical section. Subsequently, an integrated three-dimensional finite element model of the slope and dam was developed and a more detailed deformation evolution mechanism was revealed. This study is expected to provide a powerful and systematic method to analyze very high, important, and dangerous slopes.

  9. Constrained instanton and baryon number non-conservation at high energies

    International Nuclear Information System (INIS)

    Sil'vestrov, P.G.

    1992-01-01

    The main subject of this paper is the calculation of corrections to the instanton action, ΔS ∼ (mρ)^4 log(mρ)/g^2 (ρ is the instanton radius), in the SU(2) Yang-Mills theory. The total cross section for baryon number violating processes at high energies is usually parametrized as σ_tot ∝ exp[(4π/α)F(ε)], where α = g^2/4π, ε = √s/E_0, E_0 = √6 π m_W/α. In the present paper the third nontrivial term of the F(ε) expansion is obtained. The unknown corrections to F(ε) are expected to be of the order of ε^(8/3). The total cross section is extremely sensitive to the value of the single-instanton action. For a sufficiently heavy Higgs boson the ρ-dependent part of the instanton action changes drastically. 21 refs.; 1 fig

  10. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Science.gov (United States)

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, probably because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficient vector as the header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficient vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficient matrix generation method such that there is no linear dependency in the generated coefficient matrix. Using the proposed framework, each peer encapsulates one instead of n coefficient entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficient matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
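The core idea, each packet carrying a single header entry from which a provably independent coefficient vector is expanded, can be sketched as follows. The actual MATIN construction differs; this sketch substitutes a Vandermonde matrix over the prime field GF(257), which shares the key property that distinct seeds guarantee an invertible coefficient matrix, so decoding never encounters linearly dependent packets.

```python
# Hedged sketch of the idea behind MATIN (not its exact construction): derive
# each packet's coefficient vector from a single header entry, here a seed x
# expanded into the Vandermonde row (1, x, x^2, ...) over GF(257). Distinct
# seeds give a provably invertible matrix, so no Gauss-Jordan dependency
# checks are needed and the header shrinks from n entries to one.

P = 257  # prime field size; data symbols are bytes 0..255

def encode(blocks, seed):
    """One network-coded packet: header is just `seed`, payload is the
    coefficient-weighted sum of the n source blocks, symbol-wise mod P."""
    n = len(blocks)
    coeffs = [pow(seed, j, P) for j in range(n)]
    payload = [sum(c * blk[k] for c, blk in zip(coeffs, blocks)) % P
               for k in range(len(blocks[0]))]
    return seed, payload

def decode(packets, n):
    """Rebuild the n source blocks from n packets with distinct seeds, by
    Gauss-Jordan elimination on the reconstructed Vandermonde matrix."""
    rows = [[pow(seed, j, P) for j in range(n)] + list(payload)
            for seed, payload in packets]
    for col in range(n):
        piv = next(r for r in range(col, n) if rows[r][col])   # nonzero pivot
        rows[col], rows[piv] = rows[piv], rows[col]
        inv = pow(rows[col][col], P - 2, P)                    # Fermat inverse
        rows[col] = [v * inv % P for v in rows[col]]
        for r in range(n):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % P for a, b in zip(rows[r], rows[col])]
    return [row[n:] for row in rows]
```

Any n packets with distinct seeds (mod 257) suffice to decode, which is the property that removes the dependency-checking overhead discussed in the abstract.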

  11. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Directory of Open Access Journals (Sweden)

    Behrang Barekatain

    Full Text Available In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, probably because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficient vector as the header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficient vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficient matrix generation method such that there is no linear dependency in the generated coefficient matrix. Using the proposed framework, each peer encapsulates one instead of n coefficient entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficient matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.

  12. Molecular evolution of vertebrate neurotrophins: co-option of the highly conserved nerve growth factor gene into the advanced snake venom arsenal.

    Science.gov (United States)

    Sunagar, Kartik; Fry, Bryan Grieg; Jackson, Timothy N W; Casewell, Nicholas R; Undheim, Eivind A B; Vidal, Nicolas; Ali, Syed A; King, Glenn F; Vasudevan, Karthikeyan; Vasconcelos, Vitor; Antunes, Agostinho

    2013-01-01

    Neurotrophins are a diverse class of structurally related proteins, essential for neuronal development, survival, plasticity and regeneration. They are characterized by major family members, such as the nerve growth factors (NGF), brain-derived neurotrophic factors (BDNF) and neurotrophin-3 (NT-3), which have been demonstrated here to lack coding sequence variations and follow the regime of negative selection, highlighting their extremely important conserved role in vertebrate homeostasis. However, in stark contrast, venom NGF secreted as part of the chemical arsenal of the venomous advanced snake family Elapidae (and to a lesser extent Viperidae) have characteristics consistent with the typical accelerated molecular evolution of venom components. This includes a rapid rate of diversification under the significant influence of positive-selection, with the majority of positively-selected sites found in the secreted β-polypeptide chain (74%) and on the molecular surface of the protein (92%), while the core structural and functional residues remain highly constrained. Such focal mutagenesis generates active residues on the toxin molecular surface, which are capable of interacting with novel biological targets in prey to induce a myriad of pharmacological effects. We propose that caenophidian NGFs could participate in prey-envenoming by causing a massive release of chemical mediators from mast cells to mount inflammatory reactions and increase vascular permeability, thereby aiding the spread of other toxins and/or by acting as proapoptotic factors. Despite their presence in reptilian venom having been known for over 60 years, this is the first evidence that venom-secreted NGF follows the molecular evolutionary pattern of other venom components, and thus likely participates in prey-envenomation.

  13. Molecular Evolution of Vertebrate Neurotrophins: Co-Option of the Highly Conserved Nerve Growth Factor Gene into the Advanced Snake Venom Arsenal

    Science.gov (United States)

    Sunagar, Kartik; Fry, Bryan Grieg; Jackson, Timothy N. W.; Casewell, Nicholas R.; Undheim, Eivind A. B.; Vidal, Nicolas; Ali, Syed A.; King, Glenn F.; Vasudevan, Karthikeyan; Vasconcelos, Vitor; Antunes, Agostinho

    2013-01-01

    Neurotrophins are a diverse class of structurally related proteins, essential for neuronal development, survival, plasticity and regeneration. They are characterized by major family members, such as the nerve growth factors (NGF), brain-derived neurotrophic factors (BDNF) and neurotrophin-3 (NT-3), which have been demonstrated here to lack coding sequence variations and follow the regime of negative selection, highlighting their extremely important conserved role in vertebrate homeostasis. However, in stark contrast, venom NGF secreted as part of the chemical arsenal of the venomous advanced snake family Elapidae (and to a lesser extent Viperidae) have characteristics consistent with the typical accelerated molecular evolution of venom components. This includes a rapid rate of diversification under the significant influence of positive-selection, with the majority of positively-selected sites found in the secreted β-polypeptide chain (74%) and on the molecular surface of the protein (92%), while the core structural and functional residues remain highly constrained. Such focal mutagenesis generates active residues on the toxin molecular surface, which are capable of interacting with novel biological targets in prey to induce a myriad of pharmacological effects. We propose that caenophidian NGFs could participate in prey-envenoming by causing a massive release of chemical mediators from mast cells to mount inflammatory reactions and increase vascular permeability, thereby aiding the spread of other toxins and/or by acting as proapoptotic factors. Despite their presence in reptilian venom having been known for over 60 years, this is the first evidence that venom-secreted NGF follows the molecular evolutionary pattern of other venom components, and thus likely participates in prey-envenomation. PMID:24312363

  14. Molecular evolution of vertebrate neurotrophins: co-option of the highly conserved nerve growth factor gene into the advanced snake venom arsenal.

    Directory of Open Access Journals (Sweden)

    Kartik Sunagar

    Full Text Available Neurotrophins are a diverse class of structurally related proteins, essential for neuronal development, survival, plasticity and regeneration. They are characterized by major family members, such as the nerve growth factors (NGF), brain-derived neurotrophic factors (BDNF) and neurotrophin-3 (NT-3), which have been demonstrated here to lack coding sequence variations and follow the regime of negative selection, highlighting their extremely important conserved role in vertebrate homeostasis. However, in stark contrast, venom NGF secreted as part of the chemical arsenal of the venomous advanced snake family Elapidae (and to a lesser extent Viperidae) have characteristics consistent with the typical accelerated molecular evolution of venom components. This includes a rapid rate of diversification under the significant influence of positive-selection, with the majority of positively-selected sites found in the secreted β-polypeptide chain (74%) and on the molecular surface of the protein (92%), while the core structural and functional residues remain highly constrained. Such focal mutagenesis generates active residues on the toxin molecular surface, which are capable of interacting with novel biological targets in prey to induce a myriad of pharmacological effects. We propose that caenophidian NGFs could participate in prey-envenoming by causing a massive release of chemical mediators from mast cells to mount inflammatory reactions and increase vascular permeability, thereby aiding the spread of other toxins and/or by acting as proapoptotic factors. Despite their presence in reptilian venom having been known for over 60 years, this is the first evidence that venom-secreted NGF follows the molecular evolutionary pattern of other venom components, and thus likely participates in prey-envenomation.

  15. Dress Codes Blues: An Exploration of Urban Students' Reactions to a Public High School Uniform Policy

    Science.gov (United States)

    DaCosta, Kneia

    2006-01-01

    This qualitative investigation explores the responses of 22 U.S. urban public high school students when confronted with their newly imposed school uniform policy. Specifically, the study assessed students' appraisals of the policy along with compliance and academic performance. Guided by ecological human development perspectives and grounded in…

  16. Mice selectively bred for high voluntary wheel-running behavior conserve more fat despite increased exercise.

    Science.gov (United States)

    Hiramatsu, Layla; Garland, Theodore

    2018-04-20

    Physical activity is an important component of energy expenditure, and acute changes in activity can lead to energy imbalances that affect body composition, even under ad libitum food availability. One example of acute increases in physical activity is four replicate, selectively-bred High Runner (HR) lines of mice that voluntarily run ~3-fold more wheel revolutions per day over 6-day trials and are leaner, as compared with four non-selected control (C) lines. We expected that voluntary exercise would increase food consumption, build lean mass, and reduce fat mass, but that these effects would likely differ between HR and C lines or between the sexes. We compared wheel running, cage activity, food consumption, and body composition between HR and C lines for young adults of both sexes, and examined interrelationships of those traits across 6 days of wheel access. Before wheel testing, HR mice weighed less than C, primarily due to reduced lean mass, and females were lighter than males, entirely due to lower lean mass. Over 6 days of wheel access, all groups tended to gain small amounts of lean mass, but lose fat mass. HR mice lost less fat than C mice, in spite of much higher activity levels, resulting in convergence to a fat mass of ~1.7 g for all 4 groups. HR mice consumed more food than C mice (with body mass as a covariate), even accounting for their higher activity levels. No significant sex-by-linetype interactions were observed for any of the foregoing traits. Structural equation models showed that the four sex-by-linetype groups differed considerably in the complex phenotypic architecture of these traits. Interrelationships among traits differed by genetic background and sex, lending support to the idea that recommendations regarding weight management, diet, and exercise may need to be tailored to the individual level. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

    Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV) is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001) SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further
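The coding-versus-noncoding reactivity comparison reported above amounts to a rank-based test between two sets of per-position SHAPE reactivities. A from-scratch Mann-Whitney U statistic is sketched below on made-up toy reactivities, not HIV-1 data.

```python
# Illustrative sketch of the kind of comparison reported above: testing
# whether noncoding (regulatory) positions have lower SHAPE reactivities
# than coding positions. U near 0 means sample_a is stochastically smaller.

def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for sample_a vs sample_b (midranks for ties)."""
    pooled = sorted(sample_a + sample_b)
    def midrank(v):
        lo = pooled.index(v)                 # first occurrence (0-based)
        hi = lo + pooled.count(v) - 1        # last occurrence
        return (lo + hi) / 2 + 1             # 1-based average rank
    rank_sum_a = sum(midrank(v) for v in sample_a)
    n_a = len(sample_a)
    return rank_sum_a - n_a * (n_a + 1) / 2  # U for sample_a
```

In practice one would also compute a p-value from U (exactly or via a normal approximation); the statistic alone is shown here for brevity.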

  18. Expression and genomic analysis of midasin, a novel and highly conserved AAA protein distantly related to dynein

    Directory of Open Access Journals (Sweden)

    Gibbons I R

    2002-07-01

    Full Text Available Abstract Background The largest open reading frame in the Saccharomyces genome encodes midasin (MDN1p, YLR106p), an AAA ATPase of 560 kDa that is essential for cell viability. Orthologs of midasin have been identified in the genome projects for Drosophila, Arabidopsis, and Schizosaccharomyces pombe. Results Midasin is present as a single-copy gene encoding a well-conserved protein of ~600 kDa in all eukaryotes for which data are available. In humans, the gene maps to 6q15 and encodes a predicted protein of 5596 residues (632 kDa). Sequence alignments of midasin from humans, yeast, Giardia and Encephalitozoon indicate that its domain structure comprises an N-terminal domain (35 kDa), followed by an AAA domain containing six tandem AAA protomers (~30 kDa each), a linker domain (260 kDa), an acidic domain (~70 kDa) containing 35–40% aspartate and glutamate, and a carboxy-terminal M-domain (30 kDa) that possesses MIDAS sequence motifs and is homologous to the I-domain of integrins. Expression of hemagglutinin-tagged midasin in yeast demonstrates a polypeptide of the anticipated size that is localized principally in the nucleus. Conclusions The highly conserved structure of midasin in eukaryotes, taken in conjunction with its nuclear localization in yeast, suggests that midasin may function as a nuclear chaperone and be involved in the assembly/disassembly of macromolecular complexes in the nucleus. The AAA domain of midasin is evolutionarily related to that of dynein, but it appears to lack a microtubule-binding site.

  19. Toxicity and cosmetic result of partial breast high-dose-rate interstitial brachytherapy for conservatively operated early breast cancer

    International Nuclear Information System (INIS)

    Xiu Xia; Tripuraneni Prabhakar; Giap Huan; Lin Ray; Chu Colin

    2007-01-01

    Objective: To study the method, side effects and cosmetic outcome of high-dose-rate (HDR) accelerated partial breast interstitial irradiation (APBI) alone in early stage breast cancer after conservative surgery. Methods: From February 2002 to June 2003, 47 breast cancer lesions in 46 patients with stage I/II breast cancer were treated with HDR ¹⁹²Ir APBI after conservative surgery. All patients were over 40 years old, with T1-2N0-1 (≤3 lymph nodes positive) disease and surgical margins > 1-2 mm; patients with lobular or inflammatory breast cancer were excluded. HDR brachytherapy with 34 Gy in 10 fractions over 5 days was delivered after surgery, and toxicity and cosmetic outcome were assessed at one month, 6 months and 12 months. Results: Follow-up of 18-46 months (median 34 months) was carried out for the whole group. During treatment, acute reactions (erythema, edema, tenderness and infection), all of grade I-II and none of grade III-IV, were observed in 21 patients (46%). Late toxicity (skin fibrosis, breast tenderness, fat necrosis and telangiectasia) was observed in 20 patients (43%) in total, with grade III reactions in 2 patients, one of whom had received 6 cycles of chemotherapy. Cosmetic outcome was rated excellent or good in 95% of patients at 6 months and 98% at 12 months, and there was no recurrence. Conclusions: Excellent and favorable cosmetic results are noted after APBI by interstitial brachytherapy alone, and acute and late reactions are few. Long-term observation is necessary to establish the local control rate. (authors)

  20. Phase-coded multi-pulse technique for ultrasonic high-order harmonic imaging of biological tissues in vitro

    International Nuclear Information System (INIS)

    Ma Qingyu; Zhang Dong; Gong Xiufen; Ma Yong

    2007-01-01

    Second or higher order harmonic imaging shows significant improvement in image clarity but is degraded by a low signal-to-noise ratio (SNR) compared with fundamental imaging. This paper presents a phase-coded multi-pulse technique to enhance the SNR for the desired high-order harmonic in ultrasonic imaging. In this technique, with N phase-coded pulses as excitation, the received Nth harmonic signal is enhanced by 20 log10(N) dB compared with that in the single-pulse mode, whereas the fundamental and other harmonic components are efficiently suppressed to reduce image confusion. The principle of this technique is discussed theoretically on the basis of finite-amplitude sound wave theory, and examined by measurements of the axial and lateral beam profiles as well as the phase shift of the harmonics. In experimental imaging of two biological tissue specimens, a plane piston source at 2 MHz is used to transmit a sequence of multiple pulses with equidistant phase shifts. The second to fifth harmonic images are obtained using this technique with N = 2 to 5, and compared with the images obtained at the fundamental frequency. The results demonstrate that this higher-order-harmonic technique provides better resolution and contrast in ultrasonic images.
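The selection rule behind this technique can be checked numerically: if the N transmit phases step by 2π/N, the nth harmonic of pulse k acquires phase 2πnk/N, so summing the N echoes retains only harmonics with n a multiple of N (amplified N-fold, i.e. 20 log10(N) dB) and cancels the rest. The echo model below, harmonic n at amplitude 1/n, is an assumed toy signal, not measured data.

```python
# Numerical check of the harmonic selection rule of the phase-coded
# multi-pulse technique, on a synthetic (assumed) multi-harmonic echo.
import numpy as np

def summed_spectrum_peak(order, N, f0=2e6, fs=64e6, cycles=64):
    """Amplitude of the `order`-th harmonic after summing N phase-coded echoes."""
    t = np.arange(int(cycles * fs / f0)) / fs
    total = np.zeros_like(t)
    for k in range(N):
        phi = 2 * np.pi * k / N                      # transmit phase of pulse k
        # toy echo: harmonic n appears with phase n*phi and amplitude 1/n
        total += sum((1 / n) * np.cos(2 * np.pi * n * f0 * t + n * phi)
                     for n in range(1, 6))
    spectrum = np.abs(np.fft.rfft(total)) / len(t) * 2   # per-bin amplitude
    bin_ = int(round(order * f0 * len(t) / fs))
    return spectrum[bin_]
```

With N = 3, the third harmonic (toy amplitude 1/3 per pulse) sums coherently to amplitude 1, while the fundamental and second harmonic cancel to numerical noise.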

  1. Out-of-Core Computations of High-Resolution Level Sets by Means of Code Transformation

    DEFF Research Database (Denmark)

    Christensen, Brian Bunch; Nielsen, Michael Bang; Museth, Ken

    2012-01-01

    We propose a storage efficient, fast and parallelizable out-of-core framework for streaming computations of high resolution level sets. The fundamental techniques are skewing and tiling transformations of streamed level set computations which allow for the combination of interface propagation, re...... computations are now CPU bound and consequently the overall performance is unaffected by disk latency and bandwidth limitations. We demonstrate this with several benchmark tests that show sustained out-of-core throughputs close to that of in-core level set simulations....
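The tiling idea can be illustrated on a 1D three-point stencil, a stand-in for one propagation step of a streamed computation: each tile is processed with a one-cell halo so that only one tile need be resident "in core" at a time, and the result matches the in-core computation exactly. This is a didactic sketch of tiling only, not the paper's skewing/tiling framework.

```python
# Toy illustration of out-of-core tiling for a streamed stencil update.

def step_in_core(phi):
    """One smoothing step: average of each cell with its neighbours."""
    n = len(phi)
    return [(phi[max(i - 1, 0)] + phi[i] + phi[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def step_tiled(phi, tile=4):
    """Same step, computed tile-by-tile with one-cell halos."""
    n, out = len(phi), []
    for start in range((0), n, tile):
        stop = min(start + tile, n)
        lo, hi = max(start - 1, 0), min(stop + 1, n)   # tile plus halo
        halo = phi[lo:hi]                              # only data "in core"
        for i in range(start, stop):
            j = i - lo
            left = halo[j - 1] if i > 0 else halo[j]
            right = halo[j + 1] if i < n - 1 else halo[j]
            out.append((left + halo[j] + right) / 3)
    return out
```

Skewing extends this idea across multiple time steps so that several propagation steps can be fused per tile, which is what makes the streamed computation CPU bound rather than disk bound.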

  2. A high fidelity model and code generator for the simulation of BOP systems

    International Nuclear Information System (INIS)

    Galen, S.; Vinay, M.

    1993-01-01

    TOPMERET represents a significant advance in the modelling fidelity of Balance of Plant (BOP) systems. It is extremely flexible and can accommodate a variety of systems, including main steam, feedwater, turbine, condenser, offgas, large volumes such as the containment, and water systems such as service water. It handles both normal and abnormal operating scenarios, including pipe break accidents. It was tested successfully on various simulators, and meets the fidelity required of BOP system models so as to integrate successfully with the high level of control automation of European designs. (Z.S.) 1 ref

  3. Methods for Ensuring High Quality of Coding of Cause of Death. The Mortality Register to Follow Southern Urals Populations Exposed to Radiation.

    Science.gov (United States)

    Startsev, N; Dimov, P; Grosche, B; Tretyakov, F; Schüz, J; Akleyev, A

    2015-01-01

    To follow up populations exposed to several radiation accidents in the Southern Urals, a cause-of-death registry was established at the Urals Center, capturing deaths in the Chelyabinsk, Kurgan and Sverdlovsk regions since 1950. When registering deaths over such a long period, measures need to be in place to maintain coding quality and to reduce the impact of individual coders as well as of quality changes in death certificates. To ensure uniformity of coding, a method for semi-automatic coding was developed, which is described here. Briefly, the method is based on a dynamic thesaurus, database-supported coding and parallel coding by two different individuals. A comparison of the proposed method for organizing the coding process with the common coding procedure showed good agreement, reaching 70-90% agreement for the three-digit ICD-9 rubrics at the end of the coding process. The semi-automatic method ensures a sufficiently high quality of coding while at the same time providing an opportunity to reduce the labor intensity inherent in the creation of large-volume cause-of-death registries.
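The parallel-coding quality check described above can be sketched as an agreement score at the three-digit ICD-9 rubric level, the level at which the registry reports 70-90% agreement. The certificates and codes below are invented examples.

```python
# Sketch of scoring inter-coder agreement on 3-digit ICD-9 rubrics.

def three_digit_rubric(icd9_code):
    """'410.9' and '410.1' both fall in rubric '410'."""
    return icd9_code.split(".")[0]

def rubric_agreement(coder_a, coder_b):
    """Fraction of records where both coders chose the same 3-digit rubric."""
    assert len(coder_a) == len(coder_b)
    same = sum(three_digit_rubric(a) == three_digit_rubric(b)
               for a, b in zip(coder_a, coder_b))
    return same / len(coder_a)
```

In the registry workflow, records falling below an agreement threshold would be routed back for adjudication against the dynamic thesaurus; that step is omitted here.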

  4. Establishment of a JSME code for the evaluation of high-cycle thermal fatigue in mixing tees

    International Nuclear Information System (INIS)

    Moriya, Shoichi; Fukuda, Toshihiko; Matsunaga, Tomoya; Hirayama, Hiroshi; Shiina, Kouji; Tanimoto, Koichi

    2004-01-01

    This paper describes a JSME code for the evaluation of high-cycle thermal fatigue caused by thermal striping in mixing tees with hot and cold water flows. The evaluation of thermal striping in a mixing tee has four steps, screening design parameters one by one according to the severity of the thermal load assessed from design conditions using several evaluation charts. In order to produce these charts, visualization tests with acrylic pipes and temperature measurement tests with metal pipes were conducted. The influence of the configuration of the mixing tee, flow velocity ratio, pipe diameter ratio and so on was examined from the results of the experiments. This paper briefly describes the process of developing these charts. (author)

  5. Quasi-optical converters for high-power gyrotrons: a brief review of physical models, numerical methods and computer codes

    International Nuclear Information System (INIS)

    Sabchevski, S; Zhelyazkov, I; Benova, E; Atanassov, V; Dankov, P; Thumm, M; Arnold, A; Jin, J; Rzesnicki, T

    2006-01-01

    Quasi-optical (QO) mode converters are used to transform electromagnetic waves of complex structure and polarization generated in gyrotron cavities into a linearly polarized, Gaussian-like beam suitable for transmission. The efficiency of this conversion, as well as the maintenance of a low level of diffraction losses, is crucial for the implementation of powerful gyrotrons as radiation sources for electron-cyclotron-resonance heating of fusion plasmas. The use of adequate physical models, efficient numerical schemes and up-to-date computer codes can provide the high accuracy necessary for the design and analysis of these devices. In this review, we briefly sketch the most commonly used QO converters, the mathematical basis on which they are treated, and the basic features of the numerical schemes used. Further on, we discuss the applicability of several commercially available and free software packages, their advantages and drawbacks, for solving QO-related problems

  6. MORECA: A computer code for simulating modular high-temperature gas-cooled reactor core heatup accidents

    International Nuclear Information System (INIS)

    Ball, S.J.

    1991-10-01

    The design features of the modular high-temperature gas-cooled reactor (MHTGR) have the potential to make it essentially invulnerable to damage from postulated core heatup accidents. This report describes the ORNL MORECA code, which was developed for analyzing postulated long-term core heatup scenarios in which the active cooling systems used to remove afterheat following the accident can be assumed to be unavailable. Simulations of long-term loss-of-forced-convection accidents, both with and without depressurization of the primary coolant, have shown that maximum core temperatures stay below the point at which any significant fuel failures and fission product releases are expected. Sensitivity studies have also been done to determine the effects of errors in the predictions due both to uncertainties in the modeling and to the assumptions about operational parameters. MORECA models the US Department of Energy reference design of a standard MHTGR

  7. Benchmarking of 3D space charge codes using direct phase space measurements from photoemission high voltage dc gun

    Directory of Open Access Journals (Sweden)

    Ivan V. Bazarov

    2008-10-01

    We present a comparison between space charge calculations and direct measurements of the transverse phase space of space charge dominated electron bunches from a high voltage dc photoemission gun followed by an emittance compensation solenoid magnet. The measurements were performed using a double-slit emittance measurement system over a range of bunch charge and solenoid current values. The data are compared with detailed simulations using the 3D space charge codes GPT and Parmela3D. The initial particle distributions were generated from measured transverse and temporal laser beam profiles at the photocathode. The beam brightness as a function of beam fraction is calculated for the measured phase space maps and found to approach within a factor of 2 the theoretical maximum set by the thermal energy and the accelerating field at the photocathode.
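Phase-space maps of the kind compared above are conventionally summarized by the rms emittance, from which brightness follows. The sketch below computes the standard statistical rms emittance from sampled (x, x') coordinates; it is illustrative only, not code from GPT or Parmela3D, and the Gaussian bunch parameters are invented.

```python
import numpy as np

def rms_emittance(x, xp):
    """eps_rms = sqrt(<x^2><x'^2> - <x x'>^2), with centroids removed."""
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt((x**2).mean() * (xp**2).mean() - ((x * xp).mean()) ** 2)

# Illustrative correlated Gaussian bunch: the linear x-x' correlation
# (removable by the solenoid) does not contribute to the rms emittance;
# only the uncorrelated divergence spread does.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0e-3, 100_000)              # position, m
xp = 0.5 * x + rng.normal(0.0, 1.0e-4, 100_000)   # divergence, rad
eps = rms_emittance(x, xp)
print(f"rms emittance ~ {eps:.2e} m*rad")
```

For this bunch the expected value is sigma_x times the uncorrelated divergence spread, i.e. about 1e-7 m*rad, up to sampling noise.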

  8. Distinct activation phenotype of a highly conserved novel HLA-B57-restricted epitope during dengue virus infection.

    Science.gov (United States)

    Townsley, Elizabeth; Woda, Marcia; Thomas, Stephen J; Kalayanarooj, Siripen; Gibbons, Robert V; Nisalak, Ananda; Srikiatkhachorn, Anon; Green, Sharone; Stephens, Henry A F; Rothman, Alan L; Mathew, Anuja

    2014-01-01

    Variation in the sequence of T-cell epitopes between dengue virus (DENV) serotypes is believed to alter memory T-cell responses during secondary heterologous infections. We identified a highly conserved, novel, HLA-B57-restricted epitope on the DENV NS1 protein. We predicted higher frequencies of B57-NS1(26-34)-specific CD8(+) T cells in peripheral blood mononuclear cells from individuals undergoing secondary rather than primary DENV infection. However, high tetramer-positive T-cell frequencies during acute infection were seen in only one of nine subjects with secondary infection. B57-NS1(26-34)-specific and other DENV epitope-specific CD8(+) T cells, as well as total CD8(+) T cells, expressed an activated phenotype (CD69(+) and/or CD38(+)) during acute infection. In contrast, expression of CD71 was largely limited to DENV epitope-specific CD8(+) T cells. In vitro stimulation of cell lines indicated that CD71 expression was differentially sensitive to stimulation by homologous and heterologous variant peptides. CD71 may represent a useful marker of antigen-specific T-cell activation. © 2013 John Wiley & Sons Ltd.

  9. Implementation of Energy Code Controls Requirements in New Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hart, Philip R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hatten, Mike [Solarc Energy Group, LLC, Seattle, WA (United States); Jones, Dennis [Group 14 Engineering, Inc., Denver, CO (United States); Cooper, Matthew [Group 14 Engineering, Inc., Denver, CO (United States)

    2017-03-24

    Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement, and verification is beyond the expertise of most building code officials, yet the assumption in studies that measure the savings from energy codes is that they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high-impact controls requirements included in commercial energy codes are properly designed, commissioned and implemented in new buildings. This study also evaluates the degree to which these controls requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to get a better understanding of their activities as they relate to energy-code-required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code-required control measures are being designed, commissioned, correctly implemented and functioning in new buildings. The third step involved compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.

  10. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    International Nuclear Information System (INIS)

    Valach, M.; Zymak, J.; Svoboda, R.

    1997-01-01

    This paper presents the development status of the computer codes for modelling the thermomechanical behavior of WWER fuel elements under high burnup conditions at the Nuclear Research Institute Rez. The emphasis is on the analysis of the results from the parametric calculations performed with the programmes PIN-W and RODQ2D, rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, including the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs

  11. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    Energy Technology Data Exchange (ETDEWEB)

    Valach, M; Zymak, J; Svoboda, R [Nuclear Research Inst. Rez plc, Rez (Czech Republic)

    1997-08-01

    This paper presents the development status of the computer codes for modelling the thermomechanical behavior of WWER fuel elements under high burnup conditions at the Nuclear Research Institute Rez. The emphasis is on the analysis of the results from the parametric calculations performed with the programmes PIN-W and RODQ2D, rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, including the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs.

  12. The development of high performance numerical simulation code for transient groundwater flow and reactive solute transport problems based on local discontinuous Galerkin method

    International Nuclear Information System (INIS)

    Suzuki, Shunichi; Motoshima, Takayuki; Naemura, Yumi; Kubo, Shin; Kanie, Shunji

    2009-01-01

    The authors develop a numerical code based on the local discontinuous Galerkin method for transient groundwater flow and reactive solute transport problems, in order to make it possible to carry out three-dimensional performance assessments of radioactive waste repositories at the earliest possible stage. The local discontinuous Galerkin method is a mixed finite element method, which is more accurate than standard finite element methods. In this paper, the developed numerical code is applied to several problems for which analytical solutions are available, in order to examine its accuracy and flexibility. The results of the simulations show that the new code gives highly accurate numerical solutions. (author)

  13. A Web-Based GIS for Reporting Water Usage in the High Plains Underground Water Conservation District

    Science.gov (United States)

    Jia, M.; Deeds, N.; Winckler, M.

    2012-12-01

    The High Plains Underground Water Conservation District (HPWD) is the largest and oldest of the Texas water conservation districts, and oversees approximately 1.7 million irrigated acres. Recent rule changes have motivated HPWD to develop a more automated system to allow owners and operators to report well locations, meter locations, meter readings, the association between meters and wells, and contiguous acres. INTERA, Inc. has developed a web-based interactive system for HPWD water users to report water usage and for the district to better manage its water resources. The HPWD web management system utilizes state-of-the-art GIS techniques, including cloud-based Amazon EC2 virtual machines, ArcGIS Server, ArcSDE and ArcGIS Viewer for Flex, to support web-based water use management. The system enables users to navigate to their area of interest on a well-established base map and perform a variety of operations and inquiries against their spatial features. The application currently has six components: user privilege management, property management, water meter registration, area registration, meter-well association and water use reporting. The system is composed of two main databases: a spatial database and a non-spatial database. With the Adobe Flex application at the front end and ArcGIS Server as middleware, updates to spatial feature geometry and attributes are reflected immediately in the back end. As a result, property owners, along with the HPWD staff, collaborate to weave the fabric of the spatial database. Interactions between the spatial and non-spatial databases are established by Windows Communication Foundation (WCF) services to record water-use reports, user-property associations, owner-area associations, as well as meter-well associations. Mobile capabilities will be enabled in the near future for field workers to collect data and synchronize it to the spatial database. The entire solution is built on a highly scalable cloud

  14. Conservation Value

    OpenAIRE

    Tisdell, Clement A.

    2010-01-01

    This paper outlines the significance of the concept of conservation value and discusses ways in which it is determined paying attention to views stemming from utilitarian ethics and from deontological ethics. The importance of user costs in relation to economic decisions about the conservation and use of natural resources is emphasised. Particular attention is given to competing views about the importance of conserving natural resources in order to achieve economic sustainability. This then l...

  15. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies this hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
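For contrast with the weaker decipherability conditions studied in this work, the classical unique-decipherability property of a finite code can be decided by the Sardinas-Patterson algorithm. A compact sketch (a standard textbook algorithm, not code from the paper):

```python
def is_uniquely_decipherable(code):
    """Sardinas-Patterson test: True iff the finite code (a set of strings) is UD."""
    # S1: dangling suffixes left when one codeword is a proper prefix of another
    current = {v[len(u):] for u in code for v in code if u != v and v.startswith(u)}
    seen = set()
    while current:
        if current & code:       # a codeword is itself a dangling suffix -> not UD
            return False
        seen |= current
        nxt = set()
        for w in current:
            for u in code:
                if u.startswith(w):
                    nxt.add(u[len(w):])
                if w.startswith(u):
                    nxt.add(w[len(u):])
        current = nxt - seen     # the suffix sets are finite, so this terminates
    return True

print(is_uniquely_decipherable({"0", "10", "11"}))   # prefix code -> True
print(is_uniquely_decipherable({"0", "01", "10"}))   # "010" is ambiguous -> False
```

The second example is not UD because "010" factorizes both as "0"+"10" and as "01"+"0"; in the paper's terms, such a code may still admit a non-trivial coding partition.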

  16. Screening and Identification of putative long non coding RNAs from transcriptome data of a high yielding blackgram (Vigna mungo, Cv. T9)

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar Singh

    2018-04-01

    Blackgram (Vigna mungo) is one of the primary legumes cultivated throughout India, Cv. T9 being one of its common high-yielding cultivars. This article reports RNA sequencing data and a pipeline for the prediction of novel long non-coding RNAs from the sequenced data. The raw data generated during sequencing are available at the Sequence Read Archive (SRA) of NCBI under accession number SRX1558530. Keywords: Blackgram, Long non-coding RNA, Legumes, RNA sequencing data
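A typical first screening step in lncRNA prediction pipelines of this kind keeps transcripts that are at least 200 nt long but lack a substantial open reading frame. The sketch below illustrates only that filter; the thresholds and function names are illustrative, and a full pipeline such as the article's also uses homology searches and coding-potential tools.

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def longest_orf_codons(seq):
    """Longest forward-strand ORF (ATG..stop), counted in coding codons before the stop."""
    best = 0
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for k, codon in enumerate(codons):
            if start is None and codon == "ATG":
                start = k
            elif start is not None and codon in STOP_CODONS:
                best = max(best, k - start)   # codons from ATG up to the stop
                start = None
    return best

def is_putative_lncrna(seq, min_len=200, max_orf_codons=100):
    """Length/ORF filter: long transcript with no substantial ORF."""
    return len(seq) >= min_len and longest_orf_codons(seq) < max_orf_codons

print(longest_orf_codons("ATGAAATAA"))                  # ATG + AAA before the stop -> 2
print(is_putative_lncrna("ATGAAATAA" + "C" * 200))      # long, ORF-poor -> True
print(is_putative_lncrna("ATG" + "AAA" * 120 + "TAA"))  # 121-codon ORF -> False
```

Surviving candidates would then be screened against protein and domain databases before being called putative lncRNAs.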

  17. Extension of the reactor dynamics code MGT-3D for pebblebed and blocktype high-temperature-reactors

    International Nuclear Information System (INIS)

    Shi, Dunfu

    2015-01-01

    The High Temperature Gas cooled Reactor (HTGR) is an improved, gas-cooled nuclear reactor, chosen as one of the candidates for Generation IV nuclear plants [1]. The reactor shuts down automatically in design-basis accidents because of the negative reactivity feedback from the temperature increase. It is graphite moderated and helium cooled. During an accident, the residual heat can be transferred out of the reactor core by passive means such as conduction, convection, and thermal radiation, so that the fuel temperature does not exceed the limit at which major fission product release begins. In this thesis, the coupled neutronics and fluid mechanics code MGT-3D, used for steady-state and time-dependent simulation of HTGRs, is enhanced and validated [2]. The fluid mechanics part is validated against SANA experiments in both steady-state and transient cases. The fuel temperature calculation is optimized by solving the heat conduction equation of the coated particles. It is applied in steady-state and transient simulations of the PBMR, and the results are compared to simulations with the old overheating model. New approaches for calculating the temperature profile of the fuel element of block-type HTGRs and the homogeneous conductivity of composite materials are introduced. With these new developments, MGT-3D is able to simulate block-type HTGRs as well. The extended MGT-3D is used to simulate a cuboid ceramic block heating experiment in the NACOK-II facility. It is also applied to LOFC and DLOFC simulations of the GT-MHR, a fluid mechanics calculation with a given heat source; this MGT-3D result is verified against the calculation results of other codes. The design of the Japanese HTTR is introduced. The deterministic simulation of the LOFC experiment of the HTTR is conducted with the Monte-Carlo code Serpent and MGT-3D, as part of the LOFC Project organized by OECD/NEA [3]. With Serpent the burnup

  18. A study of fuel failure behavior in high burnup HTGR fuel. Analysis by STRESS3 and STAPLE codes

    International Nuclear Information System (INIS)

    Martin, David G.; Sawa, Kazuhiro; Ueta, Shouhei; Sumita, Junya

    2001-05-01

    In current high temperature gas-cooled reactors (HTGRs), tri-isotropic coated fuel particles are employed as fuel. In the safety design of HTGR fuels, it is important to retain fission products within the particles so that their release to the primary coolant does not exceed an acceptable level. From this point of view, the basic design criteria for the fuel are to minimize the failure fraction of as-fabricated fuel coating layers and to prevent significant additional fuel failures during operation. This report attempts to model fuel behavior in irradiation tests using the U.K. codes STRESS3 and STAPLE. Test results from the 91F-1A and HRB-22 capsule irradiation tests, which were carried out at the Japan Materials Testing Reactor of JAERI and at the High Flux Isotope Reactor of Oak Ridge National Laboratory, respectively, were employed in the calculation. The maximum burnup and fast neutron fluence were about 10% FIMA and 3 x 10^25 m^-2, respectively. The fuel for the irradiation tests was called high burnup fuel, whose target burnup and fast neutron fluence were higher than those of the first-loading fuel of the High Temperature Engineering Test Reactor. The calculation results demonstrated that if only mean fracture stress values of PyC and SiC are used in the calculation, it is not possible to predict any particle failures, by which is meant that all three load-bearing layers have failed. By contrast, when statistical variations in the fracture stresses and particle specifications are taken into account, as is done in the STAPLE code, failures can be predicted. In the HRB-22 irradiation test, it was concluded that the first two particles that failed were defective in some way, but that the third and fourth failures can be accounted for by the pressure vessel model. In the 91F-1A irradiation test, the results showed that one or two particles failed towards the end of irradiation in the upper capsule and no particles failed in the lower capsule. (author)
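The report's central observation, that a deterministic mean-strength calculation predicts no failures while a statistical treatment like STAPLE's does, can be illustrated with a toy Weibull pressure-vessel calculation. All parameter values and names below are invented for illustration; they are not STAPLE inputs or outputs.

```python
import math
import random

def weibull_failure_fraction(applied_mpa, char_strength_mpa, weibull_m,
                             n_particles=100_000, seed=1):
    """Monte Carlo fraction of particles whose sampled fracture stress
    falls below the applied (pressure-vessel) stress."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_particles):
        u = rng.random()
        # Inverse-CDF sample of a two-parameter Weibull fracture stress
        strength = char_strength_mpa * (-math.log(1.0 - u)) ** (1.0 / weibull_m)
        if strength < applied_mpa:
            failures += 1
    return failures / n_particles

# Load well below the characteristic strength: a mean-value calculation
# predicts zero failures, but the Weibull tail still yields a small fraction.
frac = weibull_failure_fraction(applied_mpa=200.0, char_strength_mpa=500.0,
                                weibull_m=6.0)
print(f"predicted failure fraction: {frac:.4f}")
```

Analytically the expected fraction is 1 - exp(-(200/500)^6), roughly 0.4%: small, but non-zero, which is the qualitative point of the statistical treatment.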

  19. Loss of a highly conserved sterile alpha motif domain gene (WEEP) results in pendulous branch growth in peach trees.

    Science.gov (United States)

    Hollender, Courtney A; Pascal, Thierry; Tabb, Amy; Hadiarto, Toto; Srinivasan, Chinnathambi; Wang, Wanpeng; Liu, Zhongchi; Scorza, Ralph; Dardick, Chris

    2018-05-15

    Plant shoots typically grow upward in opposition to the pull of gravity. However, exceptions exist throughout the plant kingdom. Most conspicuous are trees with weeping or pendulous branches. While such trees have long been cultivated and appreciated for their ornamental value, the molecular basis behind the weeping habit is not known. Here, we characterized a weeping tree phenotype in Prunus persica (peach) and identified the underlying genetic mutation using a genomic sequencing approach. Weeping peach tree shoots exhibited a downward elliptical growth pattern and did not exhibit an upward bending in response to 90° reorientation. The causative allele was found to be an uncharacterized gene, Ppa013325, having a 1.8-kb deletion spanning the 5' end. This gene, dubbed WEEP, was predominantly expressed in phloem tissues and encodes a highly conserved 129-amino acid protein containing a sterile alpha motif (SAM) domain. Silencing WEEP in the related tree species Prunus domestica (plum) resulted in more outward, downward, and wandering shoot orientations compared to standard trees, supporting a role for WEEP in directing lateral shoot growth in trees. This previously unknown regulator of branch orientation, which may also be a regulator of gravity perception or response, provides insights into our understanding of how tree branches grow in opposition to gravity and could serve as a critical target for manipulating tree architecture for improved tree shape in agricultural and horticultural applications. Copyright © 2018 the Author(s). Published by PNAS.

  20. Neuronal patterning of the tubular collar cord is highly conserved among enteropneusts but dissimilar to the chordate neural tube.

    Science.gov (United States)

    Kaul-Strehlow, Sabrina; Urata, Makoto; Praher, Daniela; Wanninger, Andreas

    2017-08-01

    A tubular nervous system is present in the deuterostome groups Chordata (cephalochordates, tunicates, vertebrates) and in the non-chordate Enteropneusta. However, the worm-shaped enteropneusts possess a less complex nervous system featuring only a short hollow neural tube, whereby homology to its chordate counterpart remains elusive. Since the majority of data on enteropneusts stem from the harrimaniid Saccoglossus kowalevskii, putative interspecific variations remain undetected resulting in an unreliable ground pattern that impedes homology assessments. In order to complement the missing data from another enteropneust family, we investigated expression of key neuronal patterning genes in the ptychoderid Balanoglossus misakiensis. The collar cord of B. misakiensis shows anterior Six3/6 and posterior Otx + Engrailed expression, in a region corresponding to the chordate brain. Neuronal Nk2.1/Nk2.2 expression is absent. Interestingly, we found median Dlx and lateral Pax6 expression domains, i.e., a condition that is reversed compared to chordates. Comparative analyses reveal that adult nervous system patterning is highly conserved among the enteropneust families Harrimaniidae, Spengelidae and Ptychoderidae. BmiDlx and BmiPax6 have no corresponding expression domains in the chordate brain, which may be indicative of independent acquisition of a tubular nervous system in Enteropneusta and Chordata.

  1. Assessment of some high-order finite difference schemes on the scalar conservation law with periodical conditions

    Directory of Open Access Journals (Sweden)

    Alina BOGOI

    2016-12-01

    Supersonic/hypersonic flows with strong shocks need special treatment in Computational Fluid Dynamics (CFD) in order to accurately capture the discontinuity location and its magnitude. To avoid numerical instabilities in the presence of discontinuities, the numerical schemes must generate low dissipation and low dispersion errors. Consequently, the algorithms used to calculate the time and space derivatives should exhibit low amplitude and phase errors. This paper focuses on the comparison of the numerical results obtained from simulations with several high-resolution numerical schemes applied to linear and non-linear one-dimensional conservation laws. Analytical solutions are provided for all benchmark tests, considering smooth periodic conditions. All the schemes converge to the proper weak solution for a linear flux and smooth initial conditions. However, when the flux is non-linear, discontinuities may develop from smooth initial conditions and the shock must be correctly captured. All the schemes accurately identify the shock position, at the price of numerical oscillations in the vicinity of the sudden variation. We believe that identifying this purely numerical behavior, without physical relevance, in the 1D case is extremely useful for avoiding problems related to the stability and convergence of the solution in the general 3D case.
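As a minimal runnable illustration of the benchmark setting, a scalar conservation law with smooth periodic initial data that steepens into a shock, the sketch below applies a first-order Lax-Friedrichs scheme (chosen for brevity; the paper compares high-resolution schemes) to the inviscid Burgers equation. Because the scheme is written in conservation form, the discrete cell average of u is preserved exactly even after the shock forms.

```python
import numpy as np

def lax_friedrichs_burgers(nx=200, t_end=2.5, cfl=0.9):
    """Solve u_t + (u^2/2)_x = 0 on [0, 2*pi) with periodic boundary conditions."""
    x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
    dx = x[1] - x[0]
    u = 1.0 + 0.5 * np.sin(x)            # smooth periodic initial condition
    t = 0.0
    while t < t_end - 1e-12:
        dt = min(cfl * dx / np.abs(u).max(), t_end - t)   # CFL-limited step
        f = 0.5 * u**2                   # Burgers flux
        up, um = np.roll(u, -1), np.roll(u, 1)            # periodic neighbours
        fp, fm = np.roll(f, -1), np.roll(f, 1)
        u = 0.5 * (up + um) - dt / (2.0 * dx) * (fp - fm)
        t += dt
    return x, u

x, u = lax_friedrichs_burgers()
# By t = 2.5 the smooth profile has steepened into a shock, yet the
# conservative form keeps the discrete mean at its initial value of 1.
print("mean drift:", abs(u.mean() - 1.0))
```

Being monotone under the CFL condition, Lax-Friedrichs also introduces no spurious oscillations; the oscillations discussed in the paper are the price paid by higher-order schemes for their sharper shock resolution.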

  2. A Highly Conserved Aspartic Acid Residue of the Chitosanase from Bacillus sp. TS Is Involved in the Substrate Binding.

    Science.gov (United States)

    Zhou, Zhanping; Zhao, Shuangzhi; Liu, Yang; Chang, Zhengying; Ma, Yanhe; Li, Jian; Song, Jiangning

    2016-11-01

    The chitosanase from Bacillus sp. TS (CsnTS) is an enzyme belonging to glycoside hydrolase family 8. The sequence of CsnTS shares 98% identity with the chitosanase from Bacillus sp. K17. Crystallographic analysis and site-directed mutagenesis of the chitosanase from Bacillus sp. K17 identified the important residues involved in the catalytic interaction and substrate binding. However, despite progress in understanding the catalytic mechanism of chitosanases of family GH8, the functional roles of some residues that are highly conserved throughout this family have not been fully elucidated. This study focused on one of these residues, i.e., the aspartic acid residue at position 318. We found that, apart from asparagine, mutation of Asp318 resulted in significant loss of enzyme activity. In-depth investigations showed that mutation of this residue not only impaired enzymatic activity but also affected substrate binding. Taken together, our results show that Asp318 plays an important role in CsnTS activity.

  3. The highly conserved codon following the slippery sequence supports -1 frameshift efficiency at the HIV-1 frameshift site.

    Directory of Open Access Journals (Sweden)

    Suneeth F Mathew

    HIV-1 utilises -1 programmed ribosomal frameshifting to translate structural and enzymatic domains in a defined proportion required for replication. A slippery sequence, U UUU UUA, and a stem-loop are well-defined RNA features modulating -1 frameshifting in HIV-1. The GGG glycine codon immediately following the slippery sequence (the 'intercodon') contributes structurally to the start of the stem-loop but has no defined role in current models of the frameshift mechanism, as slippage is inferred to occur before the intercodon has reached the ribosomal decoding site. This GGG codon is highly conserved in natural isolates of HIV. When the natural intercodon was replaced with a stop codon, two different decoding molecules, eRF1 protein or a cognate suppressor tRNA, were able to access and decode the intercodon prior to -1 frameshifting. This implies significant slippage occurs when the intercodon is in the (perhaps distorted) ribosomal A site. We accommodate the influence of the intercodon in a model of frame maintenance versus frameshifting in HIV-1.

  4. The implications of RCRA [Resource Conservation and Recovery Act] regulation for the disposal of transuranic and high-level waste

    International Nuclear Information System (INIS)

    Sigmon, C.F.; Sharples, F.E.; Smith, E.D.

    1988-01-01

    In May of 1987 the Department of Energy (DOE) published a rule interpreting the definition of ''byproduct'' under the Atomic Energy Act. This byproduct rule clarified the role of the Resource Conservation and Recovery Act (RCRA) in the regulation of DOE's radioactive waste management activities. According to the rule, only the radioactive portion of DOE's mixed radioactive and hazardous waste (mixed waste), including mixed transuranic (TRU) and high-level waste (HLW), is exempt from RCRA under the byproduct exemption. The portion of a waste that is hazardous as defined by RCRA is subject to full regulation under RCRA. Because the radioactive and hazardous portions of many, if not most, DOE wastes are likely to be inseparable, the rule in effect makes most mixed wastes subject to dual regulation. The potential application of RCRA to facilities such as the Waste Isolation Pilot Plant (WIPP) and the HLW repository creates unique challenges for both the DOE and regulatory authorities. Strategies must be developed to assure compliance with RCRA without either causing excessive administrative burdens or abandoning the goal of minimizing radiation exposure. This paper will explore some of the potential regulatory options for and recent trends in the regulation of TRU and HLW under RCRA

  5. Highly conserved serine residue 40 in HIV-1 p6 regulates capsid processing and virus core assembly

    Directory of Open Access Journals (Sweden)

    Solbak Sara MØ

    2011-02-01

    Background: The HIV-1 p6 Gag protein regulates the final abscission step of nascent virions from the cell membrane through the action of two late assembly (L-) domains. Although p6 is located within one of the most polymorphic regions of the HIV-1 gag gene, the 52-amino-acid peptide binds at least two cellular budding factors (Tsg101 and ALIX), is a substrate for phosphorylation, ubiquitination, and sumoylation, and mediates the incorporation of the HIV-1 accessory protein Vpr into viral particles. As expected, known functional domains mostly overlap with several conserved residues in p6. In this study, we investigated the importance of the highly conserved serine residue at position 40, which until now has not been assigned to any known function of p6. Results: Consistent with previous data, we found that mutation of Ser-40 has no effect on ALIX-mediated rescue of HIV-1 L-domain mutants. However, the only feasible S40F mutation that preserves the overlapping pol open reading frame (ORF) reduces virus replication in T-cell lines and in human lymphocyte tissue cultivated ex vivo. Most intriguingly, L-domain-mediated virus release is not dependent on the integrity of Ser-40. However, the S40F mutation significantly reduces the specific infectivity of released virions. Further, it was observed that mutation of Ser-40 selectively interferes with the cleavage between capsid (CA) and the spacer peptide SP1 in Gag, without affecting cleavage of other Gag products. This deficiency in the processing of CA, in consequence, led to an irregular morphology of the virus core and the formation of an electron-dense extra core structure. Moreover, the defects induced by the S40F mutation in p6 can be rescued by the A1V mutation in SP1, which generally enhances processing of the CA-SP1 cleavage site. Conclusions: Overall, these data support a so far unrecognized function of p6 mediated by Ser-40 that occurs independently of the L-domain function, but selectively

  6. Development of safety analysis codes and experimental validation for a very high temperature gas-cooled reactor Final report

    International Nuclear Information System (INIS)

    Chang Oh

    2006-01-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperature to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR's higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. Research Objectives As described above, a pipe break may lead to significant fuel damage and fission product release in the VHTR. The objectives of this Korean/United States collaboration were to develop and validate advanced computational methods for VHTR safety analysis. The methods that have been developed are now

  7. Development of safety analysis codes and experimental validation for a very high temperature gas-cooled reactor Final report

    Energy Technology Data Exchange (ETDEWEB)

    Chang Oh

    2006-03-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperature to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR’s higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. Research Objectives As described above, a pipe break may lead to significant fuel damage and fission product release in the VHTR. The objectives of this Korean/United States collaboration were to develop and validate advanced computational methods for VHTR safety analysis. The methods that have been developed are now

  8. High conservation level of CD8(+) T cell immunogenic regions within an unusual H1N2 human influenza variant.

    Science.gov (United States)

    Komadina, Naomi; Quiñones-Parra, Sergio M; Kedzierska, Katherine; McCaw, James M; Kelso, Anne; Leder, Karin; McVernon, Jodie

    2016-10-01

    Current seasonal influenza vaccines require regular updates due to antigenic drift causing loss of effectiveness and therefore providing little or no protection against novel influenza A subtypes. Next generation vaccines capable of eliciting CD8(+) T cell (CTL) mediated cross-protective immunity may offer a long-term alternative strategy. However, measuring pre-existing levels of CTL cross-protection in humans is confounded by differences in infection histories across individuals. During 2000-2003, H1N2 viruses circulated persistently in the human population for the first time and we hypothesized that the viral nucleoprotein (NP) contained novel CTL epitopes that may have contributed to the survival of the viruses. This study describes the immunogenic NP peptides of H1N1, H2N2, and H3N2 influenza viruses isolated from humans over the past century, 1918-2003, by comparing this historical dataset to reference NP peptides from H1N2 that circulated in humans during 2000-2003. Observed peptide sequences ranged from highly conserved (15%) to highly variable (12%), with variation unrelated to reported immunodominance. No NP peptides exclusive to the H1N2 viruses were noted. However, the virus had inherited the NP from a recently emerged H3N2 variant containing novel peptides, which may have assisted its persistence. Any advantage due to this novelty was subsequently lost with emergence of a newer H3N2 variant in 2003. Our approach has potential to provide insight into the population context in which influenza viruses emerge, and may help to inform immunogenic peptide selection for CTL-inducing influenza vaccines. J. Med. Virol. 88:1725-1732, 2016. © 2016 Wiley Periodicals, Inc.
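The peptide-conservation comparison described above can be sketched as a simple per-column tally over aligned sequences. The aligned peptides below are invented placeholders, not real influenza NP data; real input would come from a multiple sequence alignment.

```python
# Classify aligned peptide positions as conserved or variable across strains.
def conservation_profile(aligned_peptides):
    """Fraction of sequences sharing the majority residue at each column."""
    length = len(aligned_peptides[0])
    assert all(len(p) == length for p in aligned_peptides)
    profile = []
    for col in range(length):
        residues = [p[col] for p in aligned_peptides]
        top = max(set(residues), key=residues.count)
        profile.append(residues.count(top) / len(residues))
    return profile

# Invented example "strains" sharing most, but not all, positions.
peptides = ["GEQYQQLR", "GEQYQQLR", "GEQFQQLR", "GEQYQHLR"]
profile = conservation_profile(peptides)
fully_conserved = sum(1 for f in profile if f == 1.0)
print(f"{fully_conserved}/{len(profile)} positions fully conserved")  # → 6/8
```

Thresholding the resulting per-column fractions is one straightforward way to separate "highly conserved" from "highly variable" peptides, as in the percentages quoted above.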

  9. An Extension to the Constructivist Coding Hypothesis as a Learning Model for Selective Feedback when the Base Rate Is High

    Science.gov (United States)

    Ghaffarzadegan, Navid; Stewart, Thomas R.

    2011-01-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…

  10. HEXEREI: a multi-channel heat conduction convection code for use in transient thermal hydraulic analysis of high-temperature, gas-cooled reactors. Interim report

    International Nuclear Information System (INIS)

    Giles, G.E.; DeVault, R.M.; Turner, W.D.; Becker, B.R.

    1976-05-01

    A description is given of the development and verification of a generalized coupled conduction-convection, multichannel heat transfer computer program to analyze specific safety questions involving high temperature gas-cooled reactors (HTGR). The HEXEREI code was designed to provide steady-state and transient heat transfer analysis of the HTGR active core using a basic hexagonal mesh and multichannel coolant flow. In addition, the core auxiliary cooling systems were included in the code to provide more complete analysis of the reactor system during accidents involving reactor trip and cooling down on the auxiliary systems. Included are brief descriptions of the components of the HEXEREI code and sample HEXEREI analyses compared with analytical solutions and other heat transfer codes
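The elementary building block behind a coupled conduction-convection code like this is the transient heat conduction update itself. The sketch below is a minimal explicit finite-difference solver for 1D conduction (dT/dt = alpha * d2T/dx2); hexagonal meshes, coolant channels and real material properties are all far beyond this toy, and every number in it is an assumed placeholder.

```python
import numpy as np

alpha, dx, dt = 1.0e-5, 0.01, 2.0     # assumed diffusivity (m^2/s), cell (m), step (s)
assert alpha * dt / dx**2 <= 0.5      # explicit stability criterion

T = np.full(50, 300.0)                # initial block temperature, K (made up)
T[0] = T[-1] = 600.0                  # fixed hot boundaries
for _ in range(1000):                 # march 2000 s of transient
    # Jacobi-style update: RHS is evaluated before the in-place add.
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

print(f"mid-block temperature after 2000 s: {T[25]:.1f} K")
```

A production code replaces this explicit update with implicit time stepping and adds convective coupling terms for each coolant channel, but the conservation-of-energy bookkeeping per cell is the same idea.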

  11. Surgical versus conservative treatment for high-risk stress fractures of the lower leg (anterior tibial cortex, navicular and fifth metatarsal base): a systematic review

    NARCIS (Netherlands)

    Mallee, W.H.; Weel, H.; van Dijk, C.N.; van Tulder, M.W.; Kerkhoffs, G.M.; Lin, C.W.C.

    2015-01-01

    Aim To compare surgical and conservative treatment for high-risk stress fractures of the anterior tibial cortex, navicular and proximal fifth metatarsal. Methods Systematic searches of CENTRAL, MEDLINE, EMBASE, CINAHL, SPORTDiscus and PEDro were performed to identify relevant prospective and

  12. Comparing the cost-effectiveness of water conservation policies in a depleting aquifer: A dynamic analysis of the Kansas High Plains

    Science.gov (United States)

    This research analyzes two groundwater conservation policies in the Kansas High Plains located within the Ogallala aquifer: 1) cost-share assistance to increase irrigation efficiency; and 2) incentive payments to convert irrigated crop production to dryland crop production. To compare the cost-effec...

  13. Spatially conserved regulatory elements identified within human and mouse Cd247 gene using high-throughput sequencing data from the ENCODE project

    DEFF Research Database (Denmark)

    Pundhir, Sachin; Hannibal, Tine Dahlbæk; Bang-Berthelsen, Claus Heiner

    2014-01-01

    In this study, we have utilized the wealth of high-throughput sequencing data produced during the Encyclopedia of DNA Elements (ENCODE) project to identify spatially conserved regulatory elements within the Cd247 gene from human and mouse. We show the presence of two transcription factor binding sites...

  14. Encryption of QR code and grayscale image in interference-based scheme with high quality retrieval and silhouette problem removal

    Science.gov (United States)

    Qin, Yi; Wang, Hongjuan; Wang, Zhipeng; Gong, Qiong; Wang, Danchen

    2016-09-01

    In optical interference-based encryption (IBE) schemes, currently available methods must employ iterative algorithms in order to encrypt two images and retrieve cross-talk-free decrypted images. In this paper, we show that this goal can be achieved via an analytical process if one of the two images is a QR code. For decryption, the QR code is decrypted in the conventional architecture and the retrieval has a noisy appearance. Nevertheless, the robustness of QR codes against noise enables accurate acquisition of the content from the noisy retrieval, so that the primary QR code can be exactly regenerated. Thereafter, a novel optical architecture is proposed to recover the grayscale image with the aid of the QR code. In addition, the proposal entirely eliminates the silhouette problem of previous IBE schemes, and its effectiveness and feasibility have been demonstrated by numerical simulations.
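The algebraic core that makes analytical (non-iterative) interference-based encryption possible is the exact split of a complex field into two phase-only masks. The sketch below shows only that step: Fresnel propagation, which a real optical implementation includes, is omitted, and the input is a random stand-in for a normalized QR-code image.

```python
import numpy as np

# Split a complex field c with |c| <= 2 into two phase-only masks whose sum is c:
# exp(i*t1) + exp(i*t2) = 2*cos(d)*exp(i*t) = c when t = arg(c), d = arccos(|c|/2).
rng = np.random.default_rng(0)
image = rng.random((4, 4))                                   # stand-in "QR" image
c = np.sqrt(image) * np.exp(2j * np.pi * rng.random((4, 4))) # attach random phase

theta = np.angle(c)
delta = np.arccos(np.abs(c) / 2.0)   # defined since |c| <= 1 <= 2 here
mask1 = theta + delta                # first phase-only mask
mask2 = theta - delta                # second phase-only mask

recovered = np.exp(1j * mask1) + np.exp(1j * mask2)
print(np.allclose(recovered, c))     # exact reconstruction by construction, → True
```

Because the decomposition is closed-form, no iteration is needed; the noise in a practical retrieval comes from the optical architecture, which the QR code's error correction then absorbs.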

  15. A Resource Conservation Unit.

    Science.gov (United States)

    Porter, Philip D.

    1979-01-01

    Describes a variety of learning activities for teaching elementary and junior high students about air, water, and energy conservation techniques. Suggests community resources, social studies objectives, language skills, and 20 activities. (CK)

  16. Model My Watershed: A high-performance cloud application for public engagement, watershed modeling and conservation decision support

    Science.gov (United States)

    Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.

    2017-12-01

    The Model My Watershed Web app (https://app.wikiwatershed.org/) and the BiG-CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization and analysis of geospatial data in an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. The BiG-CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrological Unit (HUC), or uploading a custom polygon. Both Web apps visualize and provide summary statistics of land use, soil groups, streams, climate and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. The BiG-CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, and also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.

  17. The maize INDETERMINATE1 flowering time regulator defines a highly conserved zinc finger protein family in higher plants

    Directory of Open Access Journals (Sweden)

    Colasanti Joseph

    2006-06-01

    Full Text Available Abstract Background The maize INDETERMINATE1 gene, ID1, is a key regulator of the transition to flowering and the founding member of a transcription factor gene family that encodes a protein with a distinct arrangement of zinc finger motifs. The zinc fingers and surrounding sequence make up the signature ID domain (IDD), which appears to be found in all higher plant genomes. The presence of zinc finger domains and previous biochemical studies showing that ID1 binds to DNA suggests that members of this gene family are involved in transcriptional regulation. Results Comparison of IDD genes identified in Arabidopsis and rice genomes, and all IDD genes discovered in maize EST and genomic databases, suggests that ID1 is a unique member of this gene family. High levels of sequence similarity amongst all IDD genes from maize, rice and Arabidopsis suggest that they are derived from a common ancestor. Several unique features of ID1 suggest that it is a divergent member of the maize IDD family. Although no clear ID1 ortholog was identified in the Arabidopsis genome, highly similar genes that encode proteins with identity extending beyond the ID domain were isolated from rice and sorghum. Phylogenetic comparisons show that these putative orthologs, along with maize ID1, form a group separate from other IDD genes. In contrast to ID1 mRNA, which is detected exclusively in immature leaves, several maize IDD genes showed a broad range of expression in various tissues. Further, Western analysis with an antibody that cross-reacts with ID1 protein and potential orthologs from rice and sorghum shows that all three proteins are detected in immature leaves only. Conclusion Comparative genomic analysis shows that the IDD zinc finger family is highly conserved among both monocots and dicots. The leaf-specific ID1 expression pattern distinguishes it from other maize IDD genes examined. A similar leaf-specific localization pattern was observed for the putative ID1 protein

  18. Software Project Management Plan for the Integrated Systems Code (ISC) of New Production Reactor -- Modular High Temperature Gas Reactor

    International Nuclear Information System (INIS)

    Taylor, D.

    1990-11-01

    The United States Department of Energy (DOE) has selected the Modular High Temperature Gas-Cooled Reactor (MHTGR) as one of the concepts for the New Production Reactor (NPR). DOE has also established several Technical Working Groups (TWG's) at the national laboratories to provide independent design confirmation of the NPR-MHTGR design. One of those TWG's is concerned with Thermal Fluid Flow (TFF) and analysis methods to provide independent design confirmation of the NPR-MHTGR. Analysis methods are also needed for operational safety evaluations, performance monitoring, sensitivity studies, and operator training. The TFF Program Plan includes, as one of its principal tasks, the development of a computer program (called the Integrated Systems Code, or ISC). This program will provide the needed long-term analysis capabilities for the MHTGR and its subsystems. This document presents the project management plan for development of the ISC. It includes the associated quality assurance tasks, and the schedule and resource requirements to complete these activities. The document conforms to the format of ANSI/IEEE Std. 1058.1-1987. 2 figs

  19. Expression of Mitochondrial Non-coding RNAs (ncRNAs) Is Modulated by High Risk Human Papillomavirus (HPV) Oncogenes*

    Science.gov (United States)

    Villota, Claudio; Campos, América; Vidaurre, Soledad; Oliveira-Cruz, Luciana; Boccardo, Enrique; Burzio, Verónica A.; Varas, Manuel; Villegas, Jaime; Villa, Luisa L.; Valenzuela, Pablo D. T.; Socías, Miguel; Roberts, Sally; Burzio, Luis O.

    2012-01-01

    The study of RNA and DNA oncogenic viruses has proved invaluable in the discovery of key cellular pathways that are rendered dysfunctional during cancer progression. An example is high risk human papillomavirus (HPV), the etiological agent of cervical cancer. The role of HPV oncogenes in cellular immortalization and transformation has been extensively investigated. We reported the differential expression of a family of human mitochondrial non-coding RNAs (ncRNAs) between normal and cancer cells. Normal cells express a sense mitochondrial ncRNA (SncmtRNA) that seems to be required for cell proliferation and two antisense transcripts (ASncmtRNAs). In contrast, the ASncmtRNAs are down-regulated in cancer cells. To shed some light on the mechanisms that trigger down-regulation of the ASncmtRNAs, we studied human keratinocytes (HFK) immortalized with HPV. Here we show that immortalization of HFK with HPV-16 or 18 causes down-regulation of the ASncmtRNAs and induces the expression of a new sense transcript named SncmtRNA-2. Transduction of HFK with both E6 and E7 is sufficient to induce expression of SncmtRNA-2. Moreover, the E2 oncogene is involved in down-regulation of the ASncmtRNAs. Knockdown of E2 in immortalized cells reestablishes in a reversible manner the expression of the ASncmtRNAs, suggesting that endogenous cellular factor(s) could play functions analogous to E2 during non-HPV-induced oncogenesis. PMID:22539350

  20. Expression of mitochondrial non-coding RNAs (ncRNAs) is modulated by high risk human papillomavirus (HPV) oncogenes.

    Science.gov (United States)

    Villota, Claudio; Campos, América; Vidaurre, Soledad; Oliveira-Cruz, Luciana; Boccardo, Enrique; Burzio, Verónica A; Varas, Manuel; Villegas, Jaime; Villa, Luisa L; Valenzuela, Pablo D T; Socías, Miguel; Roberts, Sally; Burzio, Luis O

    2012-06-15

    The study of RNA and DNA oncogenic viruses has proved invaluable in the discovery of key cellular pathways that are rendered dysfunctional during cancer progression. An example is high risk human papillomavirus (HPV), the etiological agent of cervical cancer. The role of HPV oncogenes in cellular immortalization and transformation has been extensively investigated. We reported the differential expression of a family of human mitochondrial non-coding RNAs (ncRNAs) between normal and cancer cells. Normal cells express a sense mitochondrial ncRNA (SncmtRNA) that seems to be required for cell proliferation and two antisense transcripts (ASncmtRNAs). In contrast, the ASncmtRNAs are down-regulated in cancer cells. To shed some light on the mechanisms that trigger down-regulation of the ASncmtRNAs, we studied human keratinocytes (HFK) immortalized with HPV. Here we show that immortalization of HFK with HPV-16 or 18 causes down-regulation of the ASncmtRNAs and induces the expression of a new sense transcript named SncmtRNA-2. Transduction of HFK with both E6 and E7 is sufficient to induce expression of SncmtRNA-2. Moreover, the E2 oncogene is involved in down-regulation of the ASncmtRNAs. Knockdown of E2 in immortalized cells reestablishes in a reversible manner the expression of the ASncmtRNAs, suggesting that endogenous cellular factor(s) could play functions analogous to E2 during non-HPV-induced oncogenesis.

  1. Incorporation of the Joule Heating of highly conducting materials into the Truchas code via an asymptotic approach

    Energy Technology Data Exchange (ETDEWEB)

    Akcay, Cihan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Haut, Terry Scot [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carlson, Neil N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-21

    The EM module of the Truchas code currently lacks the capability to model the Joule (Ohmic) heating of highly conducting materials that are inserted into induction furnaces from time to time to change the heating profile. This effect is difficult to simulate directly because of the requirement to resolve the extremely thin skin depth of good conductors, which is computationally costly. For example, copper has a skin depth, δ ~ 1 mm, for an oscillation frequency of tens of kHz. The industry is interested in determining what fraction of the heating power is lost to the Joule heating of these good conductors inserted inside the furnaces. The approach presented in this document is one of asymptotics where the leading order (unperturbed) solution is taken as that which emerges from solving the EM problem for a perfectly conducting insert. The conductor is treated as a boundary of the domain. The perturbative correction enters as a series expansion in terms of the dimensionless skin depth δ/L, where L is the characteristic size of the EM system. The correction at each order depends on the previous. This means that the leading order correction only depends on the unperturbed solution, in other words, it does not require Truchas to perform an additional EM field solve. Thus, the Joule heating can be captured by a clever leveraging of the existing tools in Truchas with only slight modifications.
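The smallness of the expansion parameter cited above follows from the standard skin-depth formula delta = sqrt(rho / (pi * mu0 * f)). The sketch below evaluates it for copper at 10 kHz (a textbook room-temperature resistivity and a frequency in the quoted range); the characteristic size L is an assumed placeholder, not a value from the report.

```python
import math

mu0 = 4.0e-7 * math.pi       # vacuum permeability, H/m
rho_cu = 1.68e-8             # copper resistivity, ohm*m (room temperature)
f = 10.0e3                   # drive frequency, Hz

delta = math.sqrt(rho_cu / (math.pi * mu0 * f))
print(f"skin depth at 10 kHz: {delta * 1e3:.2f} mm")    # ≈ 0.65 mm

L = 0.1                      # assumed characteristic size of the EM system, m
print(f"expansion parameter delta/L ≈ {delta / L:.1e}")
```

With delta/L of order 10^-3, a series in delta/L converges quickly, which is why the leading-order correction (computable from the perfect-conductor solution alone) already captures most of the Joule loss.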

  2. Adaptation of penelope Monte Carlo code system to the absorbed dose metrology: characterization of high energy photon beams and calculations of reference dosimeter correction factors

    International Nuclear Information System (INIS)

    Mazurier, J.

    1999-01-01

    This thesis was performed in the framework of establishing the national reference for absorbed dose in water for the high-energy photon beams delivered by the SATURNE-43 medical accelerator of the BNM-LPRI (acronym for National Bureau of Metrology and Primary Standard Laboratory of Ionising Radiation). The aim of this work was to develop and validate different user codes, based on the PENELOPE Monte Carlo code system, to determine the photon beam characteristics and to calculate correction factors for reference dosimeters such as Fricke dosimeters and the graphite calorimeter. In the first step, the developed user codes made it possible to study the influence of the different components of the irradiation head. Variance-reduction techniques were used to reduce the calculation time. The phase space was calculated for 6, 12 and 25 MV at the output surface of the accelerator head, then used for calculating energy spectra and dose distributions in the reference water phantom. The results obtained were compared with experimental measurements. The second step was devoted to developing a user code that calculates the correction factors associated with both the BNM-LPRI graphite and Fricke dosimeters by means of a correlated sampling method starting from the energy spectra obtained in the first step. The calculated correction factors were then compared with experimental results and with calculations performed with the EGS4 Monte Carlo code system. The good agreement between experimental and calculated results validates the simulations performed with the PENELOPE code system. (author)
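The correlated sampling idea used for the correction factors can be illustrated with a toy ratio estimate: the same random stream scores both a reference and a slightly perturbed response, so their statistical fluctuations largely cancel in the ratio. This is a generic sketch, not PENELOPE's implementation, and both response functions are invented.

```python
import math
import random

random.seed(1)

def response_ref(x):                  # invented reference response
    return math.exp(-x)

def response_mod(x):                  # invented, slightly perturbed response
    return math.exp(-1.05 * x)

n = 100_000
samples = [random.random() for _ in range(n)]     # one shared random stream
est_ref = sum(response_ref(x) for x in samples) / n
est_mod = sum(response_mod(x) for x in samples) / n
print(f"correction factor ≈ {est_mod / est_ref:.4f}")   # analytic value ≈ 0.9794
```

Because numerator and denominator share every sample, the ratio converges far faster than either integral alone — exactly what is wanted when a correction factor is a ratio of two nearly identical dose calculations.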

  3. Some questions of using the algebraic coding theory for construction of special-purpose processors in high energy physics spectrometers

    International Nuclear Information System (INIS)

    Nikityuk, N.M.

    1989-01-01

    The results of investigations into the use of algebraic coding theory for the creation of parallel encoders, majority coincidence schemes and coordinate processors for the first and second trigger levels are described. Concrete examples of the calculation and structure of special-purpose processors using the table arithmetic method are given for multiplicity t ≤ 5. The question of using parallel and sequential syndrome coding methods for the registration of events with clusters is discussed. 30 refs.; 10 figs
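The syndrome principle behind such encoders can be illustrated with the classic Hamming(7,4) code, whose parity-check matrix maps a received word to a syndrome that directly encodes the coordinate of a single set bit. This is a generic textbook illustration, not the specific processor design described above.

```python
import numpy as np

# Parity-check matrix of Hamming(7,4) with columns equal to 1..7 in binary,
# so the syndrome of a single-bit error *is* the error position.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

codeword = np.zeros(7, dtype=int)      # the all-zero word is a valid codeword
received = codeword.copy()
received[4] = 1                        # a single "hit" at position 5 (1-based)

syndrome = H @ received % 2
position = int(syndrome @ [1, 2, 4])   # read the syndrome as a binary number
print(position)                        # → 5
```

In hardware this is just a few XOR trees, which is what makes syndrome coding attractive for encoding hit coordinates at trigger-level latencies.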

  4. 3D code for simulations of fluid flows

    International Nuclear Information System (INIS)

    Skandera, D.

    2004-01-01

    In this paper, the present status of development of a new numerical code is reported. The code is intended for simulations of fluid flows. The finite volume approach is adopted for solving the standard fluid equations, which are treated in conservative form to ensure correct conservation of fluid quantities; thus, a nonlinear hyperbolic system of conservation laws is solved numerically. The code uses the Eulerian description of the fluid and is designed as a high-order central numerical scheme. The central approach employs no (approximate) Riemann solver and is less computationally expensive. The high-order WENO strategy is adopted in the reconstruction step to achieve results comparable with more accurate Riemann solvers. A combination of the central approach with iterative solution of a local Riemann problem is tested and the behaviour of this numerical flux is reported. The extension to three dimensions is implemented dimension by dimension, hence no complicated dimensional splitting needs to be introduced. The code is fully parallelized with the MPI library. Several standard hydrodynamic tests in one, two and three dimensions were performed and their results are presented. (author)
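The conservative finite-volume update at the heart of such a code can be sketched in one dimension, with a first-order upwind flux standing in for the high-order WENO reconstruction. Because cell averages are updated by flux differences, the total of the conserved quantity is preserved exactly (here for linear advection with periodic boundaries).

```python
import numpy as np

n, a = 100, 1.0                        # cells, advection speed
dx = 1.0 / n
dt = 0.5 * dx / a                      # respects the CFL condition
x = np.linspace(0.0, 1.0, n)
u = np.where(np.abs(x - 0.5) < 0.1, 1.0, 0.0)   # square pulse initial data

total_before = u.sum() * dx
for _ in range(50):
    flux = a * np.roll(u, 1)           # upwind flux at each cell's left face
    u = u - dt / dx * (np.roll(flux, -1) - flux)   # flux-difference update
total_after = u.sum() * dx

print(np.isclose(total_before, total_after))     # conservation holds, → True
```

Swapping the upwind flux for a WENO-reconstructed central flux changes the accuracy, not the conservation property — the telescoping of face fluxes is what the conservative form guarantees.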

  5. Ex Situ gene conservation in high elevation white pine species in the United States-a beginning

    Science.gov (United States)

    Richard A. Sniezko; Anna Schoettle; Joan Dunlap; Detlev Vogler; David Conklin; Andrew Bower; Chris Jensen; Rob Mangold; Doug Daoust; Gary Man

    2011-01-01

    The eight white pine species native to the western United States face an array of biotic and abiotic challenges that impact the viability of populations or the species themselves. Well-established programs are already in place to conserve and restore Pinus monticola Dougl. ex D. Don and P. lambertiana Dougl. throughout significant portions of their geographic ranges....

  6. A Highly Conserved GEQYQQLR Epitope Has Been Identified in the Nucleoprotein of Ebola Virus by Using an In Silico Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Tuhin Ali

    2015-01-01

    Full Text Available Ebola virus (EBOV) is a deadly virus that has caused several fatal outbreaks. Recently it caused another outbreak, resulting in thousands of afflicted cases. An effective and approved vaccine or therapeutic treatment against this virus is still absent. In this study, we aimed to predict B-cell epitopes from several EBOV-encoded proteins, which may aid in developing new antibody-based therapeutics or viral antigen detection methods against this virus. Multiple sequence alignment (MSA) was performed for the identification of conserved regions among the glycoprotein (GP), nucleoprotein (NP), and viral structural proteins (VP40, VP35, and VP24) of EBOV. Next, different consensus immunogenic and conserved sites were predicted from the conserved region(s) using various computational tools available in the Immune Epitope Database (IEDB). Among the GP, NP, VP40, VP35, and VP30 proteins, only NP gave a 100% conserved GEQYQQLR B-cell epitope that fulfills the ideal features of an effective B-cell epitope and could pave the way for Ebola treatment. However, successful in vivo and in vitro studies are a prerequisite to determine the actual potency of our predicted epitope and to establish it as a preventive medication against all the fatal strains of EBOV.
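A "100% conserved" linear epitope in the sense used above is simply one found verbatim in every sequence of the set. The check is a one-liner; the toy "strain" sequences below are invented, and real input would be EBOV NP sequences from an alignment.

```python
# Check whether a candidate linear epitope is present verbatim in every sequence.
def fully_conserved(epitope, sequences):
    return all(epitope in seq for seq in sequences)

# Invented toy sequences that all happen to contain the GEQYQQLR motif.
strains = [
    "MDSRPQKIWMAGEQYQQLREEPSN",
    "MDSRPQKVWMAGEQYQQLREEPTN",
    "MESRPQKIWMAGEQYQQLRDEPSN",
]
print(fully_conserved("GEQYQQLR", strains))  # → True
print(fully_conserved("GEQYQQLQ", strains))  # → False
```

In practice this substring test is applied after alignment and combined with the IEDB tools' immunogenicity scores; conservation alone does not make a good epitope.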

  7. "Toward High School Biology": Helping Middle School Students Understand Chemical Reactions and Conservation of Mass in Nonliving and Living Systems

    Science.gov (United States)

    Herrmann-Abell, Cari F.; Koppal, Mary; Roseman, Jo Ellen

    2016-01-01

    Modern biology has become increasingly molecular in nature, requiring students to understand basic chemical concepts. Studies show, however, that many students fail to grasp ideas about atom rearrangement and conservation during chemical reactions or the application of these ideas to biological systems. To help provide students with a better…

  8. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error-floor region.
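A minimal row-in/column-out block interleaver illustrates the general mechanism of placing an interleaver between component codes; the specific interleaver designs proposed in the paper are not reproduced here.

```python
# Write symbols row-major into an r x c matrix, read them out column-major.
def interleave(bits, rows, cols):
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    return interleave(bits, cols, rows)   # transposing twice restores the order

data = list(range(12))
scrambled = interleave(data, 3, 4)
print(scrambled)                          # adjacent symbols end up rows apart
assert deinterleave(scrambled, 3, 4) == data
```

Spreading adjacent symbols apart breaks up short cycles and error bursts seen by the component decoders, which is the mechanism behind the waterfall and error-floor improvements claimed above.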

  9. Identification of coding and non-coding mutational hotspots in cancer genomes.

    Science.gov (United States)

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from
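The statistical idea behind hotspot detection can be sketched with a simple binomial tail test: given a uniform background mutation rate, how surprising is the count observed in a genomic window across a cohort? All numbers below are invented for illustration; the method described above additionally uses evolutionary conservation and models non-uniform mutational processes.

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), via the complement of the lower tail."""
    return 1.0 - sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k))

rate = 2e-6                     # assumed background mutations per base per genome
window = 1_000                  # window size in bases (made up)
genomes = 1_300                 # cohort size, roughly as in the study
observed = 10                   # mutations seen in this window (made up)

p_value = binom_tail(observed, window * genomes, rate)
print(f"P(>= {observed} mutations by chance) = {p_value:.2e}")
```

A small p-value flags the window as a candidate hotspot; the hard part, as the abstract notes, is then telling selection-driven hotspots apart from windows with an elevated local mutation rate.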

  10. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  11. Inhibitory control and visuo-spatial reversibility in Piaget’s seminal number conservation task: A high-density ERP study.

    Directory of Open Access Journals (Sweden)

    Gregoire eBorst

    2013-12-01

    Full Text Available The present high-density ERP study on 13 adults aimed to determine whether number conservation relies on the ability to inhibit the overlearned length-equals-number strategy and then imagine the shortening of the row that was lengthened. Participants performed the number-conservation task and, after the EEG session, the mental imagery task. In the number-conservation task, two rows with the same number of tokens and the same length were first presented on a computer screen (COV condition) and then the tokens in one of the two rows were spread apart (INT condition). Participants were instructed to determine whether the two rows had an identical number of tokens. In the mental imagery task, two rows with different lengths but the same number of tokens were presented and participants were instructed to imagine the tokens in the longer row aligning with the tokens in the shorter row. In the number-conservation task, we found that the amplitudes of the centro-parietal N2 and fronto-central P3 were higher in the INT than in the COV condition. In addition, the differences in response times between the two conditions were correlated with the differences in the amplitudes of the fronto-central P3. In light of previous results reported on the number-conservation task in adults, the present results suggest that inhibition might be necessary to succeed in the number-conservation task in adults even when the transformation of the length of one of the rows is displayed. Finally, we also found correlations between the speed at which participants could imagine the shortening of one of the rows in the mental imagery task, the speed at which participants could determine that the two rows had the same number of tokens after the tokens in one of the rows were spread apart, and the latency of the late positive parietal component in the number-conservation task. Therefore, performing the number-conservation task might involve mental transformation processes in adults.

  12. A zebrafish screen for craniofacial mutants identifies wdr68 as a highly conserved gene required for endothelin-1 expression

    Directory of Open Access Journals (Sweden)

    Amsterdam Adam

    2006-06-01

    identification of approximately 25% of the essential genes required for craniofacial development. The identification of zebrafish models for two human disease syndromes indicates that homologs to the other genes are likely to also be relevant for human craniofacial development. The initial characterization of wdr68 suggests an important role in craniofacial development for the highly conserved Wdr68-Dyrk1 protein complexes.

  13. Functional and genetic evidence that nucleoside transport is highly conserved in Leishmania species: Implications for pyrimidine-based chemotherapy.

    Science.gov (United States)

    Alzahrani, Khalid J H; Ali, Juma A M; Eze, Anthonius A; Looi, Wan Limm; Tagoe, Daniel N A; Creek, Darren J; Barrett, Michael P; de Koning, Harry P

    2017-08-01

    Leishmania pyrimidine salvage is replete with opportunities for therapeutic intervention with enzyme inhibitors or antimetabolites. Their uptake into cells depends upon specific transporters; therefore it is essential to establish whether various Leishmania species possess similar pyrimidine transporters capable of drug uptake. Here, we report a comprehensive characterization of pyrimidine transport in L. major and L. mexicana. In both species, two transporters for uridine/adenosine were detected, one of which also transported uracil and the antimetabolites 5-fluorouracil (5-FU) and 5F,2'deoxyuridine (5F,2'dUrd), and was designated uridine-uracil transporter 1 (UUT1); the other transporter mediated uptake of adenosine, uridine, 5F,2'dUrd and thymidine and was designated Nucleoside Transporter 1 (NT1). To verify the reported L. donovani model of two NT1-like genes encoding uridine/adenosine transporters, and an NT2 gene encoding an inosine transporter, we cloned the corresponding L. major and L. mexicana genes, expressing each in T. brucei. Consistent with the L. donovani reports, the NT1-like genes of either species mediated the adenosine-sensitive uptake of [3H]-uridine but not of [3H]-inosine. Conversely, the NT2-like genes mediated uptake of [3H]-inosine but not [3H]-uridine. Among pyrimidine antimetabolites tested, 5-FU and 5F,2'dUrd were the most effective antileishmanials; resistance to both analogs was induced in L. major and L. mexicana. In each case it was found that the resistant cells had lost the transport capacity for the inducing drug. Metabolomics analysis found that the mechanism of action of 5-FU and 5F-2'dUrd was similar in both Leishmania species, with major changes in deoxynucleotide metabolism. We conclude that the pyrimidine salvage system is highly conserved in Leishmania species - essential information for the development of pyrimidine-based chemotherapy. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights

  14. Functional and genetic evidence that nucleoside transport is highly conserved in Leishmania species: Implications for pyrimidine-based chemotherapy

    Directory of Open Access Journals (Sweden)

    Khalid J.H. Alzahrani

    2017-08-01

    Full Text Available Leishmania pyrimidine salvage is replete with opportunities for therapeutic intervention with enzyme inhibitors or antimetabolites. Their uptake into cells depends upon specific transporters; therefore it is essential to establish whether various Leishmania species possess similar pyrimidine transporters capable of drug uptake. Here, we report a comprehensive characterization of pyrimidine transport in L. major and L. mexicana. In both species, two transporters for uridine/adenosine were detected, one of which also transported uracil and the antimetabolites 5-fluorouracil (5-FU) and 5F,2′deoxyuridine (5F,2′dUrd), and was designated uridine-uracil transporter 1 (UUT1); the other transporter mediated uptake of adenosine, uridine, 5F,2′dUrd and thymidine and was designated Nucleoside Transporter 1 (NT1). To verify the reported L. donovani model of two NT1-like genes encoding uridine/adenosine transporters, and an NT2 gene encoding an inosine transporter, we cloned the corresponding L. major and L. mexicana genes, expressing each in T. brucei. Consistent with the L. donovani reports, the NT1-like genes of either species mediated the adenosine-sensitive uptake of [3H]-uridine but not of [3H]-inosine. Conversely, the NT2-like genes mediated uptake of [3H]-inosine but not [3H]-uridine. Among pyrimidine antimetabolites tested, 5-FU and 5F,2′dUrd were the most effective antileishmanials; resistance to both analogs was induced in L. major and L. mexicana. In each case it was found that the resistant cells had lost the transport capacity for the inducing drug. Metabolomics analysis found that the mechanism of action of 5-FU and 5F-2′dUrd was similar in both Leishmania species, with major changes in deoxynucleotide metabolism. We conclude that the pyrimidine salvage system is highly conserved in Leishmania species - essential information for the development of pyrimidine-based chemotherapy. Keywords: Leishmania, Pyrimidine metabolism, Uracil

  15. Frequent LOH at hMLH1, a highly variable SNP in hMSH3, and negligible coding instability in ovarian cancer

    DEFF Research Database (Denmark)

    Arzimanoglou, I.I.; Hansen, L.L.; Chong, D.

    2002-01-01

    the mismatch DNA repair genes in ovarian cancer (OC), using a sensitive, accurate and reliable protocol we have developed. MATERIALS AND METHODS: A combination of high-resolution GeneScan software analysis and automated DNA cycle sequencing was used. RESULTS: Negligible coding MSI was observed in selected...

  16. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...
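The feedback-driven rate adaptation described in this record can be sketched in miniature. The toy below replaces BCH syndromes with random linear parities over GF(2), which is a simplification and not the paper's code construction: the decoder, holding correlated side information Y, requests accumulated syndrome bits of X over the feedback channel until exactly one low-weight error pattern is consistent with them.

```python
import itertools
import random

def syndrome(rows, x):
    # one parity bit per row: parity of the bits of x selected by the row mask
    return tuple(bin(r & x).count("1") & 1 for r in rows)

def candidates(y, rows, s, n, max_weight=2):
    # all x' = y ^ e, for sparse error patterns e, whose syndrome matches s
    out = []
    for w in range(max_weight + 1):
        for pos in itertools.combinations(range(n), w):
            e = sum(1 << p for p in pos)
            if syndrome(rows, y ^ e) == s:
                out.append(y ^ e)
    return out

def rate_adaptive_decode(x, y, n, seed=0):
    # the decoder sees only y; the encoder releases one more syndrome bit
    # of x per feedback round until the candidate set is a singleton
    rng = random.Random(seed)
    rows = []
    while True:
        rows.append(rng.getrandbits(n))
        s = syndrome(rows, x)            # computed at the encoder side
        cand = candidates(y, rows, s, n)
        if len(cand) == 1:
            return cand[0], len(rows)
```

With high X-Y correlation (few differing bits), far fewer than n syndrome bits typically suffice, which is the compression gain distributed source coding targets.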

  17. Development of a neutronics code based on analytic function expansion nodal method for pebble-type High Temperature Gas-cooled Reactor design

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Nam Zin; Lee, Joo Hee; Lee, Jae Jun; Yu, Hui; Lee, Gil Soo [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2006-03-15

    There is growing interest in developing Pebble Bed Reactors (PBRs) as candidates for Very High Temperature gas-cooled Reactors (VHTRs). Until now, most existing methods of nuclear design analysis for this type of reactor have been based on old finite-difference solvers or on statistical methods, and other existing nodal methods cannot be adapted to this kind of reactor because of the transverse integration problem. In this project, we developed the TOPS code in three-dimensional cylindrical geometry based on the Analytic Function Expansion Nodal (AFEN) method developed at KAIST. The TOPS code showed shorter computing times than FDM and MCNP. TOPS also showed very accurate results in reactor analysis.

  18. Development of a neutronics code based on analytic function expansion nodal method for pebble-type High Temperature Gas-cooled Reactor design

    International Nuclear Information System (INIS)

    Cho, Nam Zin; Lee, Joo Hee; Lee, Jae Jun; Yu, Hui; Lee, Gil Soo

    2006-03-01

    There is growing interest in developing Pebble Bed Reactors (PBRs) as candidates for Very High Temperature gas-cooled Reactors (VHTRs). Until now, most existing methods of nuclear design analysis for this type of reactor have been based on old finite-difference solvers or on statistical methods, and other existing nodal methods cannot be adapted to this kind of reactor because of the transverse integration problem. In this project, we developed the TOPS code in three-dimensional cylindrical geometry based on the Analytic Function Expansion Nodal (AFEN) method developed at KAIST. The TOPS code showed shorter computing times than FDM and MCNP. TOPS also showed very accurate results in reactor analysis.
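For context on the finite-difference baseline these two records compare TOPS against, the classic FDM eigenvalue calculation can be sketched for a one-group, one-dimensional slab: discretise the diffusion equation into a tridiagonal system and power-iterate on the fission source to find k-effective. The cross-section values are illustrative, not reactor data:

```python
def solve_tridiag(a, b, c, d):
    # Thomas algorithm: a = sub-, b = main, c = super-diagonal, d = rhs
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def keff_1d(D, sigma_a, nu_sigma_f, L, n=100, iters=200):
    # -D phi'' + sigma_a phi = (1/k) nu_sigma_f phi, zero flux at x=0 and x=L
    h = L / (n + 1)
    a = [-D / h**2] * n
    b = [2.0 * D / h**2 + sigma_a] * n
    c = [-D / h**2] * n
    phi, k = [1.0] * n, 1.0
    for _ in range(iters):
        src = [nu_sigma_f * p / k for p in phi]        # scaled fission source
        phi_new = solve_tridiag(a, b, c, src)
        k *= sum(phi_new) / sum(phi)                    # update eigenvalue
        phi = phi_new
    return k
```

For a bare slab this should approach the analytic one-group result k = nu_sigma_f / (sigma_a + D * (pi/L)^2), which is a convenient sanity check for any such solver.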

  19. Conservation endocrinology

    Science.gov (United States)

    McCormick, Stephen; Romero, L. Michael

    2017-01-01

    Endocrinologists can make significant contributions to conservation biology by helping to understand the mechanisms by which organisms cope with changing environments. Field endocrine techniques have advanced rapidly in recent years and can provide substantial information on the growth, stress, and reproductive status of individual animals, thereby providing insight into current and future responses of populations to changes in the environment. Environmental stressors and reproductive status can be detected nonlethally by measuring a number of endocrine-related endpoints, including steroids in plasma, living and nonliving tissue, urine, and feces. Information on the environmental or endocrine requirements of individual species for normal growth, development, and reproduction will provide critical information for species and ecosystem conservation. For many taxa, basic information on endocrinology is lacking, and advances in conservation endocrinology will require approaches that are both “basic” and “applied” and include integration of laboratory and field approaches.

  20. Multidisciplinary approach for the esophageal carcinoma with intent to conserve the esophagus centering on high-dose radiotherapy and concurrent chemotherapy

    International Nuclear Information System (INIS)

    Murakami, Masao; Kuroda, Yasumasa; Okamoto, Yoshiaki

    1997-01-01

    Forty-seven patients with operable squamous cell carcinoma of the thoracic esophagus were treated by initial concurrent chemoradiotherapy (CDDP-5 FU-44 Gy) followed by definitive high-dose radiotherapy (CRT group: 35 patients) or surgery (CRT-S group: 12 patients). The clinical CR rate was 86% in the CRT group, and the pathological CR rate was 18% in the CRT-S group. The overall median survival was 45 months, with survival at 1, 3, and 5 years of 96%, 52%, and 48%, respectively. No treatment-related mortality was observed. The rate of esophagus conservation was 66%. Our results demonstrate that a multidisciplinary approach intended to conserve the esophagus, centering on high-dose radiotherapy and concurrent chemotherapy, provides a significant improvement in both survival and quality of life in patients with operable esophageal carcinoma. (author)

  1. SHOVAV-JUEL. A one dimensional space-time kinetic code for pebble-bed high-temperature reactors with temperature and Xenon feedback

    International Nuclear Information System (INIS)

    Nabbi, R.; Meister, G.; Finken, R.; Haben, M.

    1982-09-01

    The present report describes the modelling basis and the structure of the neutron kinetics code SHOVAV-Juel. Information is given for users regarding the application of the code and the generation of the input data. SHOVAV-Juel is a one-dimensional space-time code based on a multigroup diffusion approach with four energy groups and six groups of delayed neutrons. It has been developed for the analysis of the transient behaviour of high-temperature reactors with a pebble-bed core. The reactor core is modelled as horizontal segments to which different material compositions can be assigned. The temperature dependence of the reactivity is taken into account by using temperature-dependent neutron cross sections. For the simulation of transients over an extended time range, the time dependence of reactivity absorption by xenon-135 is taken into account. (orig./RW)
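The space-time multigroup model itself is beyond a short example, but its zero-dimensional limit, point kinetics with six delayed-neutron groups, captures the time behaviour that SHOVAV-Juel extends spatially. Below is a sketch using textbook six-group delayed-neutron data for thermal fission; the generation time and step size are illustrative choices, not values from the code:

```python
# six-group delayed-neutron data (textbook thermal-fission values)
BETAS   = [0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273]
LAMBDAS = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]   # decay constants, 1/s
BETA = sum(BETAS)
GEN_TIME = 1.0e-4   # mean neutron generation time, s (illustrative)

def point_kinetics(rho, t_end, dt=1.0e-3):
    """Integrate n(t) and precursors C_i(t) from equilibrium, explicit Euler."""
    n = 1.0
    c = [b / (GEN_TIME * lam) for b, lam in zip(BETAS, LAMBDAS)]  # equilibrium
    for _ in range(int(round(t_end / dt))):
        dn = ((rho - BETA) / GEN_TIME) * n + sum(l * ci for l, ci in zip(LAMBDAS, c))
        dc = [(b / GEN_TIME) * n - l * ci for b, l, ci in zip(BETAS, LAMBDAS, c)]
        n += dt * dn
        c = [ci + dt * d for ci, d in zip(c, dc)]
    return n
```

At zero reactivity the equilibrium is preserved exactly; a small positive step well below prompt critical produces the familiar prompt jump followed by a slow rise on the delayed-neutron timescale.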

  2. The use of plane parallel ionization chambers in high energy electron and photon beams. An international code of practice for dosimetry

    International Nuclear Information System (INIS)

    1997-01-01

    Research on plane-parallel ionization chambers since the IAEA code of practice (TRS-277) was published in 1987 has expanded our knowledge of perturbation and other correction factors in ionization chambers, and constructional details of these chambers have been shown to be important. Different countries have published, or are in the process of publishing, dosimetry recommendations which include specific procedures for the use of plane-parallel ionization chambers. An international working group was formed under the auspices of the IAEA, first to review the status and actual validity of the code of practice, and second to develop an international code of practice for the use of plane-parallel ionization chambers in high energy electron and photon beams used in radiotherapy. This document fulfills the second task. 153 refs, 21 figs, 18 tabs

  3. User's manual for ASTERIX-2: A two-dimensional modular code system for the steady state and xenon transient analysis of a pebble bed high temperature reactor

    International Nuclear Information System (INIS)

    Wu, T.; Cowan, C.L.; Lauer, A.; Schwiegk, H.J.

    1982-03-01

    The ASTERIX modular code package was developed at KFA Laboratory-Juelich for the steady state and xenon transient analysis of a pebble bed high temperature reactor. The code package was implemented on the Stanford Linear Accelerator Center computer in August 1980, and a user's manual for the current version of the code, identified as ASTERIX-2, was prepared as a cooperative effort by KFA Laboratory and GE-ARSD. The material in the manual includes the requirements for accessing the program, a description of the major subroutines, a listing of the input options, and a listing of the input data for a sample problem. The material is provided in sufficient detail for the user to carry out a wide range of analyses, from steady state operations to xenon-induced power transients in which the local xenon, temperature, buckling, and control feedback effects have been incorporated in the problem solution. (orig.)

  4. New insights into the Lake Chad Basin population structure revealed by high-throughput genotyping of mitochondrial DNA coding SNPs.

    Directory of Open Access Journals (Sweden)

    María Cerezo

    Full Text Available BACKGROUND: Located in the Sudan belt, the Chad Basin forms a remarkable ecosystem, where several unique agricultural and pastoral techniques have been developed. Both from an archaeological and a genetic point of view, this region has been interpreted as the center of a bidirectional corridor connecting West and East Africa, as well as a meeting point for populations coming from North Africa through the Saharan desert. METHODOLOGY/PRINCIPAL FINDINGS: Samples from twelve ethnic groups from the Chad Basin (n = 542) have been high-throughput genotyped for 230 coding region mitochondrial DNA (mtDNA) Single Nucleotide Polymorphisms (mtSNPs) using Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight (MALDI-TOF) mass spectrometry. This set of mtSNPs allowed for much better phylogenetic resolution than previous studies of this geographic region, enabling new insights into its population history. Notable haplogroup (hg) heterogeneity has been observed in the Chad Basin, mirroring the different demographic histories of these ethnic groups. As estimated using a Bayesian framework, nomadic populations showed negative growth which was not always correlated to their estimated effective population sizes. Nomads also showed lower diversity values than sedentary groups. CONCLUSIONS/SIGNIFICANCE: Compared to sedentary populations, nomads showed signals of stronger genetic drift occurring in their ancestral populations. These populations, however, retained more haplotype diversity in their hypervariable segments I (HVS-I), but not their mtSNPs, suggesting a more ancestral ethnogenesis. Whereas the nomadic populations showed a higher Mediterranean influence, signaled mainly by sub-lineages of M1, R0, U6, and U5, the other populations showed a more consistent sub-Saharan pattern. Although lifestyle may have an influence on diversity patterns and hg composition, analysis of molecular variance has not identified these differences. The present study indicates that

  5. Highly transparent poly(2-ethyl-2-oxazoline)-TiO2 nanocomposite coatings for the conservation of matte painted artworks

    OpenAIRE

    Colombo, A.; Gherardi, Francesca; Goidanich, S.; Delaney, J. K.; de la Rie, E. R.; Ubaldi, M. C.; Toniolo, L.; Simonutti, R.

    2015-01-01

    A nanocomposite coating based on TiO2 nanoparticles and poly(2-ethyl-2-oxazoline) is used as a consolidant for matte painted surfaces (temperas, watercolors, modern paintings). The aim of this work is to provide advances in the conservation of these works of art while preserving their optical appearance, in terms of colour and gloss. Fiber Optic Reflectance Spectroscopy (FORS) measurements of a painting-model (an acrylic black monochrome) treated with the nanocomposite coatings revealed that it...

  6. On the possibility of a place code for the low pitch of high-frequency complex tones

    DEFF Research Database (Denmark)

    Santurette, Sébastien; Dau, Torsten; Oxenham, Andrew J.

    2012-01-01

    on pitch matches, and (3) listeners’ ability to hear out the individual components. No effects of relative component phase or dichotic presentation on pitch matches were found in the tested conditions. Large individual differences were found in listeners’ ability to hear out individual components. Overall......, the results are consistent with the coding of individual harmonic frequencies, based on the tonotopic activity pattern or phase locking to individual harmonics, rather than with temporal coding of single-channel interactions. However, they are also consistent with more general temporal theories of pitch...

  7. DNA analysis indicates that Asian elephants are native to Borneo and are therefore a high priority for conservation.

    Directory of Open Access Journals (Sweden)

    Prithiviraj Fernando

    2003-10-01

    Full Text Available The origin of Borneo's elephants is controversial. Two competing hypotheses argue that they are either indigenous, tracing back to the Pleistocene, or were introduced, descending from elephants imported in the 16th-18th centuries. Taxonomically, they have either been classified as a unique subspecies or placed under the Indian or Sumatran subspecies. If shown to be a unique indigenous population, this would extend the natural species range of the Asian elephant by 1300 km, and therefore Borneo elephants would have much greater conservation importance than if they were a feral population. We compared DNA of Borneo elephants to that of elephants from across the range of the Asian elephant, using a fragment of mitochondrial DNA, including part of the hypervariable d-loop, and five autosomal microsatellite loci. We find that Borneo's elephants are genetically distinct, with molecular divergence indicative of a Pleistocene colonisation of Borneo and subsequent isolation. We reject the hypothesis that Borneo's elephants were introduced. The genetic divergence of Borneo elephants warrants their recognition as a separate evolutionarily significant unit. Thus, interbreeding Borneo elephants with those from other populations would be contraindicated in ex situ conservation, and their genetic distinctiveness makes them one of the highest priority populations for Asian elephant conservation.

  8. Strategies for measuring evolutionary conservation of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Hofacker Ivo L

    2008-02-01

    Full Text Available Abstract Background Evolutionary conservation of RNA secondary structure is a typical feature of many functional non-coding RNAs. Since almost all of the available methods used for prediction and annotation of non-coding RNA genes rely on this evolutionary signature, accurate measures for structural conservation are essential. Results We systematically assessed the ability of various measures to detect conserved RNA structures in multiple sequence alignments. We tested three existing and eight novel strategies that are based on metrics of folding energies, metrics of single optimal structure predictions, and metrics of structure ensembles. We find that the folding-energy-based SCI score used in the RNAz program and a simple base-pair distance metric are by far the most accurate. The use of more complex metrics, such as tree editing, does not improve performance. A variant of the SCI performed particularly well on highly conserved alignments and is thus a viable alternative when only little evolutionary information is available. Surprisingly, ensemble-based methods that, in principle, could benefit from the additional information contained in sub-optimal structures, perform particularly poorly. As a general trend, we observed that methods that include a consensus structure prediction outperformed equivalent methods that only consider pairwise comparisons. Conclusion Structural conservation can be measured accurately with relatively simple and intuitive metrics. They have the potential to form the basis of future RNA gene finders, which face new challenges like finding lineage-specific structures or detecting mis-aligned sequences.
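The simple base-pair distance metric the abstract singles out is easy to state concretely: parse each dot-bracket structure into its set of base pairs and count the pairs present in one structure but not the other. This is a minimal sketch of the metric only; tools like RNAz operate on full alignments:

```python
def parse_dotbracket(structure):
    # return the set of (i, j) base pairs encoded by a dot-bracket string
    stack, pairs = [], set()
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            pairs.add((stack.pop(), i))
    return pairs

def basepair_distance(s1, s2):
    # size of the symmetric difference of the two base-pair sets
    return len(parse_dotbracket(s1) ^ parse_dotbracket(s2))
```

For example, "((..))" and "(....)" share the outer pair but differ in the inner one, giving a distance of 1; averaging such pairwise distances over an alignment yields a structural-conservation score.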

  9. Updating of ASME Nuclear Code Case N-201 to Accommodate the Needs of Metallic Core Support Structures for High Temperature Gas Cooled Reactors Currently in Development

    International Nuclear Information System (INIS)

    Basol, Mit; Kielb, John F.; MuHooly, John F.; Smit, Kobus

    2007-01-01

    On September 29, 2005, ASME Standards Technology, LLC (ASME ST-LLC) executed a multi-year cooperative agreement with the United States DOE for the Generation IV Reactor Materials project. The project's objective is to update and expand appropriate materials, construction, and design codes for application in future Generation IV nuclear reactor systems that operate at elevated temperatures. Task 4 was embarked upon in recognition of the large number of ongoing reactor designs utilizing high temperature technology. Since Code Case N-201 had not seen a significant revision since December 1994 (except for a minor revision in September 2006 to change the SA-336 forging reference for 304SS and 316SS to SA-965 in Tables 1.2(a) and 1.2(b), and some minor editorial changes), identifying recommended updates to support the current high temperature Core Support Structure (CSS) designs and potential new designs was important. As anticipated, the Task 4 effort identified a number of Code Case N-201 issues. Items requiring further consideration range from addressing apparent inconsistencies in definitions and certain material properties between CC-N-201 and Subsection NH, to inclusion of additional materials to provide the designer more flexibility of design. Task 4 developed a design parameter survey that requested input from the CSS designers of ongoing high temperature gas cooled reactor metallic core support designs. The responses to the survey provided Task 4 valuable input to identify the design operating parameters and future needs of the CSS designers. Types of materials, metal temperature, time of exposure, design pressure, design life, and fluence levels were included in the Task 4 survey responses. The results of the survey are included in this report. This work shows that additional effort must be done to update Code Case N-201. Task 4 activities provide the framework for the Code Case N-201 update and future work to provide input on materials. Candidate

  10. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, both code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
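The annotation burden described here can be illustrated with a toy: certifying even a trivial loop requires a precondition, a loop invariant, and a postcondition. Below the proof obligations are written as runtime assertions, purely as an illustrative stand-in for the first-order-logic annotations a verification condition generator would consume, not as AUTOBAYES output:

```python
def sum_upto(n):
    assert n >= 0                          # precondition
    total, i = 0, 0
    while i <= n:
        assert total == i * (i - 1) // 2   # loop invariant: sum of 0..i-1
        total += i
        i += 1
    assert total == n * (n + 1) // 2       # postcondition
    return total
```

A certifying synthesizer's job is to emit the invariant automatically alongside the loop; without it, the postcondition cannot be proved mechanically.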

  11. The selective conservative management of small traumatic pneumothoraces following stab injuries is safe: experience from a high-volume trauma service in South Africa.

    Science.gov (United States)

    Kong, V Y; Oosthuizen, G V; Clarke, D L

    2015-02-01

    The selective conservative management of small pneumothoraces (PTXs) following stab injuries is controversial. We reviewed a cohort of patients managed conservatively in a high volume trauma service in South Africa. A retrospective review over a 2-year period identified 125 asymptomatic patients with small PTXs, measuring less than 2 cm on chest radiograph, who were managed conservatively. Of the 125 patients included in the study, 92% were male (115/125), and the median age for all patients was 21 years (19-24). Ninety-seven per cent (121/125) of the weapons involved were knives, and 3% (4/125) were screwdrivers. Sixty-one per cent of all injuries were on the left side. Eighty-two per cent (102/125) sustained a single stab, and 18% (23/125) had multiple stabs. Thirty-nine per cent (49/125) had a PTX <0.5 cm (Group A), 26% (32/125) were ≥ 0.5 to <1 cm (Group B), 19% (24/125) were ≥ 1 to <1.5 cm (Group C) and 15% (20/125) were ≥ 1.5 to <2 cm (Group D). Three per cent of all patients (4/125) eventually required intercostal chest drains (ICDs): one in Group C, three in Group D. All four patients had ICDs in situ for 24 h. The remaining 97% (121/125) were all managed successfully by active clinical observation alone. There were no subsequent readmissions, morbidity or mortality as a direct result of our conservative approach. The selective conservative management of asymptomatic small PTXs from stab injuries is safe if undertaken in the appropriate setting.

  12. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  13. [Conservation Units.

    Science.gov (United States)

    Texas Education Agency, Austin.

    Each of the six instructional units deals with one aspect of conservation: forests, water, rangeland, minerals (petroleum), and soil. The area of the elementary school curriculum with which each correlates is indicated. Lists of general and specific objectives are followed by suggested teaching procedures, including ideas for introducing the…

  14. Creative conservation

    NARCIS (Netherlands)

    Bentham, Roelof J.

    1968-01-01

    The increasing exploitation of our natural resources, the unlimited occupation of ever more new areas, and the intensification of land-use, make it necessary for us to expand the concept of conservation. But we also need to reconsider that concept itself. For the changing conditions in the

  15. Reshaping conservation

    DEFF Research Database (Denmark)

    Funder, Mikkel; Danielsen, Finn; Ngaga, Yonika

    2013-01-01

    members strengthen the monitoring practices to their advantage, and to some extent move them beyond the reach of government agencies and conservation and development practitioners. This has led to outcomes that are of greater social and strategic value to communities than the original 'planned' benefits...

  16. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
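
    The low field size operations mentioned in this abstract can be illustrated with a toy random linear network coding sketch over GF(2), where combining and decoding reduce to XOR arithmetic and Gauss-Jordan elimination. This is a deliberate simplification, not the Fulcrum construction itself: the high field size expansion and the mapping between field sizes are omitted, and all function names are illustrative.

```python
import random

def xor_combine(coeffs, packets):
    """XOR together the packets whose GF(2) coefficient is 1."""
    acc = bytearray(len(packets[0]))
    for c, p in zip(coeffs, packets):
        if c:
            for i, b in enumerate(p):
                acc[i] ^= b
    return bytes(acc)

def gf2_encode(packets, n_coded, rng):
    """Emit n_coded random GF(2) combinations of the source packets."""
    k = len(packets)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[rng.randrange(k)] = 1  # skip the useless all-zero combination
        coded.append((coeffs, xor_combine(coeffs, packets)))
    return coded

def gf2_decode(coded, k):
    """Gauss-Jordan elimination over GF(2); returns the k source packets,
    or None if the received combinations do not span all k dimensions."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    pivot_of = {}  # column -> index of its pivot row
    for col in range(k):
        pr = next((i for i, (c, _) in enumerate(rows)
                   if c[col] and i not in pivot_of.values()), None)
        if pr is None:
            return None  # rank deficient: need more coded packets
        pivot_of[col] = pr
        pc, pp = rows[pr]
        for i, (c, p) in enumerate(rows):
            if i != pr and c[col]:
                for j in range(k):
                    c[j] ^= pc[j]
                for j in range(len(p)):
                    p[j] ^= pp[j]
    return [bytes(rows[pivot_of[col]][1]) for col in range(k)]
```

    Decoding in a higher field (GF(2^8) and beyond) follows the same elimination pattern with field multiplication in place of XOR, which is exactly the computational cost a low-end Fulcrum receiver avoids.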

  17. Global Ionosphere Mapping and Differential Code Bias Estimation during Low and High Solar Activity Periods with GIMAS Software

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2018-05-01

    Full Text Available Ionosphere research using Global Navigation Satellite System (GNSS) techniques is a hot topic, given their unprecedented high temporal and spatial sampling rate. We introduced a new GNSS Ionosphere Monitoring and Analysis Software (GIMAS) in order to model the global ionosphere vertical total electron content (VTEC) maps and to estimate the GPS and GLObalnaya NAvigatsionnaya Sputnikovaya Sistema (GLONASS) satellite and receiver differential code biases (DCBs). The GIMAS-based Global Ionosphere Map (GIM) products during low (day of year from 202 to 231, in 2008) and high (day of year from 050 to 079, in 2014) solar activity periods were investigated and assessed. The results showed that the biases of the GIMAS-based VTEC maps relative to the International GNSS Service (IGS) Ionosphere Associate Analysis Centers (IAACs) VTEC maps ranged from −3.0 to 1.0 TECU (TEC units; 1 TECU = 1 × 10¹⁶ electrons/m²). The standard deviations (STDs) ranged from 0.7 to 1.9 TECU in 2008, and from 2.0 to 8.0 TECU in 2014. The STDs at a low latitude were significantly larger than those at middle and high latitudes, as a result of the ionospheric latitudinal gradients. When compared with the Jason-2 VTEC measurements, the GIMAS-based VTEC maps showed a negative systematic bias of about −1.8 TECU in 2008, and a positive systematic bias of about +2.2 TECU in 2014. The STDs were about 2.0 TECU in 2008, and ranged from 2.2 to 8.5 TECU in 2014. Furthermore, the aforementioned characteristics were strongly related to the conditions of the ionosphere variation and the geographic latitude. The GPS and GLONASS satellite and receiver P1-P2 DCBs were compared with the IAACs DCBs. The root mean squares (RMSs) were 0.16–0.20 ns in 2008 and 0.13–0.25 ns in 2014 for the GPS satellites, and 0.26–0.31 ns in 2014 for the GLONASS satellites. The RMSs of receiver DCBs were 0.21–0.42 ns in 2008 and 0.33–1.47 ns in 2014 for GPS, and 0.67–0.96 ns in 2014 for GLONASS. The monthly
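
    The bias and STD figures quoted in this abstract are ordinary grid statistics over the difference of two VTEC maps. A minimal sketch, assuming two equal-sized 2-D grids of VTEC values in TECU (the inputs and function name are hypothetical, not part of GIMAS):

```python
import math

def vtec_bias_std(map_a, map_b):
    """Bias (mean difference) and standard deviation, in TECU, between
    two gridded VTEC maps supplied as equal-sized 2-D lists."""
    diffs = [a - b
             for row_a, row_b in zip(map_a, map_b)
             for a, b in zip(row_a, row_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    std = math.sqrt(sum((d - bias) ** 2 for d in diffs) / n)
    return bias, std
```

    A negative bias, as in the 2008 Jason-2 comparison, simply means the first map is systematically below the reference.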

  18. Restricted-range fishes and the conservation of Brazilian freshwaters.

    Directory of Open Access Journals (Sweden)

    Cristiano Nogueira

    Full Text Available BACKGROUND: Freshwaters are the most threatened ecosystems on earth. Although recent assessments provide data on global priority regions for freshwater conservation, local scale priorities remain unknown. Refining the scale of global biodiversity assessments (both at terrestrial and freshwater realms) and translating these into conservation priorities on the ground remains a major challenge to biodiversity science, and depends directly on species occurrence data of high taxonomic and geographic resolution. Brazil harbors the richest freshwater ichthyofauna in the world, but knowledge on endemic areas and conservation in Brazilian rivers is still scarce. METHODOLOGY/PRINCIPAL FINDINGS: Using data on environmental threats and revised species distribution data we detect and delineate 540 small watershed areas harboring 819 restricted-range fishes in Brazil. Many of these areas are already highly threatened, as 159 (29%) watersheds have lost more than 70% of their original vegetation cover, and only 141 (26%) show significant overlap with formally protected areas or indigenous lands. We detected 220 (40%) critical watersheds overlapping hydroelectric dams or showing both poor formal protection and widespread habitat loss; these sites harbor 344 endemic fish species that may face extinction if no conservation action is in place in the near future. CONCLUSIONS/SIGNIFICANCE: We provide the first analysis of site-scale conservation priorities in the richest freshwater ecosystems of the globe. Our results corroborate the hypothesis that freshwater biodiversity has been neglected in former conservation assessments. The study provides a simple and straightforward method for detecting freshwater priority areas based on endemism and threat, and represents a starting point for integrating freshwater and terrestrial conservation in representative and biogeographically consistent site-scale conservation strategies, that may be scaled-up following naturally linked drainage systems.
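
    The screening rule this abstract describes (dam overlap, or poor formal protection combined with loss of more than 70% of vegetation cover) is simple enough to state directly in code. A sketch under assumed inputs: the field names `dam_overlap`, `protected_overlap`, and `vegetation_loss` are hypothetical, not taken from the study's data.

```python
def critical_watersheds(watersheds):
    """Return the watersheds flagged as critical: those overlapping a
    hydroelectric dam, or combining poor formal protection with loss of
    more than 70% of original vegetation cover."""
    return [w for w in watersheds
            if w["dam_overlap"]
            or (not w["protected_overlap"] and w["vegetation_loss"] > 0.70)]
```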

  19. Restricted-range fishes and the conservation of Brazilian freshwaters.

    Science.gov (United States)

    Nogueira, Cristiano; Buckup, Paulo A; Menezes, Naercio A; Oyakawa, Osvaldo T; Kasecker, Thais P; Ramos Neto, Mario B; da Silva, José Maria C

    2010-06-30

    Freshwaters are the most threatened ecosystems on earth. Although recent assessments provide data on global priority regions for freshwater conservation, local scale priorities remain unknown. Refining the scale of global biodiversity assessments (both at terrestrial and freshwater realms) and translating these into conservation priorities on the ground remains a major challenge to biodiversity science, and depends directly on species occurrence data of high taxonomic and geographic resolution. Brazil harbors the richest freshwater ichthyofauna in the world, but knowledge on endemic areas and conservation in Brazilian rivers is still scarce. Using data on environmental threats and revised species distribution data we detect and delineate 540 small watershed areas harboring 819 restricted-range fishes in Brazil. Many of these areas are already highly threatened, as 159 (29%) watersheds have lost more than 70% of their original vegetation cover, and only 141 (26%) show significant overlap with formally protected areas or indigenous lands. We detected 220 (40%) critical watersheds overlapping hydroelectric dams or showing both poor formal protection and widespread habitat loss; these sites harbor 344 endemic fish species that may face extinction if no conservation action is in place in the near future. We provide the first analysis of site-scale conservation priorities in the richest freshwater ecosystems of the globe. Our results corroborate the hypothesis that freshwater biodiversity has been neglected in former conservation assessments. The study provides a simple and straightforward method for detecting freshwater priority areas based on endemism and threat, and represents a starting point for integrating freshwater and terrestrial conservation in representative and biogeographically consistent site-scale conservation strategies, that may be scaled-up following naturally linked drainage systems. Proper management (e.g. forestry code enforcement, landscape

  20. Integrating conservation costs into sea level rise adaptive conservation prioritization

    Directory of Open Access Journals (Sweden)

    Mingjian Zhu

    2015-07-01

    Full Text Available Biodiversity conservation requires strategic investment, as resources for conservation are often limited. As sea level rises, it is necessary to consider both sea level rise and costs in conservation decision making. In this study, we consider costs of conservation in an integrated modeling process that incorporates a geomorphological model (SLAMM), species habitat models, and conservation prioritization (Zonation) to identify conservation priorities in the face of landscape dynamics due to sea level rise in the Matanzas River basin of northeast Florida. Conservation priorities change significantly once land costs enter the analysis: some areas with high conservation value drop to lower priority, while some areas with low conservation value rise to high priority. This research could help coastal resources managers make informed decisions about where and how to allocate conservation resources more wisely to facilitate biodiversity adaptation to sea level rise.
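
    The reshuffling effect of adding costs can be mimicked with a toy ranking: ordering sites by conservation value alone versus value per unit cost. This is a deliberate simplification (Zonation's actual algorithm is iterative cell removal driven by complementarity, not a ratio sort), and all field names below are hypothetical.

```python
def rank_sites(sites, use_cost=False):
    """Rank candidate sites, highest priority first. With use_cost=True the
    ranking uses conservation value per unit cost, so a cheap, moderately
    valuable site can outrank an expensive, high-value one."""
    key = (lambda s: s["value"] / s["cost"]) if use_cost else (lambda s: s["value"])
    return [s["name"] for s in sorted(sites, key=key, reverse=True)]
```

    With two sites, A (value 10, cost 100) and B (value 8, cost 20), the value-only ranking puts A first, while the cost-aware ranking puts B first, mirroring the priority shifts the study reports.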