WorldWideScience

Sample records for highly conserved coding

  1. Highly conserved non-coding sequences are associated with vertebrate development.

    Directory of Open Access Journals (Sweden)

    Adam Woolfe

    2005-01-01

    In addition to protein-coding sequence, the human genome contains a significant amount of regulatory DNA, the identification of which is proving somewhat recalcitrant to both in silico and functional methods. An approach that has been used with some success is comparative sequence analysis, whereby equivalent genomic regions from different organisms are compared in order to identify both similarities and differences. In general, similarities in sequence between highly divergent organisms imply functional constraint. We have used a whole-genome comparison between humans and the pufferfish, Fugu rubripes, to identify nearly 1,400 highly conserved non-coding sequences. Given the evolutionary divergence between these species, it is likely that these sequences are found in, and furthermore are essential to, all vertebrates. Most, and possibly all, of these sequences are located in and around genes that act as developmental regulators. Some of these sequences are over 90% identical across more than 500 bases, being more highly conserved than coding sequence between these two species. Despite this, we cannot find any similar sequences in invertebrate genomes. In order to begin to functionally test this set of sequences, we have used a rapid in vivo assay system using zebrafish embryos that allows tissue-specific enhancer activity to be identified. Functional data are presented for highly conserved non-coding sequences associated with four unrelated developmental regulators (SOX21, PAX6, HLXB9, and SHH) in order to demonstrate the suitability of this screen for a wide range of genes and expression patterns. Of 25 sequence elements tested around these four genes, 23 show significant enhancer activity in one or more tissues. We have identified a set of non-coding sequences that are highly conserved throughout vertebrates. They are found in clusters across the human genome, principally around genes that are implicated in the regulation of development.

  2. Dissecting the transcriptional regulatory properties of human chromosome 16 highly conserved non-coding regions.

    Directory of Open Access Journals (Sweden)

    José Luis Royo

    Non-coding DNA conservation across species has often been used as a predictor of transcriptional enhancer activity. However, only a few systematic analyses of the function of these highly conserved non-coding regions (HCNRs) have been performed. Here we use zebrafish transgenic assays to perform a systematic study of 113 HCNRs from human chromosome 16. By comparing transient and stable transgenesis, we show that the first method is highly inefficient, leading to 40% false positives and 20% false negatives. When analyzed in stable transgenic lines, a great majority of HCNRs were active in the central nervous system, although some of them drove expression in other organs such as the eye and the excretory system. Finally, by testing a fraction of the HCNRs lacking enhancer activity for in vivo insulator activity, we find that 20% of them may contain enhancer-blocking function. Altogether, our data indicate that HCNRs may contain different types of cis-regulatory activity, including enhancers and insulators, as well as other functions not yet discovered.

  3. Accurate discrimination of conserved coding and non-coding regions through multiple indicators of evolutionary dynamics

    Directory of Open Access Journals (Sweden)

    Pesole Graziano

    2009-09-01

    Background: The conservation of sequences between related genomes has long been recognised as an indication of functional significance, and recognition of sequence homology is one of the principal approaches used in the annotation of newly sequenced genomes. In the context of recent findings that the number of non-coding transcripts in higher organisms is likely to be much higher than previously imagined, discrimination between conserved coding and non-coding sequences is a topic of considerable interest. Additionally, it is desirable to discriminate between coding and non-coding conserved sequences without recourse to sequence similarity searches of protein databases, as such approaches exclude the identification of novel conserved proteins without characterized homologs and may be influenced by the presence in databases of sequences that are erroneously annotated as coding. Results: Here we present a machine learning-based approach for the discrimination of conserved coding sequences. Our method calculates various statistics related to the evolutionary dynamics of two aligned sequences. These features are considered by a Support Vector Machine which designates the alignment as coding or non-coding with an associated probability score. Conclusion: We show that our approach is both sensitive and accurate with respect to comparable methods and illustrate several situations in which it may be applied, including the identification of conserved coding regions in genome sequences and the discrimination of coding from non-coding cDNA sequences.
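    One family of statistics such a classifier can exploit is the asymmetry of substitution rates across codon positions: in conserved coding DNA, third positions diverge fastest because of synonymous "wobble" changes, a signal absent from non-coding conservation. The sketch below is illustrative only (the paper's actual feature set and alignment handling are richer); the resulting rates could then be passed as features to an SVM.

```python
def codon_position_divergence(seq_a, seq_b):
    """Per-codon-position mismatch rates for an ungapped pairwise
    alignment. Elevated third-position divergence relative to the
    first two positions is a classic coding-region signal."""
    assert len(seq_a) == len(seq_b) and len(seq_a) % 3 == 0
    mismatches, counts = [0, 0, 0], [0, 0, 0]
    for i, (a, b) in enumerate(zip(seq_a, seq_b)):
        counts[i % 3] += 1
        if a != b:
            mismatches[i % 3] += 1
    return [m / c for m, c in zip(mismatches, counts)]

# Toy alignment with substitutions only at third codon positions:
rates = codon_position_divergence("ATGGCTAGAGTTCTGAAA",
                                  "ATGGCAAGGGTCCTCAAA")
```

In a full pipeline, feature vectors like `rates` for many alignments would be used to train the SVM that emits the coding/non-coding label and probability.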

  4. Conservation of concrete structures in fib model code 2010

    NARCIS (Netherlands)

    Matthews, S.L.; Ueda, T.; Bigaj-van Vliet, A.

    2012-01-01

    Chapter 9: Conservation of concrete structures forms part of fib Model Code 2010, the first draft of which was published for comment as fib Bulletins 55 and 56 (fib 2010). Numerous comments were received and considered by fib Special Activity Group 5 responsible for the preparation of fib Model Code

  6. Model code for energy conservation in new building construction

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    In response to the recognized lack of existing consensus standards directed to the conservation of energy in building design and operation, such a standard was prepared and published with the issuance of ASHRAE Standard 90-75, "Energy Conservation in New Building Design," by the American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc., in 1975. This standard addressed recommended practices for energy conservation, using both depletable and non-depletable sources. A model code for energy conservation in building construction has been developed, setting forth the minimum regulations found necessary to mandate such conservation. The code addresses administration, design criteria, systems elements, controls, service water heating, and electrical distribution and use, for both depletable and non-depletable energy sources. The technical provisions of the document are based on ASHRAE 90-75, and it is intended for use by state and local building officials in the implementation of a statewide energy conservation program.

  7. High performance scalable image coding

    Institute of Scientific and Technical Information of China (English)

    Gan Tao; He Yanmin; Zhu Weile

    2007-01-01

    A high-performance scalable image coding algorithm is proposed. The salient features of this algorithm are the ways it forms and locates the significant clusters. Thanks to the list structure, the new coding algorithm achieves fine fractional bit-plane coding with negligible additional complexity. Experiments show that it performs comparably to or better than state-of-the-art coders. Furthermore, the flexible codec supports both quality and resolution scalability, which is very attractive in many network applications.
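    Bit-plane coding is what makes quality scalability possible: the encoder emits coefficient bits from the most significant plane downward, so any prefix of the stream decodes to a progressively coarser approximation. A minimal sketch of the idea (not the paper's cluster-based algorithm; `nbits` and the coefficient values are illustrative):

```python
def encode_bitplanes(coeffs, nbits=8):
    """Emit bit-planes MSB-first: plane k holds bit (nbits-1-k) of each coefficient."""
    return [[(c >> k) & 1 for c in coeffs] for k in range(nbits - 1, -1, -1)]

def decode_bitplanes(planes, nbits=8):
    """Reconstruct from however many planes were received (quality scalability)."""
    coeffs = [0] * len(planes[0])
    for idx, plane in enumerate(planes):
        shift = nbits - 1 - idx
        for i, bit in enumerate(plane):
            coeffs[i] |= bit << shift
    return coeffs

coeffs = [200, 13, 97, 5]
planes = encode_bitplanes(coeffs)
full = decode_bitplanes(planes)        # all 8 planes: lossless
coarse = decode_bitplanes(planes[:4])  # top 4 planes only: coarse approximation
```

Truncating the plane list models truncating the embedded bitstream at a lower rate.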

  8. A Very Fast and Momentum-Conserving Tree Code

    CERN Document Server

    Dehnen, W

    2000-01-01

    The tree code for the approximate evaluation of gravitational forces is extended and substantially accelerated by including mutual cell-cell interactions. These are computed by a Taylor series in Cartesian coordinates and in a completely symmetric fashion, such that Newton's third law is satisfied by construction and hence momentum is exactly conserved. The computational effort is further reduced by exploiting the mutual symmetry of the interactions. For typical astrophysical problems with N = 10^5 and at the same level of accuracy, the new code is about four times faster than the tree code. For large N, the computational costs are found to scale almost linearly with N, which can also be supported by a theoretical argument, and the advantage over the tree code increases with ever larger N.
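    The key idea, exact momentum conservation by construction, can be illustrated at the particle level: if each pair interaction is evaluated once and applied with opposite signs to both bodies, the forces sum to zero regardless of approximation error in any single interaction. The direct-summation sketch below shows this (the softening `eps` and all values are illustrative; the paper applies the same symmetry to cell-cell Taylor-series interactions):

```python
def accumulate_forces(pos, masses, G=1.0, eps=1e-3):
    """Direct-summation gravity with mutual (symmetric) interactions:
    each pair is visited once and the force enters with opposite signs
    for the two bodies, so Newton's third law, and hence conservation
    of total momentum, holds by construction."""
    n = len(pos)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(c * c for c in d) + eps * eps  # softened distance squared
            f = G * masses[i] * masses[j] / r2 ** 1.5
            for k in range(3):
                forces[i][k] += f * d[k]
                forces[j][k] -= f * d[k]            # equal and opposite
    return forces

forces = accumulate_forces(
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.3, 0.7, 0.2]],
    [1.0, 2.0, 0.5])
```

The total force (and hence the rate of change of total momentum) vanishes to machine precision, independent of the accuracy of each pairwise term.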

  9. Variation in conserved non-coding sequences on chromosome 5q and susceptibility to asthma and atopy

    Energy Technology Data Exchange (ETDEWEB)

    Donfack, Joseph; Schneider, Daniel H.; Tan, Zheng; Kurz,Thorsten; Dubchak, Inna; Frazer, Kelly A.; Ober, Carole

    2005-09-10

    Background: Evolutionarily conserved sequences likely have biological function. Methods: To determine whether variation in conserved sequences in non-coding DNA contributes to risk for human disease, we studied six conserved non-coding elements in the Th2 cytokine cluster on human chromosome 5q31 in a large Hutterite pedigree and in samples of outbred European American and African American asthma cases and controls. Results: Among six conserved non-coding elements (>100 bp, >70 percent identity; human-mouse comparison), we identified one single nucleotide polymorphism (SNP) in each of two conserved elements and six SNPs in the flanking regions of three conserved elements. We genotyped our samples for four of these SNPs and an additional three SNPs each in the IL13 and IL4 genes. While there was only modest evidence for association with single SNPs in the Hutterite and European American samples (P<0.05), there were highly significant associations in European Americans between asthma and haplotypes comprised of SNPs in the IL4 gene (P<0.001), including a SNP in a conserved non-coding element. Furthermore, variation in the IL13 gene was strongly associated with total IgE (P = 0.00022) and allergic sensitization to mold allergens (P = 0.00076) in the Hutterites, and more modestly associated with sensitization to molds in the European Americans and African Americans (P<0.01). Conclusion: These results indicate that there is overall little variation in the conserved non-coding elements on 5q31, but variation in IL4 and IL13, including possibly one SNP in a conserved element, influences asthma and atopic phenotypes in diverse populations.

  10. Development of Momentum Conserving Monte Carlo Simulation Code for ECCD Study in Helical Plasmas

    Directory of Open Access Journals (Sweden)

    Murakami S.

    2015-01-01

    A parallel-momentum-conserving collision model is developed for the GNET code, in which a linearized drift kinetic equation is solved in the five-dimensional phase space to study electron cyclotron current drive (ECCD) in helical plasmas. In order to conserve the parallel momentum, we introduce a field particle collision term in addition to the test particle collision term. Two types of field particle collision term are considered. One is the high speed limit model, where the momentum-conserving term does not depend on the velocity of the background plasma and can be expressed in a simple form. The other is the velocity dependent model, which is derived from the Fokker-Planck collision term directly. In the velocity dependent model the field particle operator can be expressed using Legendre polynomials and, introducing the Rosenbluth potential, we derive the field particle term for each Legendre polynomial. In the GNET code, we introduce an iterative process to implement the momentum-conserving collision operator. The high speed limit model is applied to the ECCD simulation of the Heliotron J plasma. The simulation results show good conservation of the momentum with the iterative scheme.
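    The high-speed-limit field-particle term can be pictured as a velocity-independent shift that restores the parallel momentum removed by the test-particle operator. The toy sketch below is only an analogue of that idea (the real GNET scheme applies the correction iteratively inside a five-dimensional Monte Carlo solver; `nu_dt` and the velocities are illustrative):

```python
import random

def collide_momentum_conserving(v_par, nu_dt=0.1, rng=random.Random(42)):
    """One collision step on an ensemble of parallel velocities.
    The test-particle term (drag plus random diffusion) does not conserve
    momentum by itself; the field-particle term, in high-speed-limit form,
    is a uniform velocity-independent shift that restores the deficit."""
    p_before = sum(v_par)
    # test-particle term: slowing-down drag plus random parallel diffusion
    scattered = [v * (1.0 - nu_dt) + nu_dt * rng.gauss(0.0, 1.0) for v in v_par]
    # field-particle term: uniform shift restoring total parallel momentum
    deficit = (p_before - sum(scattered)) / len(v_par)
    return [v + deficit for v in scattered]

v0 = [1.0, -0.5, 2.0, 0.3]
v1 = collide_momentum_conserving(v0)
```

Whatever the random test-particle kicks do, the ensemble's parallel momentum is unchanged after the field-particle correction.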

  12. 78 FR 33838 - DOE Participation in Development of the International Energy Conservation Code

    Science.gov (United States)

    2013-06-05

    ... Code AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Notice and... by the International Code Council (ICC) to develop the International Energy Conservation Code (IECC... on actions taken on DOE's code change proposals and technical analysis at the ICC Committee...

  13. 77 FR 74167 - Information Collection Request: Highly Erodible Land Conservation and Wetland Conservation

    Science.gov (United States)

    2012-12-13

    ... Farm Service Agency Information Collection Request: Highly Erodible Land Conservation and Wetland... associated with Highly Erodible Land Conservation and Wetland Conservation certification requirements. This.... SUPPLEMENTARY INFORMATION: Title: Highly Erodible Land Conservation and Wetland Conservation Certification....

  14. Combinatorial polarization, code loops, and codes of high level

    Directory of Open Access Journals (Sweden)

    Petr Vojtěchovský

    2004-07-01

    Full Text Available We first find the combinatorial degree of any map f:V→F, where F is a finite field and V is a finite-dimensional vector space over F. We then simplify and generalize a certain construction, due to Chein and Goodaire, that was used in characterizing code loops as finite Moufang loops that possess at most two squares. The construction yields binary codes of high divisibility level with prescribed Hamming weights of intersections of codewords.
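    The "level" of a binary code here is the divisibility level: the largest r such that every codeword weight is divisible by 2^r. The sketch below checks this for the extended [8,4] Hamming code, a standard doubly-even example (level 2); it is an illustration of the concept, not the Chein-Goodaire construction itself.

```python
from itertools import product

# Generator matrix of the extended [8,4] Hamming code, a classic
# doubly-even binary code (every codeword weight divisible by 4).
G = [
    [1, 0, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

def codewords(G):
    """Enumerate all codewords as XOR-combinations of generator rows."""
    for msg in product([0, 1], repeat=len(G)):
        cw = [0] * len(G[0])
        for bit, row in zip(msg, G):
            if bit:
                cw = [c ^ r for c, r in zip(cw, row)]
        yield cw

weights = sorted(sum(cw) for cw in codewords(G))
nonzero = [w for w in weights if w]
level = 0
while all(w % (2 ** (level + 1)) == 0 for w in nonzero):
    level += 1  # largest r with every nonzero weight divisible by 2^r
```

All fifteen nonzero codewords have weight 4 or 8, so the code has level 2 (doubly even).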

  15. CSTminer: a web tool for the identification of coding and noncoding conserved sequence tags through cross-species genome comparison.

    Science.gov (United States)

    Castrignanò, Tiziana; Canali, Alessandro; Grillo, Giorgio; Liuni, Sabino; Mignone, Flavio; Pesole, Graziano

    2004-07-01

    The identification and characterization of genome tracts that are highly conserved across species during evolution may contribute significantly to the functional annotation of whole-genome sequences. Indeed, such sequences are likely to correspond to known or unknown coding exons or regulatory motifs. Here, we present a web server implementing a previously developed algorithm that, by comparing user-submitted genome sequences, is able to identify statistically significant conserved blocks and assess their coding or noncoding nature through the measure of a coding potential score. The web tool, available at http://www.caspur.it/CSTminer/, is dynamically interconnected with the Ensembl genome resources and produces a graphical output showing a map of detected conserved sequences and annotated gene features.
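    The core operation, locating conserved blocks in a cross-species comparison, can be sketched with a sliding identity window over a pairwise alignment. CSTminer itself assesses blocks for statistical significance and scores coding potential rather than using a fixed window; `window` and `min_identity` below are illustrative parameters.

```python
def conserved_blocks(a, b, window=10, min_identity=0.9):
    """Slide a fixed window over an ungapped pairwise alignment and
    merge overlapping windows whose identity meets the threshold,
    returning half-open (start, end) block coordinates."""
    match = [x == y for x, y in zip(a, b)]
    blocks = []
    for i in range(len(match) - window + 1):
        if sum(match[i:i + window]) >= min_identity * window:
            if blocks and i <= blocks[-1][1]:
                blocks[-1] = (blocks[-1][0], i + window)  # extend/merge
            else:
                blocks.append((i, i + window))
    return blocks

# Divergent flanks around a 20-bp conserved core:
blocks = conserved_blocks("A" * 40, "T" * 5 + "A" * 20 + "T" * 15)
```

Each detected block would then be scored (e.g., for codon-structure signals) to label it coding or non-coding.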

  16. Discussion on the Energy Conservation across a Sharp Gradient Junction in SPACE-CAP Code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon Joon; Choo, Yeon Joon; Hwang, Su Hyun; Kim, Min Gi; Lee, Byung Chul [Seoul National University, Seoul (Korea, Republic of)

    2011-05-15

    The SPACE code for RCS (Reactor Coolant System) analysis and the CAP code for containment analysis are now under V&V (Verification and Validation). The CAP code has undergone, or will undergo, test problems in the following categories: 1) fundamental phenomena; 2) principle phenomena (mixing and transport) and components in containment; 3) demonstration tests with small, middle, and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. CAP V&V is now in category 3 above. The most important requirement for the CAP code at this time is the capability for containment pressure and temperature analysis. Thus, V&V of thermodynamics problems and energy conservation is extremely important. Energy conservation should at times be carefully examined in the case of a sharp gradient across a junction when the energy equation is based on the specific internal energy. This paper discusses energy conservation across a sharp-gradient junction.

  17. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. Regulation of new building construction to assure energy conservation is recognized as a subject to which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state of the art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy-efficiency standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  18. Conservation of the Exon-Intron Structure of Long Intergenic Non-Coding RNA Genes in Eutherian Mammals

    Directory of Open Access Journals (Sweden)

    Diana Chernikova

    2016-07-01

    The abundance of mammalian long intergenic non-coding RNA (lincRNA) genes is high, yet their functions remain largely unknown. One possible way to study this important question is to use large-scale comparisons of various characteristics of lincRNAs with those of protein-coding genes, for which a large body of functional information is available. A prominent feature of mammalian protein-coding genes is the high evolutionary conservation of the exon-intron structure. Comparative analysis of putative intron positions in lincRNA genes from various mammalian genomes suggests that some lincRNA introns have been conserved for over 100 million years; thus, the primary and/or secondary structure of these molecules is likely to be functionally important.

  19. Genetic evidence for conserved non-coding element function across species--the ears have it

    Directory of Open Access Journals (Sweden)

    Eric E Turner

    2014-01-01

    Comparison of genomic sequences from diverse vertebrate species has revealed numerous highly conserved regions that do not appear to encode proteins or functional RNAs. Often these conserved non-coding elements, or CNEs, direct gene expression to specific tissues in transgenic models, demonstrating that they have regulatory function. CNEs are frequently found near 'developmental' genes, particularly transcription factors, implying that these elements have essential regulatory roles in development. However, actual examples demonstrating CNE regulatory functions across species have been few, and recent loss-of-function studies of several CNEs in mice have shown relatively minor effects. In this Perspectives article, we discuss new findings in fancy rats and Highland cattle demonstrating that a CNE near the Hmx1 gene is crucial for normal external ear development, its loss resembling the effect of loss-of-function Hmx1 coding mutations in mice and humans. These findings provide important support for similar developmental roles of CNEs in divergent species, and reinforce the concept that CNEs should be examined systematically in the ongoing search for genetic causes of human developmental disorders in the era of genome-scale sequencing.

  20. The Histone Code of Toxoplasma gondii Comprises Conserved and Unique Posttranslational Modifications

    Science.gov (United States)

    Nardelli, Sheila C.; Che, Fa-Yun; Silmon de Monerri, Natalie C.; Xiao, Hui; Nieves, Edward; Madrid-Aliste, Carlos; Angel, Sergio O.; Sullivan, William J.; Angeletti, Ruth H.; Kim, Kami; Weiss, Louis M.

    2013-01-01

    Epigenetic gene regulation has emerged as a major mechanism for gene regulation in all eukaryotes. Histones are small, basic proteins that constitute the major protein component of chromatin, and posttranslational modifications (PTM) of histones are essential for epigenetic gene regulation. The different combinations of histone PTM form the histone code for an organism, marking functional units of chromatin that recruit macromolecular complexes that govern chromatin structure and regulate gene expression. To characterize the repertoire of Toxoplasma gondii histone PTM, we enriched histones using standard acid extraction protocols and analyzed them with several complementary middle-down and bottom-up proteomic approaches with the high-resolution Orbitrap mass spectrometer using collision-induced dissociation (CID), higher-energy collisional dissociation (HCD), and/or electron transfer dissociation (ETD) fragmentation. We identified 249 peptides with unique combinations of PTM that comprise the T. gondii histone code. T. gondii histones share a high degree of sequence conservation with human histones, and many modifications are conserved between these species. In addition, T. gondii histones have unique modifications not previously identified in other species. Finally, T. gondii histones are modified by succinylation, propionylation, and formylation, recently described histone PTM that have not previously been identified in parasitic protozoa. The characterization of the T. gondii histone code will facilitate in-depth analysis of how epigenetic regulation affects gene expression in pathogenic apicomplexan parasites and identify a new model system for elucidating the biological functions of novel histone PTM. PMID:24327343

  1. Divergence of conserved non-coding sequences: rate estimates and relative rate tests.

    Science.gov (United States)

    Wagner, Günter P; Fried, Claudia; Prohaska, Sonja J; Stadler, Peter F

    2004-11-01

    In many eukaryotic genomes only a small fraction of the DNA codes for proteins, but the non-protein-coding DNA harbors important genetic elements directing the development and the physiology of the organisms, such as promoters, enhancers, insulators, and microRNA genes. The molecular evolution of these genetic elements is difficult to study because their functional significance is hard to deduce from sequence information alone. Here we propose an approach to the study of the rate of evolution of functional non-coding sequences at a macro-evolutionary scale. We identify functionally important non-coding sequences as Conserved Non-Coding Nucleotide (CNCN) sequences from the comparison of two outgroup species. The CNCN sequences so identified are then compared to their homologous sequences in a pair of ingroup species, and we monitor the degree of modification these sequences suffered in the two ingroup lineages. We propose a method to test for rate differences in the modification of CNCN sequences between the two ingroup lineages, as well as a method to estimate their rate of modification. We apply this method to the full sequences of the HoxA clusters from six gnathostome species: a shark, Heterodontus francisci; a basal ray-finned fish, Polypterus senegalus; the amphibian Xenopus tropicalis; and three mammalian species: human, rat, and mouse. The results show that the evolutionary rate of CNCN sequences is not distinguishable among the three mammalian lineages, while the Xenopus lineage has a significantly increased rate of evolution. Furthermore, the estimates of the rate parameters suggest that in the stem lineage of mammals the rate of CNCN sequence evolution was more than twice the rate observed within the placental amniotes clade, suggesting a high rate of evolution of cis-regulatory elements during the origin of amniotes and mammals.
We conclude that the proposed methods can be used for testing hypotheses about the rate and pattern of evolution of putative
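    The logic of a relative rate test can be illustrated with a simple site-count version: changes specific to one ingroup lineage, relative to a conserved reference, should split evenly between the two lineages under equal rates, and an exact binomial test flags an excess in one of them. This is only a sketch of the idea with made-up sequences; the authors' actual estimator is model-based.

```python
from math import comb

def relative_rate_test(ref, ingroup1, ingroup2):
    """Sites where exactly one ingroup differs from the conserved
    reference are assigned to that lineage; under equal rates the two
    counts are Binomial(n1 + n2, 1/2).
    Returns (n1, n2, two-sided exact p-value)."""
    n1 = n2 = 0
    for r, x, y in zip(ref, ingroup1, ingroup2):
        if x != r and y == r:
            n1 += 1
        elif y != r and x == r:
            n2 += 1
    n, hi = n1 + n2, max(n1, n2)
    p_one = sum(comb(n, k) for k in range(hi, n + 1)) / 2 ** n
    return n1, n2, min(1.0, 2.0 * p_one)

# Six lineage-1 changes vs. one lineage-2 change over 20 conserved sites:
n1, n2, p = relative_rate_test("A" * 20,
                               "T" * 6 + "A" * 14,
                               "A" * 19 + "T")
```

A small p-value would indicate that one lineage (here the first) modified its conserved non-coding sites faster than the other.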

  2. Guide to the Changes between the 2009 and 2012 International Energy Conservation Code

    Energy Technology Data Exchange (ETDEWEB)

    Mapes, Terry S.; Conover, David R.

    2012-05-31

    The International Code Council (ICC) published the 2012 International Energy Conservation Code® (IECC) in early 2012. The 2012 IECC is based on revisions, additions, and deletions to the 2009 IECC that were considered during the ICC code development process conducted in 2011. Solid vertical lines, arrows, or asterisks printed in the 2012 IECC indicate where revisions, deletions, or relocations of text, respectively, were made relative to the 2009 IECC. Although these marginal markings indicate where changes have been made to the code, they do not provide any further guidance, leaving the reader to consult and compare the 2009 and 2012 IECC for more detail.

  3. Does high harmonic generation conserve angular momentum?

    CERN Document Server

    Fleischer, Avner; Diskin, Tzvi; Sidorenko, Pavel; Cohen, Oren

    2013-01-01

    High harmonic generation (HHG) is a unique and useful process in which infrared or visible radiation is frequency up-converted into the extreme ultraviolet and X-ray spectral regions. As a parametric process, high harmonic generation should conserve the radiation's energy, momentum, and angular momentum. Indeed, conservation of energy and momentum has been demonstrated. The angular momentum of optical beams can be divided into two components: orbital and spin (polarization). Orbital angular momentum is assumed to be conserved, and recently observed deviations were attributed to propagation effects. On the other hand, conservation of spin angular momentum has thus far never been studied, either experimentally or theoretically. Here, we present the first study of the role of spin angular momentum in extreme nonlinear optics by experimentally generating high harmonics of bichromatic elliptically polarized pump beams that interact with isotropic media. While observing that the selection rules qualitatively correspond...

  4. Exploring function of conserved non-coding DNA in its chromosomal context

    Directory of Open Access Journals (Sweden)

    Delores J. Grant

    2015-11-01

    There is renewed interest in understanding expression of vertebrate genes in their chromosomal context because regulatory sequences that confer tissue-specific expression are often distributed over large distances along the DNA from the gene. One approach inserts a universal sensor/reporter-gene into the mouse or zebrafish genome to identify regulatory sequences in highly conserved non-coding DNA in the vicinity of the integrated reporter-gene. However, detailed mechanisms of interaction of these regulatory elements among themselves and/or with the genes they influence remain elusive with this strategy. The inability to associate distant regulatory elements with the genes they regulate makes it difficult to examine the contribution of sequence changes in regulatory DNA to human disease. Such associations have been obtained in favorable circumstances by testing the regulatory potential of highly conserved non-coding DNA individually in small reporter-gene-containing plasmids. Alternative approaches use tiny fragments of chromosomes in Bacterial Artificial Chromosomes (BACs), where the gene of interest is tagged in vitro with a reporter/sensor gene and integrated into the germ-line of animals for expression. Mutational analysis of the BAC DNA identifies regulatory sequences. A recent approach inserts a sensor/reporter-gene into a BAC that is also truncated progressively from an end of the genomic insert, and the end-deleted BAC carrying the sensor is then integrated into the genome of a developing animal for expression. The approach allows mechanisms of tissue-specific gene expression to be explored in much greater detail, although the chromosomal context of such mechanisms is limited to the length of the BAC.
Here we discuss the relative strengths of the various approaches and explore how the integrated-sensor in the BACs method applied to a contig of BACs spanning a chromosomal region is likely to address mechanistic questions on interactions between

  5. Energy Efficiency Pilot Projects in Jaipur: Testing the Energy Conservation Building Code

    Energy Technology Data Exchange (ETDEWEB)

    Evans, Meredydd; Mathur, Jyotirmay; Yu, Sha

    2014-03-26

    The Malaviya National Institute of Technology (MNIT) in Jaipur, India is constructing two new buildings on its campus that allow it to test implementation of the Energy Conservation Building Code (ECBC), which Rajasthan made mandatory in 2011. PNNL has been working with MNIT to document progress on ECBC implementation in these buildings.

  6. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and side information, Y, such that the marginal probability of each symbol, Xi in...
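    Syndrome-based Slepian-Wolf coding, which a rate-adaptive BCH scheme generalizes, can be sketched with a [7,4] Hamming code: the encoder transmits only the 3-bit syndrome of X, and the decoder combines it with the correlated side information Y (here assumed to differ from X in at most one bit) to recover X exactly. This is a minimal illustration, not the paper's rate-adaptive feedback protocol.

```python
# Parity-check matrix of the [7,4] Hamming code; column i is the binary
# expansion of i + 1, so a nonzero syndrome reads off the error position.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(bits):
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def sw_decode(y, s_x):
    """Recover x from side information y plus x's syndrome, assuming x
    and y differ in at most one bit: the XOR of the two syndromes is the
    syndrome of the difference pattern x ^ y."""
    s_e = [a ^ b for a, b in zip(syndrome(y), s_x)]
    x = list(y)
    if any(s_e):
        pos = int("".join(map(str, s_e)), 2) - 1  # column index of the flip
        x[pos] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]             # correlated side information (one flip)
recovered = sw_decode(y, syndrome(x))  # only 3 syndrome bits were "sent"
```

Sending 3 bits instead of 7 is the compression gain; with feedback, a rate-adaptive code would transmit additional syndrome bits only when the correlation turns out to be weaker.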

  7. Forest conservation delivers highly variable coral reef conservation outcomes.

    Science.gov (United States)

    Klein, Carissa J; Jupiter, Stacy D; Selig, Elizabeth R; Watts, Matthew E; Halpern, Benjamin S; Kamal, Muhammad; Roelfsema, Chris; Possingham, Hugh P

    2012-06-01

    Coral reefs are threatened by human activities on both the land (e.g., deforestation) and the sea (e.g., overfishing). Most conservation planning for coral reefs focuses on removing threats in the sea, neglecting management actions on the land. A more integrated approach to coral reef conservation, inclusive of land-sea connections, requires an understanding of how and where terrestrial conservation actions influence reefs. We address this by developing a land-sea planning approach to inform fine-scale spatial management decisions and test it in Fiji. Our aim is to determine where the protection of forest can deliver the greatest return on investment for coral reef ecosystems. To assess the benefits of conservation to coral reefs, we estimate their relative condition as influenced by watershed-based pollution and fishing. We calculate the cost-effectiveness of protecting forest and find that investments deliver rapidly diminishing returns for improvements to relative reef condition. For example, protecting 2% of forest in one area is almost 500 times more beneficial than protecting 2% in another area, making prioritization essential. For the scenarios evaluated, relative coral reef condition could be improved by 8-58% if all remnant forest in Fiji were protected rather than deforested. Finally, we determine the priority of each coral reef for implementing a marine protected area when all remnant forest is protected for conservation. The general results will support decisions made by the Fiji Protected Area Committee as they establish a national protected area network that aims to protect 20% of the land and 30% of the inshore waters by 2020. Although challenges remain, we can inform conservation decisions around the globe by tackling the complex issues relevant to integrated land-sea planning.
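    The prioritization logic behind the diminishing-returns result, fund parcels in order of marginal benefit per unit cost until the budget is exhausted, can be sketched as a greedy ranking. The parcel names, costs, and benefits below are made up for illustration and are not the Fiji data.

```python
def prioritize(parcels, budget):
    """Greedy cost-effectiveness ranking: repeatedly fund the affordable
    parcel with the highest benefit per unit cost."""
    chosen, total = [], 0.0
    for p in sorted(parcels, key=lambda p: p["benefit"] / p["cost"],
                    reverse=True):
        if p["cost"] <= budget:
            budget -= p["cost"]
            chosen.append(p["name"])
            total += p["benefit"]
    return chosen, total

# Hypothetical parcels illustrating steep differences in marginal benefit:
chosen, total = prioritize(
    [{"name": "A", "cost": 1.0, "benefit": 10.0},
     {"name": "B", "cost": 2.0, "benefit": 4.0},
     {"name": "C", "cost": 1.0, "benefit": 0.02}],
    budget=2.0)
```

The hundreds-fold spread in benefit-per-cost between the best and worst parcels is what makes prioritization essential in the study.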

  8. Redefining Secondary Forests in the Mexican Forest Code: Implications for Management, Restoration, and Conservation

    Directory of Open Access Journals (Sweden)

    Francisco J. Román-Dañobeytia

    2014-05-01

Full Text Available The Mexican Forest Code establishes structural reference values to differentiate between secondary and old-growth forests and requires a management plan when secondary forests become old-growth and potentially harvestable forests. The implications of this regulation for forest management, restoration, and conservation were assessed in the context of the Calakmul Biosphere Reserve, which is located in the Yucatan Peninsula. The basal area and stem density thresholds currently used by the legislation to differentiate old-growth from secondary forests are 4 m2/ha and 15 trees/ha (trees with a diameter at breast height of >25 cm); however, our research indicates that these values should be increased to 20 m2/ha and 100 trees/ha, respectively. Given that a management plan is required when secondary forests become old-growth forests, many landowners avoid forest-stand development by engaging in slash-and-burn agriculture or cattle grazing. We present evidence that deforestation and land degradation may prevent the natural regeneration of late-successional tree species of high ecological and economic importance. Moreover, we discuss the results of this study in the light of an ongoing debate in the Yucatan Peninsula between policy makers, non-governmental organizations (NGOs), landowners, and researchers regarding the modification of this regulation to redefine the concept of acahual (secondary forest) and to facilitate forest management and restoration with valuable timber tree species.

  9. Improving conservation properties of a 5D gyrokinetic semi-Lagrangian code

    Science.gov (United States)

    Latu, Guillaume; Grandgirard, Virginie; Abiteboul, Jérémie; Crouseilles, Nicolas; Dif-Pradalier, Guilhem; Garbet, Xavier; Ghendrih, Philippe; Mehrenberger, Michel; Sarazin, Yanick; Sonnendrücker, Eric

    2014-11-01

In gyrokinetic turbulence simulations, knowledge of certain stationary states can help reduce numerical artifacts. In long-term simulations, the quality of the Vlasov solver and of the radial boundary conditions has an impact on the conservation properties. To improve mass and energy conservation in particular, the following methods are investigated: fixing the radial boundary conditions on a stationary state, using a 4D advection operator that avoids directional splitting, and interpolating with a delta-f approach. The combination of these techniques in the semi-Lagrangian code gysela leads to a net improvement of the conservation properties in 5D simulations. Contribution to the Topical Issue "Theory and Applications of the Vlasov Equation", edited by Francesco Pegoraro, Francesco Califano, Giovanni Manfredi and Philip J. Morrison.
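The conservation issue above can be illustrated with a minimal 1D periodic semi-Lagrangian advection step using linear interpolation. This is a toy sketch, not the gysela discretization; the grid size, shift, and initial bump are illustrative assumptions.

```python
import math

def semi_lagrangian_step(f, shift):
    """One backward semi-Lagrangian step on a periodic grid: the foot of
    the characteristic for node i sits at i - shift (in grid units), and
    the value there is found by linear interpolation between neighbors."""
    n = len(f)
    out = []
    for i in range(n):
        x = i - shift                    # departure point, grid units
        j = math.floor(x)
        a = x - j                        # interpolation weight in [0, 1)
        out.append((1.0 - a) * f[j % n] + a * f[(j + 1) % n])
    return out

# Advect a smooth bump by a non-integer fraction of a cell.
f0 = [math.exp(-((i - 16) / 4.0) ** 2) for i in range(64)]
f1 = semi_lagrangian_step(f0, 0.3)

# Mass diagnostic: with periodic boundaries and a uniform shift, the
# discrete mass (sum of nodal values) is conserved to round-off.
print(abs(sum(f1) - sum(f0)) < 1e-9)   # True
```

With non-periodic radial boundaries or a spatially varying shift this exact cancellation is lost, which is why the choice of radial boundary conditions and of the advection operator affects the conservation properties discussed in the record.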

  10. Identification of evolutionarily conserved non-AUG-initiated N-terminal extensions in human coding sequences.

    LENUS (Irish Health Repository)

    Ivanov, Ivaylo P

    2011-05-01

In eukaryotes, it is generally assumed that translation initiation occurs at the AUG codon closest to the messenger RNA 5' cap. However, in certain cases, initiation can occur at codons differing from AUG by a single nucleotide, especially the codons CUG, UUG, GUG, ACG, AUA and AUU. While non-AUG initiation has been experimentally verified for a handful of human genes, the full extent to which this phenomenon is utilized--both for increased coding capacity and potentially also for novel regulatory mechanisms--remains unclear. To address this issue, and hence to improve the quality of existing coding sequence annotations, we developed a methodology based on phylogenetic analysis of predicted 5' untranslated regions from orthologous genes. We use evolutionary signatures of protein-coding sequences as an indicator of translation initiation upstream of annotated coding sequences. Our search identified novel conserved potential non-AUG-initiated N-terminal extensions in 42 human genes including VANGL2, FGFR1, KCNN4, TRPV6, HDGF, CITED2, EIF4G3 and NTF3, and also affirmed the conservation of known non-AUG-initiated extensions in 17 other genes. In several instances, we have been able to obtain independent experimental evidence of the expression of non-AUG-initiated products from the previously published literature and ribosome profiling data.
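The near-cognate codons named above are exactly the codons differing from AUG at a single position, which can be enumerated mechanically; a quick illustrative check (function name is arbitrary):

```python
from itertools import product

def near_cognate_codons(start="AUG", alphabet="ACGU"):
    """All codons that differ from the canonical start codon at exactly
    one of the three positions."""
    return sorted(
        "".join(c) for c in product(alphabet, repeat=3)
        if sum(a != b for a, b in zip(c, start)) == 1
    )

nccs = near_cognate_codons()
print(len(nccs))   # 9: three positions x three alternative bases
print(nccs)
```

The six codons highlighted in the abstract (CUG, UUG, GUG, ACG, AUA, AUU) are among the nine; the remainder (AAG, AGG, AUC) are near-cognate by the same one-mismatch definition but initiate less efficiently in most reported systems.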

  11. Long non-coding RNA discovery across the genus anopheles reveals conserved secondary structures within and beyond the Gambiae complex.

    Science.gov (United States)

    Jenkins, Adam M; Waterhouse, Robert M; Muskavitch, Marc A T

    2015-04-23

    Long non-coding RNAs (lncRNAs) have been defined as mRNA-like transcripts longer than 200 nucleotides that lack significant protein-coding potential, and many of them constitute scaffolds for ribonucleoprotein complexes with critical roles in epigenetic regulation. Various lncRNAs have been implicated in the modulation of chromatin structure, transcriptional and post-transcriptional gene regulation, and regulation of genomic stability in mammals, Caenorhabditis elegans, and Drosophila melanogaster. The purpose of this study is to identify the lncRNA landscape in the malaria vector An. gambiae and assess the evolutionary conservation of lncRNAs and their secondary structures across the Anopheles genus. Using deep RNA sequencing of multiple Anopheles gambiae life stages, we have identified 2,949 lncRNAs and more than 300 previously unannotated putative protein-coding genes. The lncRNAs exhibit differential expression profiles across life stages and adult genders. We find that across the genus Anopheles, lncRNAs display much lower sequence conservation than protein-coding genes. Additionally, we find that lncRNA secondary structure is highly conserved within the Gambiae complex, but diverges rapidly across the rest of the genus Anopheles. This study offers one of the first lncRNA secondary structure analyses in vector insects. Our description of lncRNAs in An. gambiae offers the most comprehensive genome-wide insights to date into lncRNAs in this vector mosquito, and defines a set of potential targets for the development of vector-based interventions that may further curb the human malaria burden in disease-endemic countries.
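As a minimal sketch of the lncRNA definition above (mRNA-like, longer than 200 nt, lacking significant protein-coding potential), one can filter transcripts on length and longest open reading frame. The 100-codon ORF cutoff is an illustrative stand-in for the coding-potential measures actually used in such studies.

```python
import re

def longest_orf_codons(seq):
    """Longest AUG-to-stop ORF, in codons, across the three forward frames."""
    best = 0
    for frame in range(3):
        codons = re.findall("...", seq[frame:])   # non-overlapping triplets
        start = None
        for i, c in enumerate(codons):
            if c == "AUG" and start is None:
                start = i
            elif c in ("UAA", "UAG", "UGA") and start is not None:
                best = max(best, i - start)
                start = None
    return best

def is_lncrna_candidate(seq, min_len=200, max_orf=100):
    """Length above min_len and no ORF reaching the coding-potential cutoff."""
    return len(seq) > min_len and longest_orf_codons(seq) < max_orf

putative = "AUG" + "GCA" * 80 + "UAA"    # 246 nt, longest ORF 81 codons
coding   = "AUG" + "GCA" * 150 + "UAA"   # 456 nt, longest ORF 151 codons
print(is_lncrna_candidate(putative), is_lncrna_candidate(coding))  # True False
```

Real pipelines additionally test homology to known proteins and codon substitution patterns, but the length-plus-ORF filter above captures the core of the operational definition.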

  12. Analysis Code for High Gradient Dielectric Insulator Surface Breakdown

    Energy Technology Data Exchange (ETDEWEB)

Ives, Robert Lawrence [Calabazas Creek Research, Inc.]; Verboncoeur, John [University of California, Berkeley]; Aldan, Manuel [University of California, Berkeley]

    2010-05-30

    High voltage (HV) insulators are critical components in high-energy, accelerator and pulsed power systems that drive diverse applications in the national security, nuclear weapons science, defense and industrial arenas. In these systems, the insulator may separate vacuum/non-vacuum regions or conductors with high electrical field gradients. These insulators will often fail at electric fields over an order of magnitude lower than their intrinsic dielectric strength due to flashover at the dielectric interface. Decades of studies have produced a wealth of information on fundamental processes and mechanisms important for flashover initiation, but only for relatively simple insulator configurations in controlled environments. Accelerator and pulsed power system designers are faced with applying the fundamental knowledge to complex, operational devices with escalating HV requirements. Designers are forced to rely on “best practices” and expensive prototype testing, providing boundaries for successful operation. However, the safety margin is difficult to estimate, and system design must be very conservative for situations where testing is not practicable, or replacement of failed parts is disruptive or expensive. The Phase I program demonstrated the feasibility of developing an advanced code for modeling insulator breakdown. Such a code would be of great interest for a number of applications, including high energy physics, microwave source development, fusion sciences, and other research and industrial applications using high voltage devices.

  13. Evolutionary growth process of highly conserved sequences in vertebrate genomes.

    Science.gov (United States)

    Ishibashi, Minaka; Noda, Akiko Ogura; Sakate, Ryuichi; Imanishi, Tadashi

    2012-08-01

Genome sequence comparison between evolutionarily distant species revealed ultraconserved elements (UCEs) among mammals under strong purifying selection. Most of them were also conserved among vertebrates. Because they tend to be located in the flanking regions of developmental genes, they likely have fundamental roles in creating vertebrate body plans. However, the evolutionary origin and selection mechanism of these UCEs remain unclear. Here we report that UCEs arose in primitive vertebrates and grew gradually during vertebrate evolution. We searched for UCEs in two teleost fishes, Tetraodon nigroviridis and Oryzias latipes, and found 554 UCEs with 100% identity over 100 bp. Comparison of teleost and mammalian UCEs revealed 43 pairs of common, jawed-vertebrate UCEs (jUCEs) with high sequence identities, ranging from 83.1% to 99.2%. Ten of them retain lower similarities to the Petromyzon marinus genome, and the substitution rates of four non-exonic jUCEs were reduced after the teleost-mammal divergence, suggesting that robust conservation was acquired in the jawed vertebrate lineage. Our results indicate that prototypical UCEs originated before the divergence of jawed and jawless vertebrates and have been frozen as perfectly conserved sequences in the jawed vertebrate lineage. In addition, our comparative sequence analyses of UCEs and neighboring regions resulted in the discovery of lineage-specific conserved sequences. They were added progressively to prototypical UCEs, suggesting step-wise acquisition of novel regulatory roles. Our results indicate that conserved non-coding elements (CNEs) consist of blocks with distinct evolutionary histories, each having been frozen in a different evolutionary era along the vertebrate lineage. Copyright © 2012 Elsevier B.V. All rights reserved.
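The UCE criterion above (100% identity over at least 100 bp) amounts to scanning a pairwise alignment for maximal gap-free runs of identical bases; a toy sketch with illustrative sequences, not the actual Tetraodon/Oryzias data:

```python
def perfect_runs(seq_a, seq_b, min_len=100):
    """Return (start, length) of maximal identical, gap-free runs between
    two equal-length aligned sequences that reach min_len."""
    assert len(seq_a) == len(seq_b)
    runs, start = [], None
    for i, (a, b) in enumerate(zip(seq_a, seq_b)):
        if a == b and a != "-":          # extend a perfectly matching run
            if start is None:
                start = i
        else:                            # mismatch or gap ends the run
            if start is not None and i - start >= min_len:
                runs.append((start, i - start))
            start = None
    if start is not None and len(seq_a) - start >= min_len:
        runs.append((start, len(seq_a) - start))
    return runs

# 120 identical bases, one mismatch, then a 50-base run (too short).
human = "A" * 120 + "C" + "G" * 50
fish  = "A" * 120 + "T" + "G" * 50
print(perfect_runs(human, fish))   # [(0, 120)]
```

Genome-scale searches of course operate on whole-genome alignments rather than toy strings, but the run-detection logic is the same.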

  14. Non-conservative evolution in short-period interacting binaries with the BINSTAR code

    Science.gov (United States)

    Deschamps, Romain; Siess, Lionel; Braun, Killian; Jorissen, Alain; Davis, Philip

    2014-09-01

Systemic mass loss in interacting binaries such as those of the Algol type has been inferred since the 1950s. There is indeed gathering indirect evidence that some Algols follow non-conservative evolution, but there is still no direct detection of large mass outflows. As a result, little is known about the possible ejection mechanism, the total amount of mass ejected, or the specific angular momentum carried away by this outflow. To reconcile stellar models and observations, we compute Algol models with the state-of-the-art binary star evolution code BINSTAR. We investigate systemic mass loss within the hotspot paradigm, where large outflows of material form from the accretion impact during the mass-transfer phase. We then study the impact of this outflow on the spectral energy distribution of the system with the radiative transfer codes CLOUDY and SKIRT.

  15. Conservation and losses of non-coding RNAs in avian genomes.

    Directory of Open Access Journals (Sweden)

    Paul P Gardner

Full Text Available Here we present the results of a large-scale bioinformatics annotation of non-coding RNA loci in 48 avian genomes. Our approach uses probabilistic models of hand-curated families from the Rfam database to infer conserved RNA families within each avian genome. We supplement these annotations with predictions from the tRNA annotation tool tRNAscan-SE and microRNAs from miRBase. We identify 34 lncRNA-associated loci that are conserved between birds and mammals and validate 12 of these in chicken. We report several intriguing cases where a reported mammalian lncRNA, but not its function, is conserved. We also demonstrate extensive conservation of classical ncRNAs (e.g., tRNAs) and more recently discovered ncRNAs (e.g., snoRNAs and miRNAs) in birds. Furthermore, we describe numerous "losses" of several RNA families, and attribute these to either genuine loss, divergence, or missing data. In particular, we show that many of these losses are due to the challenges associated with assembling avian microchromosomes. These combined results illustrate the utility of applying homology-based methods for annotating novel vertebrate genomes.

  16. Complexity-aware high efficiency video coding

    CERN Document Server

    Correa, Guilherme; Agostini, Luciano; Cruz, Luis A da Silva

    2016-01-01

This book discusses the computational complexity of High Efficiency Video Coding (HEVC) encoders, with coverage extending from the analysis of HEVC compression efficiency and computational complexity to the reduction and scaling of its encoding complexity. After an introduction to the topic and a review of the state-of-the-art research in the field, the authors provide a detailed analysis of the compression efficiency and computational complexity of the HEVC encoding tools. Readers will benefit from a set of algorithms for scaling the computational complexity of HEVC encoders, all of which take advantage of the flexibility of the frame partitioning structures allowed by the standard. The authors also provide a set of early termination methods based on data mining and machine learning techniques, which are able to reduce the computational complexity required to find the best frame partitioning structures. The applicability of the proposed methods is finally exemplified with an encoding time control system that emplo...

  17. Variation in conserved non-coding sequences on chromosome 5q and susceptibility to asthma and atopy

    Directory of Open Access Journals (Sweden)

    Dubchak Inna

    2005-12-01

Full Text Available Abstract Background Evolutionarily conserved sequences likely have biological function. Methods To determine whether variation in conserved sequences in non-coding DNA contributes to risk for human disease, we studied six conserved non-coding elements in the Th2 cytokine cluster on human chromosome 5q31 in a large Hutterite pedigree and in samples of outbred European American and African American asthma cases and controls. Results Among six conserved non-coding elements (>100 bp, >70% identity; human-mouse comparison), we identified one single nucleotide polymorphism (SNP) in each of two conserved elements and six SNPs in the flanking regions of three conserved elements. We genotyped our samples for four of these SNPs and an additional three SNPs each in the IL13 and IL4 genes. While there was only modest evidence for association with single SNPs in the Hutterite and European American samples (P IL4 gene (P IL13 gene was strongly associated with total IgE (P = 0.00022) and allergic sensitization to mold allergens (P = 0.00076) in the Hutterites, and more modestly associated with sensitization to molds in the European Americans and African Americans (P Conclusion These results indicate that there is overall little variation in the conserved non-coding elements on 5q31, but variation in IL4 and IL13, including possibly one SNP in a conserved element, influences asthma and atopic phenotypes in diverse populations.
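The conserved-element criterion above (>100 bp at >70% identity in a human-mouse comparison) can be sketched as a sliding-window identity scan over a gapless pairwise alignment; the window size and toy sequences are illustrative assumptions:

```python
def conserved_windows(seq_a, seq_b, win=100, min_ident=0.70):
    """Return (start, identity) for every window of length win whose
    fraction of matching positions exceeds min_ident."""
    hits = []
    for i in range(len(seq_a) - win + 1):
        matches = sum(a == b for a, b in zip(seq_a[i:i + win], seq_b[i:i + win]))
        if matches / win > min_ident:
            hits.append((i, matches / win))
    return hits

# Toy alignment: every fourth base differs, so identity is 75% everywhere.
human = "ACGT" * 50    # 200 bp
mouse = "ACGA" * 50
hits = conserved_windows(human, mouse)
print(len(hits))       # 101 windows, all at 75% identity (> 70%)
```

In practice overlapping hits would be merged into maximal elements and gaps handled explicitly, but the thresholding step is as shown.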

  18. CONDOR: a database resource of developmentally associated conserved non-coding elements

    Directory of Open Access Journals (Sweden)

    Smith Sarah

    2007-08-01

Full Text Available Abstract Background Comparative genomics is currently one of the most popular approaches to study the regulatory architecture of vertebrate genomes. Fish-mammal genomic comparisons have proved powerful in identifying conserved non-coding elements likely to be distal cis-regulatory modules such as enhancers, silencers or insulators that control the expression of genes involved in the regulation of early development. The scientific community is showing increasing interest in characterizing the function, evolution and language of these sequences. Despite this, there remains little in the way of user-friendly access to a large dataset of such elements in conjunction with the analysis and the visualization tools needed to study them. Description Here we present CONDOR (COnserved Non-coDing Orthologous Regions), available at: http://condor.fugu.biology.qmul.ac.uk. In an interactive and intuitive way the website displays data on > 6800 non-coding elements associated with over 120 early developmental genes and conserved across vertebrates. The database regularly incorporates results of ongoing in vivo zebrafish enhancer assays of the CNEs carried out in-house, which currently number ~100. Included and highlighted within this set are elements derived from duplication events both at the origin of vertebrates and more recently in the teleost lineage, thus providing valuable data for studying the divergence of regulatory roles between paralogs. CONDOR therefore provides a number of tools and facilities to allow scientists to progress in their own studies on the function and evolution of developmental cis-regulation. Conclusion By providing access to data with an approachable graphics interface, the CONDOR database presents a rich resource for further studies into the regulation and evolution of genes involved in early development.

  19. Chromosome conformation capture uncovers potential genome-wide interactions between human conserved non-coding sequences.

    Directory of Open Access Journals (Sweden)

    Daniel Robyr

Full Text Available Comparative analyses of various mammalian genomes have identified numerous conserved non-coding (CNC) DNA elements that display striking conservation among species, suggesting that they have maintained specific functions throughout evolution. CNC function remains poorly understood, although recent studies have identified a role in gene regulation. We hypothesized that the identification of genomic loci that interact physically with CNCs would provide information on their functions. We have used circular chromosome conformation capture (4C) to characterize interactions of 10 CNCs from human chromosome 21 in K562 cells. The data provide evidence that CNCs are capable of interacting with loci that are enriched for CNCs. The number of trans interactions varies among CNCs; some show interactions with many loci, while others interact with few. Some of the tested CNCs are capable of driving the expression of a reporter gene in the mouse embryo, and associate with the oligodendrocyte genes OLIG1 and OLIG2. Our results underscore the power of chromosome conformation capture for the identification of targets of functional DNA elements and raise the possibility that CNCs exert their functions by physical association with defined genomic regions enriched in CNCs. These CNC-CNC interactions may in part explain their stringent conservation as a group of regulatory sequences.

  20. 76 FR 82075 - Highly Erodible Land and Wetland Conservation

    Science.gov (United States)

    2011-12-30

    ... Secretary 7 CFR Part 12 RIN 0560-AH97 Highly Erodible Land and Wetland Conservation AGENCY: Office of the... (HELC) or wetland conservation (WC) provisions to retain eligibility for USDA program benefits if... persons who failed to apply a conservation system on highly erodible land, or who converted wetlands...

  1. Charge conservation effects for high order fluctuations

    CERN Document Server

    Begun, Viktor

    2016-01-01

    The exact charge conservation significantly impacts multiplicity fluctuations. The result depends strongly on the part of the system charge carried by the particles of interest. Along with the expected suppression of fluctuations for large systems, charge conservation may lead to negative skewness or kurtosis for small systems.

  2. High performance word level sequential and parallel coding methods and architectures for bit plane coding

    Institute of Scientific and Technical Information of China (English)

    XIONG ChengYi; TIAN JinWen; LIU Jian

    2008-01-01

This paper introduces a novel high-performance algorithm and VLSI architectures for bit plane coding (BPC) in word-level sequential and parallel modes. The proposed BPC algorithm adopts coding-pass prediction together with parallel and pipelined processing to reduce the number of memory accesses and to increase the system's concurrency, so that all the coefficient bits of a code block can be coded in a single scan. A new parallel bit plane architecture (PA) is proposed to achieve word-level sequential coding, and an efficient high-speed architecture (HA) is presented to achieve multi-word parallel coding. Compared to the state of the art, the proposed PA reduces hardware cost while retaining a throughput of one coefficient coded per clock. The proposed HA can code the 4 coefficients of a stripe column in one intra-clock cycle, so that coding an N×N code-block completes in approximately N^2/4 intra-clock cycles. Theoretical analysis and experimental results demonstrate that the proposed designs achieve a high throughput rate with a good speedup-to-cost ratio, making them good alternatives for low-power applications.
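The throughput comparison above reduces to simple cycle arithmetic; a sketch, where the 64x64 code-block size is an illustrative JPEG2000-style choice rather than a figure from the paper:

```python
def pa_cycles(n):
    """Sequential architecture (PA): one coefficient coded per clock."""
    return n * n

def ha_cycles(n):
    """Parallel architecture (HA): a 4-coefficient stripe column coded
    per intra-clock cycle, giving roughly n*n/4 cycles per scan."""
    return n * n // 4

n = 64
print(pa_cycles(n), ha_cycles(n), pa_cycles(n) // ha_cycles(n))
# 4096 1024 4  -> a 4x per-scan speedup for the parallel architecture
```

This per-scan count is the benefit of the single-scan BPC algorithm: a conventional three-pass bit-plane coder would traverse the block up to three times per bit plane instead of once.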

  3. Building Energy Efficiency in India: Compliance Evaluation of Energy Conservation Building Code

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Sha; Evans, Meredydd; Delgado, Alison

    2014-03-26

India is experiencing an unprecedented construction boom. The country doubled its floorspace between 2001 and 2005 and is expected to add 35 billion m2 of new buildings by 2050. Buildings account for 35% of total final energy consumption in India today, and building energy use is growing at 8% annually. Studies have shown that carbon policies will have little effect on reducing building energy demand. Chaturvedi et al. predicted that, if there are no sector-specific policies to curb building energy use, final energy demand of the Indian building sector will grow over five times by the end of this century, driven by rapid income and population growth. The growing energy demand in buildings is accompanied by a transition from traditional biomass to commercial fuels, particularly an increase in electricity use. This also leads to a rapid increase in carbon emissions and aggravates power shortages in India. Growth in building energy use poses challenges to the Indian government. To curb energy consumption in buildings, the Indian government issued the Energy Conservation Building Code (ECBC) in 2007, which applies to commercial buildings with a connected load of 100 kW or 120 kVA. It is predicted that the implementation of ECBC can help save 25-40% of energy, compared to reference buildings without energy-efficiency measures. However, the impact of ECBC depends on the effectiveness of its enforcement and compliance. Currently, the majority of buildings in India are not ECBC-compliant. The United Nations Development Programme projected that code compliance in India would reach 35% by 2015 and 64% by 2017. Whether the projected targets can be achieved depends on how the code enforcement system is designed and implemented. Although the development of ECBC lies in the hands of the national government, the Bureau of Energy Efficiency under the Ministry of Power, the adoption and implementation of ECBC largely relies on state and local governments. Six years after ECBC

  4. The conservation and application of three hypothetical protein coding gene for direct detection of Mycobacterium tuberculosis in sputum specimens.

    Directory of Open Access Journals (Sweden)

    Lianhua Qin

Full Text Available BACKGROUND: Accurate and early diagnosis of tuberculosis (TB) is of major importance in the control of TB. One of the most important technical advances in the diagnosis of tuberculosis is the development of nucleic acid amplification (NAA) tests. However, the choice of the target sequence remains controversial in NAA tests. Recently, interesting alternatives have been found in hypothetical protein coding sequences from the mycobacterial genome. METHODOLOGY/PRINCIPAL FINDINGS: To obtain a rational biomarker for TB diagnosis, the conservation of three hypothetical genes was first evaluated in 714 mycobacterial strains. The results showed that SCAR1 (Sequenced Characterized Amplified Region), based on the Rv0264c coding gene, showed the highest conservation (99.8%), and SCAR2, based on the Rv1508c gene, the second-highest conservation (99.7%) in M. tuberculosis (MTB) strains. SCAR3, based on the Rv2135c gene (3.2%), and IS6110 (8%) showed relatively high deletion rates in MTB strains. Secondly, the three SCAR markers were evaluated in 307 clinical sputum samples from patients in whom TB was suspected or patients with diseases other than TB. Amplification of the IS6110 and 16S rRNA sequences, together with both clinical and bacteriological identification, served as the protocol to evaluate the efficacy of the SCAR markers. The sensitivities, specificities, positive predictive values (PPV) and negative predictive values (NPV) of all NAA tests were higher than those of bacteriological detection. In the four NAA tests, IS6110 and SCAR3 showed the highest PPV (100%) and low NPV (70% and 68.8%, respectively), while SCAR1 and SCAR2 showed relatively high PPV and NPV (97% and 82.6%, 95.6% and 88.8%, respectively). CONCLUSIONS/SIGNIFICANCE: Our results indicate that SCAR1 and SCAR2, with a high degree of sequence conservation, represent efficient and promising alternative NAA test targets for identification of MTB. Moreover, the targets developed from this study may provide more alternative targets for the

  5. The Chlamydophila felis plasmid is highly conserved.

    Science.gov (United States)

    Harley, Ross; Day, Sarinder; Di Rocco, Camillo; Helps, Chris

    2010-11-20

    The presence of a plasmid in the Chlamydiaceae is both species and strain specific. Knowledge of the prevalence of the plasmid in different Chlamydia species is important for future studies aiming to investigate the role of the plasmid in chlamydial biology and disease. Although strains of Chlamydophila felis with or without the plasmid have been identified, only a small number of laboratory-adapted strains have been analysed and the prevalence of the plasmid in field isolates has not been determined. This study aimed to determine the prevalence of the plasmid in C. felis-positive conjunctival and oropharyngeal clinical samples submitted for routine diagnosis of C. felis by real-time (Q)PCR. DNA extracts from four laboratory-adapted strains were also analysed. QPCR assays targeting regions of C. felis plasmid genes pCF01, pCF02 and pCF03 were developed for the detection of plasmid DNA. QPCR analysis of DNA extracts from C. felis-positive clinical samples found evidence of plasmid DNA in 591 of 595 samples representing 561 of 564 (99.5%) clinical cases. Plasmid DNA was also detected by QPCR in laboratory-adapted strains 1497V, K2487 and K2490, but not strain 905. We conclude that the plasmid is highly conserved in C. felis, and plasmid-deficient strains represent a rare but important population for future studies of chlamydial plasmid function.

  6. Readings in Wildlife and Fish Conservation, High School Conservation Curriculum Project.

    Science.gov (United States)

    Ensminger, Jack

This publication is a tentative edition of readings on Wildlife and Fish Conservation in Louisiana, and as such it forms part of one of the four units of study designed for an experimental high school course, the "High School Conservation Curriculum Project." The other three units are concerned with Forest Conservation, Soil and Water…

  7. Current status of high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

Takada, Hiroshi; Sasa, Toshinobu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    1998-03-01

The current status of the accelerator design code (NMTC/JAERI), an outline of its physical model, and an evaluation of its accuracy are reported. To evaluate the nuclear performance of an accelerator and an intense spallation neutron source, the nuclear reactions between high-energy protons and target nuclides, and the behavior of the various produced particles, must be modeled. The nuclear design of the spallation neutron system used a calculation code system coupling the high-energy nucleon-meson transport code with the neutron-photon transport code. NMTC/JAERI describes the intranuclear cascade followed by the particle evaporation process, taking into account competition with fission. Particle transport calculations are carried out for protons, neutrons, pi-mesons and mu-mesons. To verify and improve the accuracy of the high-energy nucleon-meson transport code, data on spallation products and spallation neutron fragments from integral experiments were collected. (S.Y.)

  8. A conserved influenza A virus nucleoprotein code controls specific viral genome packaging

    Science.gov (United States)

    Moreira, Étori Aguiar; Weber, Anna; Bolte, Hardin; Kolesnikova, Larissa; Giese, Sebastian; Lakdawala, Seema; Beer, Martin; Zimmer, Gert; García-Sastre, Adolfo; Schwemmle, Martin; Juozapaitis, Mindaugas

    2016-01-01

Packaging of the eight genomic RNA segments of influenza A viruses (IAV) into viral particles is coordinated by segment-specific packaging sequences. How the packaging signals regulate the specific incorporation of each RNA segment into virions and whether other viral or host factors are involved in this process is unknown. Here, we show that distinct amino acids of the viral nucleoprotein (NP) are required for packaging of specific RNA segments. This was determined by studying the NP of a bat influenza A-like virus, HL17NL10, in the context of a conventional IAV (SC35M). Replacement of conserved SC35M NP residues by those of HL17NL10 NP resulted in RNA-packaging-defective IAV. Surprisingly, substitution of these conserved SC35M amino acids with HL17NL10 NP residues led to IAV with altered packaging efficiencies for specific subsets of RNA segments. This suggests that NP harbours an amino acid code that dictates genome packaging into infectious virions. PMID:27650413

  9. High-fidelity coding with correlated neurons.

    Directory of Open Access Journals (Sweden)

    Rava Azeredo da Silveira

    2014-11-01

    Full Text Available Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded--the capacity--can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a 'lock-in' of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it.

  10. Conserved syntenic clusters of protein coding genes are missing in birds

    OpenAIRE

    Lovell, Peter V.; Wirthlin, Morgan; Wilhelm, Larry; Minx, Patrick; Lazar, Nathan H.; Carbone, Lucia; Warren, Wesley C.; Mello, Claudio V.

    2014-01-01

    Background Birds are one of the most highly successful and diverse groups of vertebrates, having evolved a number of distinct characteristics, including feathers and wings, a sturdy lightweight skeleton and unique respiratory and urinary/excretion systems. However, the genetic basis of these traits is poorly understood. Results Using comparative genomics based on extensive searches of 60 avian genomes, we have found that birds lack approximately 274 protein coding genes that are present in th...

  11. Beyond the High Point Code in Testing Holland's Theory

    Science.gov (United States)

    Andrews, Hans A.

    1975-01-01

    This study was designed to test and expand Holland's vocational development theory by utilizing more than a single high point code in classification of personality patterns of jobs. A more "refined" and/or "subtle" difference was shown in the personality-job relationships when two high point codes were used. (Author)

  12. Translation Initiation from Conserved Non-AUG Codons Provides Additional Layers of Regulation and Coding Capacity

    Directory of Open Access Journals (Sweden)

    Ivaylo P. Ivanov

    2017-06-01

    Full Text Available Neurospora crassa cpc-1 and Saccharomyces cerevisiae GCN4 are homologs specifying transcription activators that drive the transcriptional response to amino acid limitation. The cpc-1 mRNA contains two upstream open reading frames (uORFs) in its >700-nucleotide (nt) 5′ leader, and its expression is controlled at the level of translation in response to amino acid starvation. We used N. crassa cell extracts and obtained data indicating that cpc-1 uORF1 and uORF2 are functionally analogous to GCN4 uORF1 and uORF4, respectively, in controlling translation. We also found that the 5′ region upstream of the main coding sequence of the cpc-1 mRNA extends for more than 700 nucleotides without any in-frame stop codon. For 100 cpc-1 homologs from Pezizomycotina and from selected Basidiomycota, 5′ conserved extensions of the CPC1 reading frame are also observed. Multiple non-AUG near-cognate codons (NCCs) in the CPC1 reading frame upstream of uORF2, some deeply conserved, could potentially initiate translation. At least four NCCs initiated translation in vitro. In vivo data were consistent with initiation at NCCs to produce N-terminally extended N. crassa CPC1 isoforms. The pivotal role played by CPC1, combined with its translational regulation by uORFs and NCC utilization, underscores the emerging significance of noncanonical initiation events in controlling gene expression.
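    A scan for candidate near-cognate initiation codons of the kind described above takes only a few lines. The helper below is a hypothetical illustration (one mismatch from ATG, in frame with and upstream of the annotated start), not the authors' analysis pipeline:

```python
def near_cognate_starts(seq, main_start):
    """Return (position, codon) pairs for near-cognate start codons (NCCs):
    codons differing from ATG at exactly one position, located upstream of
    and in frame with the annotated ATG at index main_start."""
    hits = []
    for pos in range(main_start % 3, main_start, 3):
        codon = seq[pos:pos + 3]
        if len(codon) == 3 and codon != "ATG":
            if sum(a != b for a, b in zip(codon, "ATG")) == 1:
                hits.append((pos, codon))
    return hits
```

Under this one-mismatch definition, CTG, GTG, ACG, and ATC all qualify, matching the commonly studied near-cognate set.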

  13. Victims of conservation or rights as forest dwellers: Van Gujjar pastoralists between contesting codes of law

    Directory of Open Access Journals (Sweden)

    Gooch Pernille

    2009-01-01

    Full Text Available The Van (forest) Gujjars, surviving as forest pastoralists in the central part of the Indian Himalaya, are a people who, due to their nomadic lifestyle, have since colonial rule found themselves at the margin of Indian society. This paper looks at the relationship between the Van Gujjars and their forest base in a historical perspective, from colonial rule to 'conservation of nature' and the 'rights of forest dwellers', and further discusses how changing codes and rules of power affect the society-citizen-nature/forest relationship for the community. We look back into history and see how a system of strict control and regulation of Van Gujjars as nomadic pastoralists without a fixed address, initiated during colonial times, was continued by the national state of India after independence. We further discuss how a history of unequal treatment and marginalisation of Van Gujjar pastoralists has continued into the present. What is manifest here is 'the forest' as a contested space: a site of power struggles, where forest dwellers are threatened with displacement in order to provide space, first for modern forestry and revenue-producing land, and later for conservation of nature. The paper further looks at the latest developments, in which the Van Gujjars have now obtained domicile rights such as voters' rights and have been linked with Government services for education and health. It finishes by discussing the new possibilities and hopes for the community provided by The Scheduled Tribes and Other Traditional Forest Dwellers (Recognition of Forest Rights) Act.

  14. RAISHIN: A High-Resolution Three-Dimensional General Relativistic Magnetohydrodynamics Code

    CERN Document Server

    Mizuno, Y; Koide, S; Hardee, P; Fishman, G J; Mizuno, Yosuke; Nishikawa, Ken-Ichi; Koide, Shinji; Hardee, Philip; Fishman, Gerald J.

    2006-01-01

    We have developed a new three-dimensional general relativistic magnetohydrodynamic (GRMHD) code, RAISHIN, using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the Harten, Lax, & van Leer (HLL) approximate Riemann solver scheme. The flux-interpolated, constrained transport scheme is used to maintain a divergence-free magnetic field. In order to examine the numerical accuracy and the numerical efficiency, the code uses four different reconstruction methods: piecewise linear methods with Minmod and MC slope-limiter functions, the convex essentially non-oscillatory (CENO) method, and the piecewise parabolic method (PPM), using multistep TVD Runge-Kutta time advance methods with second- and third-order time accuracy. We describe code performance on an extensive set of test problems in both special and general relativity. Our new GRMHD code has proven to be second-order accurate and has successfully passed all tests performed, including highly relativistic and magnetized cases in both special and general relativity.

  15. RAISHIN: A High-Resolution Three-Dimensional General Relativistic Magnetohydrodynamics Code

    Science.gov (United States)

    Mizuno, Yosuke; Nishikawa, Ken-Ichi; Koide, Shinji; Hardee, Philip; Fishman, Gerald J.

    2006-01-01

    We have developed a new three-dimensional general relativistic magnetohydrodynamic (GRMHD) code, RAISHIN, using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the Harten, Lax, & van Leer (HLL) approximate Riemann solver scheme. The flux-interpolated, constrained transport scheme is used to maintain a divergence-free magnetic field. In order to examine the numerical accuracy and the numerical efficiency, the code uses four different reconstruction methods: piecewise linear methods with Minmod and MC slope-limiter functions, the convex essentially non-oscillatory (CENO) method, and the piecewise parabolic method (PPM), using multistep TVD Runge-Kutta time advance methods with second- and third-order time accuracy. We describe code performance on an extensive set of test problems in both special and general relativity. Our new GRMHD code has proven to be second-order accurate and has successfully passed all tests performed, including highly relativistic and magnetized cases in both special and general relativity.
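    Two of the ingredients named above, the minmod slope limiter and the HLL approximate Riemann flux, reduce to a few lines for a 1-D scalar conservation law. This is a toy sketch of the same building blocks, not RAISHIN's GRMHD implementation:

```python
def minmod(a, b):
    """Minmod slope limiter: the smaller-magnitude slope if the two
    one-sided slopes agree in sign, zero otherwise (at extrema)."""
    if a * b <= 0:
        return 0.0
    return a if abs(a) < abs(b) else b

def hll_flux(uL, uR, flux, sL, sR):
    """Harten-Lax-van Leer interface flux for a scalar conservation law,
    given left/right states and wave-speed bounds sL <= sR."""
    if sL >= 0:                      # all waves move right: upwind left state
        return flux(uL)
    if sR <= 0:                      # all waves move left: upwind right state
        return flux(uR)
    return (sR * flux(uL) - sL * flux(uR) + sL * sR * (uR - uL)) / (sR - sL)
```

For Burgers' flux f(u) = u**2/2 with a right-moving shock (uL=2, uR=0), the HLL flux reduces to the left-state flux, as expected for a fully supersonic interface.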

  16. Hierarchical and High-Girth QC LDPC Codes

    CERN Document Server

    Wang, Yige; Yedidia, Jonathan S

    2011-01-01

    We present a general approach to designing capacity-approaching high-girth low-density parity-check (LDPC) codes that are friendly to hardware implementation. Our methodology starts by defining a new class of "hierarchical" quasi-cyclic (HQC) LDPC codes that generalizes the structure of quasi-cyclic (QC) LDPC codes. Whereas the parity check matrices of QC LDPC codes are composed of circulant sub-matrices, those of HQC LDPC codes are composed of a hierarchy of circulant sub-matrices that are in turn constructed from circulant sub-matrices, and so on, through some number of levels. We show how to map any class of codes defined using a protograph into a family of HQC LDPC codes. Next, we present a girth-maximizing algorithm that optimizes the degrees of freedom within the family of codes to yield a high-girth HQC LDPC code. Finally, we discuss how certain characteristics of a code protograph will lead to inevitable short cycles, and show that these short cycles can be eliminated using a "squashing" procedure tha...
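    The quasi-cyclic building block is easy to make concrete. The sketch below expands a base matrix of circulant shift values into a parity-check matrix (the plain single-level QC construction; the paper's hierarchical codes apply this expansion recursively through several levels). Function names are illustrative:

```python
import numpy as np

def circulant_permutation(p, shift):
    """p x p identity matrix cyclically shifted by `shift` columns;
    a negative shift denotes the all-zero block, as is conventional."""
    if shift < 0:
        return np.zeros((p, p), dtype=int)
    return np.roll(np.eye(p, dtype=int), shift, axis=1)

def qc_ldpc_parity_matrix(shifts, p):
    """Expand a base matrix of shift values into the full QC LDPC
    parity-check matrix built from circulant sub-matrices."""
    return np.block([[circulant_permutation(p, s) for s in row]
                     for row in shifts])
```

Row and column weights of the expanded matrix follow directly from how many non-negative shifts appear in each block row and block column.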

  17. Consistent levels of A-to-I RNA editing across individuals in coding sequences and non-conserved Alu repeats

    Directory of Open Access Journals (Sweden)

    Osenberg Sivan

    2010-10-01

    Full Text Available Abstract Background Adenosine-to-inosine (A-to-I) RNA editing is an essential post-transcriptional mechanism that occurs at numerous sites in the human transcriptome, mainly within Alu repeats. It has been shown to have consistent editing levels across individuals in a few targets in the human brain, and to be altered in several human pathologies. However, the variability of editing levels across human individuals in other tissues has not been studied so far. Results Here, we analyzed 32 skin samples, looking at A-to-I editing levels in three genes within coding sequences and in the Alu repeats of six different genes. We observed highly consistent editing levels across different individuals as well as across tissues, not only in coding targets but, surprisingly, also in the non-evolutionarily conserved Alu repeats. Conclusions Our findings suggest that A-to-I RNA editing of Alu elements is a tightly regulated process and, as such, might have been recruited in the course of primate evolution for post-transcriptional regulatory mechanisms.
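    The measured quantity takes one line to compute: inosine is read as guanosine by the sequencer, so the per-site editing level is the G fraction among A+G reads. The helper names and the crude consistency check below are illustrative, not the study's statistics:

```python
def editing_level(a_count, g_count):
    """A-to-I editing level at one site: fraction of G reads among
    A+G reads (inosine is interpreted as G during sequencing)."""
    total = a_count + g_count
    if total == 0:
        raise ValueError("no informative reads at this site")
    return g_count / total

def consistent_across_samples(levels, tolerance=0.05):
    """Crude consistency check: every sample's editing level lies
    within `tolerance` of the across-sample mean."""
    mean = sum(levels) / len(levels)
    return all(abs(x - mean) <= tolerance for x in levels)
```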

  18. Semi-implicit scheme for treating radiation under M1 closure in general relativistic conservative fluid dynamics codes

    CERN Document Server

    Sądowski, Aleksander; Tchekhovskoy, Alexander; Zhu, Yucong

    2012-01-01

    A numerical scheme is described for including radiation in multi-dimensional general-relativistic conservative fluid dynamics codes. In this method, a covariant form of the M1 closure scheme is used to close the radiation moments, and the radiative source terms are treated semi-implicitly in order to handle both optically thin and optically thick regimes. The scheme has been implemented in a conservative general relativistic radiation hydrodynamics code KORAL. The robustness of the code is demonstrated on a number of test problems, including radiative relativistic shock tubes, static radiation pressure supported atmosphere, shadows, beams of light in curved spacetime, and radiative Bondi accretion. The advantages of M1 closure relative to other approaches such as Eddington closure and flux-limited diffusion are discussed, and its limitations are also highlighted.

  19. Sigma: multiple alignment of weakly-conserved non-coding DNA sequence

    Directory of Open Access Journals (Sweden)

    Siddharthan Rahul

    2006-03-01

    Full Text Available Abstract Background Existing tools for multiple-sequence alignment focus on aligning protein sequence or protein-coding DNA sequence, and are often based on extensions to Needleman-Wunsch-like pairwise alignment methods. We introduce a new tool, Sigma, with a new algorithm and scoring scheme designed specifically for non-coding DNA sequence. This problem acquires importance with the increasing number of published sequences of closely-related species. In particular, studies of gene regulation seek to take advantage of comparative genomics, and recent algorithms for finding regulatory sites in phylogenetically-related intergenic sequence require alignment as a preprocessing step. Much can also be learned about evolution from intergenic DNA, which tends to evolve faster than coding DNA. Sigma uses a strategy of seeking the best possible gapless local alignments (a strategy earlier used by DiAlign), at each step making the best possible alignment consistent with existing alignments, and scores the significance of the alignment based on the lengths of the aligned fragments and a background model, which may be supplied or estimated from an auxiliary file of intergenic DNA. Results Comparative tests of Sigma with five earlier algorithms on synthetic data generated to mimic real data show excellent performance, with Sigma balancing high "sensitivity" (more bases aligned) with effective filtering of "incorrect" alignments. With real data, while "correctness" can't be directly quantified for the alignment, running the PhyloGibbs motif finder on pre-aligned sequence suggests that Sigma's alignments are superior. Conclusion By taking into account the peculiarities of non-coding DNA, Sigma fills a gap in the toolbox of bioinformatics.
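    The core operation, the best gapless local alignment, can be sketched as a Smith-Waterman-style recurrence restricted to the diagonal move, so insertions and deletions are disallowed. This toy version omits Sigma's fragment-length significance score and background model:

```python
def best_gapless_alignment(s, t, match=1, mismatch=-1):
    """Best-scoring gapless local alignment of s and t: the highest-scoring
    run along any diagonal, resetting to zero when the score goes negative.
    Returns (score, (i, j)) with 1-based end positions in s and t."""
    best_score, best_end = 0, (0, 0)
    prev = [0] * (len(t) + 1)          # scores for the previous row of s
    for i in range(1, len(s) + 1):
        cur = [0] * (len(t) + 1)
        for j in range(1, len(t) + 1):
            step = match if s[i - 1] == t[j - 1] else mismatch
            cur[j] = max(0, prev[j - 1] + step)   # diagonal move only: no gaps
            if cur[j] > best_score:
                best_score, best_end = cur[j], (i, j)
        prev = cur
    return best_score, best_end
```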

  20. Introduction to the High-Efficiency Video Coding Standard

    Institute of Scientific and Technical Information of China (English)

    Ping Wu; Mina Li

    2012-01-01

    The high-efficiency video coding (HEVC) standard is the newest video coding standard, currently under joint development by the ITU-T Video Coding Experts Group (VCEG) and the ISO/IEC Moving Picture Experts Group (MPEG). HEVC is the next-generation video coding standard after H.264/AVC. The goals of the HEVC standardization effort are to double the video coding efficiency of the existing H.264/AVC while supporting all the recognized potential applications, such as video telephony, storage, broadcast, and streaming, especially for large picture sizes (4k x 2k). The HEVC standard will be completed as an ISO/IEC and ITU-T standard in January 2013. In February 2012, the HEVC standardization process reached its committee draft (CD) stage. The ever-improving HEVC standard has demonstrated a significant gain in rate-distortion efficiency relative to the existing H.264/AVC. This paper provides an overview of the technical features of HEVC close to the HEVC CD stage, covering high-level structure, coding units, prediction units, transform units, spatial signal transformation and PCM representation, intra-picture prediction, inter-picture prediction, entropy coding, and in-loop filtering. The coding efficiency performance of HEVC compared with H.264/AVC is also provided.

  1. High order and conservative method for patched grid interfaces

    OpenAIRE

    Maugars, B.; Michel, B.; Cinnella, P.

    2014-01-01

    International audience; A high-order and conservative method is developed for the numerical treatment of interface conditions in patched grids, based on the use of a fictitious grid methodology. The proposed approach is compared with a non-conservative interpolation of the state variables from the neighbouring domain for selected internal flow problems.

  2. High-fidelity plasma codes for burn physics

    Energy Technology Data Exchange (ETDEWEB)

    Cooley, James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Graziani, Frank [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Marinak, Marty [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Murillo, Michael [Michigan State Univ., East Lansing, MI (United States)

    2016-10-19

    Accurate predictions of equation of state (EOS), ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes, are a relatively recent computational tool that augments both experimental data and theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they play in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  3. High-fidelity plasma codes for burn physics

    Energy Technology Data Exchange (ETDEWEB)

    Cooley, James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Graziani, Frank [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Marinak, Marty [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Murillo, Michael [Michigan State Univ., East Lansing, MI (United States)

    2016-10-19

    Accurate predictions of equation of state (EOS), ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes, are a relatively recent computational tool that augments both experimental data and theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they play in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  4. A conservative orbital advection scheme for simulations of magnetized shear flows with the PLUTO code

    Science.gov (United States)

    Mignone, A.; Flock, M.; Stute, M.; Kolb, S. M.; Muscianisi, G.

    2012-09-01

    Context. Explicit numerical computations of hypersonic or super-fast differentially rotating disks are subject to the time-step constraint imposed by the Courant condition, according to which waves cannot travel more than a fraction of a cell during a single time-step update. When the bulk orbital velocity largely exceeds any other wave speed (e.g., sound or Alfvén), as computed in the rest frame, the time step is considerably reduced and an unusually large number of steps may be necessary to complete the computation. Aims: We present a robust numerical scheme to overcome the Courant limitation by improving and extending the algorithm previously known as FARGO (fast advection in rotating gaseous objects) to the equations of magnetohydrodynamics (MHD) using a more general formalism. The proposed scheme conserves total angular momentum and energy to machine precision and works in Cartesian, cylindrical, or spherical coordinates. The algorithm has been implemented in the next release of the PLUTO code for astrophysical gasdynamics and is suitable for local or global simulations of accretion or proto-planetary disk models. Methods: By decomposing the total velocity into an average azimuthal contribution and a residual term, the algorithm approaches the solution of the MHD equations through two separate steps corresponding to a linear transport operator in the direction of orbital motion and a standard nonlinear solver applied to the MHD equations written in terms of the residual velocity. Since the former step is not subject to any stability restriction, the Courant condition is computed only in terms of the residual velocity, leading to substantially larger time steps. The magnetic field is advanced in time using the constrained transport method in order to fulfill the divergence-free condition. Furthermore, conservation of total energy and angular momentum is enforced at the discrete level by properly expressing the source terms in terms of upwind Godunov fluxes.
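    The payoff of the decomposition can be sketched numerically: once the mean orbital velocity is subtracted, only the residual velocity (plus wave speeds) enters the Courant condition. A hypothetical 1-D illustration, not PLUTO's implementation:

```python
def courant_dt(velocities, wave_speed, dx, cfl=0.5):
    """Courant-limited time step on a uniform 1-D grid: no signal may
    cross more than cfl cells per step."""
    vmax = max(abs(v) + wave_speed for v in velocities)
    return cfl * dx / vmax

def fargo_dt(velocities, wave_speed, dx, cfl=0.5):
    """FARGO-type step: the mean (orbital) velocity is handled by a
    separate unconditionally stable transport step, so only the residual
    velocity enters the Courant condition."""
    mean = sum(velocities) / len(velocities)
    residual = [v - mean for v in velocities]
    return courant_dt(residual, wave_speed, dx, cfl)
```

With a bulk speed of ~100 sound speeds and order-unity residuals, the allowed time step grows by roughly the same factor of ~100.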

  5. Building Code Compliance and Enforcement: The Experience of San Francisco's Residential Energy Conservation Ordinance and California's Building Standards for New Construction

    Energy Technology Data Exchange (ETDEWEB)

    Vine, E.

    1990-11-01

    As part of Lawrence Berkeley Laboratory's (LBL) technical assistance to the Sustainable City Project, compliance and enforcement activities related to local and state building codes for existing and new construction were evaluated in two case studies. The analysis of the City of San Francisco's Residential Energy Conservation Ordinance (RECO) showed that a limited, prescriptive energy conservation ordinance for existing residential construction can be enforced relatively easily with little administrative costs, and that compliance with such ordinances can be quite high. Compliance with the code was facilitated by extensive publicity, an informed public concerned with the cost of energy and knowledgeable about energy efficiency, the threat of punishment (Order of Abatement), the use of private inspectors, and training workshops for City and private inspectors. The analysis of California's Title 24 Standards for new residential and commercial construction showed that enforcement of this type of code for many climate zones is more complex and requires extensive administrative support for education and training of inspectors, architects, engineers, and builders. Under this code, prescriptive and performance approaches for compliance are permitted, resulting in the demand for alternative methods of enforcement: technical assistance, plan review, field inspection, and computer analysis. In contrast to existing construction, building design and new materials and construction practices are of critical importance in new construction, creating a need for extensive technical assistance and extensive interaction between enforcement personnel and the building community. Compliance problems associated with building design and installation did occur in both residential and nonresidential buildings. Because statewide codes are enforced by local officials, these problems may increase over time as energy standards change and become more complex and as other standards

  6. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
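    The sequential selection step can be sketched with a Bayesian linear surrogate: each new high-fidelity run is placed where the current posterior leaves the prediction most uncertain, a variance-based stand-in for the information-theoretic criterion of the paper. The model, names, and hyperparameters below are illustrative assumptions:

```python
import numpy as np

def posterior(X, y, noise_var=0.1, prior_var=10.0):
    """Posterior mean and covariance of the weights of a Bayesian linear
    model y = X @ w + noise, with an isotropic Gaussian prior on w."""
    d = X.shape[1]
    precision = np.eye(d) / prior_var + X.T @ X / noise_var
    cov = np.linalg.inv(precision)
    mean = cov @ X.T @ y / noise_var
    return mean, cov

def next_design_point(candidates, X, y, noise_var=0.1):
    """Pick the candidate design with the largest posterior-predictive
    variance: the point whose high-fidelity evaluation is expected to be
    most informative about the surrogate's parameters."""
    _, cov = posterior(X, y, noise_var)
    variances = [x @ cov @ x for x in candidates]
    return int(np.argmax(variances))
```

With two observations clustered near t = 0, the far-away candidate is selected, mirroring the sequential design loop: evaluate the high-fidelity code there, fold the result into X and y, and repeat.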

  7. Comprehensive analysis of long non-coding RNAs highlights their spatio-temporal expression patterns and evolutional conservation in Sus scrofa

    Science.gov (United States)

    Tang, Zhonglin; Wu, Yang; Yang, Yalan; Yang, Yu-Cheng T.; Wang, Zishuai; Yuan, Jiapei; Yang, Yang; Hua, Chaoju; Fan, Xinhao; Niu, Guanglin; Zhang, Yubo; Lu, Zhi John; Li, Kui

    2017-01-01

    Despite modest sequence conservation and rapid evolution, long non-coding RNAs (lncRNAs) appear to be conserved in expression pattern and function. However, analysis of lncRNAs across tissues and developmental stages remains largely uncharacterized in mammals. Here, we systematically investigated the lncRNAs of the Guizhou miniature pig (Sus scrofa), which is widely used as a biomedical model. We performed RNA sequencing across 9 organs and 3 developmental stages of skeletal muscle, and developed a filtering pipeline to identify 10,813 lncRNAs (9,075 novel). Analysis of conservation patterns revealed that 57% of pig lncRNAs showed homology to humans and mice based on genome alignment. 5,455 lncRNAs exhibited typical hallmarks of regulatory molecules, such as high spatio-temporal specificity. Notably, conserved lncRNAs exhibited higher tissue specificity than pig-specific lncRNAs and were significantly enriched in testis and ovary. Weighted co-expression network analysis revealed a set of conserved lncRNAs that are likely involved in postnatal muscle development. Based on the high degree of similarity in the structure, organization, and dynamic expression of pig lncRNAs compared with human and mouse lncRNAs, we propose that these lncRNAs play an important role in organ physiology and development in mammals. Our results provide a resource for studying animal evolution, morphological complexity, breeding, and biomedical research. PMID:28233874

  8. Comprehensive analysis of long non-coding RNAs highlights their spatio-temporal expression patterns and evolutional conservation in Sus scrofa.

    Science.gov (United States)

    Tang, Zhonglin; Wu, Yang; Yang, Yalan; Yang, Yu-Cheng T; Wang, Zishuai; Yuan, Jiapei; Yang, Yang; Hua, Chaoju; Fan, Xinhao; Niu, Guanglin; Zhang, Yubo; Lu, Zhi John; Li, Kui

    2017-02-24

    Despite modest sequence conservation and rapid evolution, long non-coding RNAs (lncRNAs) appear to be conserved in expression pattern and function. However, analysis of lncRNAs across tissues and developmental stages remains largely uncharacterized in mammals. Here, we systematically investigated the lncRNAs of the Guizhou miniature pig (Sus scrofa), which is widely used as a biomedical model. We performed RNA sequencing across 9 organs and 3 developmental stages of skeletal muscle, and developed a filtering pipeline to identify 10,813 lncRNAs (9,075 novel). Analysis of conservation patterns revealed that 57% of pig lncRNAs showed homology to humans and mice based on genome alignment. 5,455 lncRNAs exhibited typical hallmarks of regulatory molecules, such as high spatio-temporal specificity. Notably, conserved lncRNAs exhibited higher tissue specificity than pig-specific lncRNAs and were significantly enriched in testis and ovary. Weighted co-expression network analysis revealed a set of conserved lncRNAs that are likely involved in postnatal muscle development. Based on the high degree of similarity in the structure, organization, and dynamic expression of pig lncRNAs compared with human and mouse lncRNAs, we propose that these lncRNAs play an important role in organ physiology and development in mammals. Our results provide a resource for studying animal evolution, morphological complexity, breeding, and biomedical research.
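    Tissue specificity of the kind reported above is commonly quantified with the tau index (Yanai et al.); whether this exact metric was used in the study is an assumption, but it illustrates the computation:

```python
def tau(expression):
    """Tissue-specificity index tau over per-tissue expression values:
    0 for a uniformly expressed (housekeeping-like) gene, 1 for a gene
    expressed in a single tissue."""
    m = max(expression)
    if m == 0:
        return 0.0
    n = len(expression)
    return sum(1.0 - x / m for x in expression) / (n - 1)
```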

  9. A new relativistic hydrodynamics code for high-energy heavy-ion collisions

    Science.gov (United States)

    Okamoto, Kazuhisa; Akamatsu, Yukinao; Nonaka, Chiho

    2016-10-01

    We construct a new Godunov type relativistic hydrodynamics code in Milne coordinates, using a Riemann solver based on the two-shock approximation which is stable under the existence of large shock waves. We check the correctness of the numerical algorithm by comparing numerical calculations and analytical solutions in various problems, such as shock tubes, expansion of matter into the vacuum, the Landau-Khalatnikov solution, and propagation of fluctuations around Bjorken flow and Gubser flow. We investigate the energy and momentum conservation property of our code in a test problem of longitudinal hydrodynamic expansion with an initial condition for high-energy heavy-ion collisions. We also discuss numerical viscosity in the test problems of expansion of matter into the vacuum and conservation properties. Furthermore, we discuss how the numerical stability is affected by the source terms of relativistic numerical hydrodynamics in Milne coordinates.

  10. A new relativistic hydrodynamics code for high-energy heavy-ion collisions

    Energy Technology Data Exchange (ETDEWEB)

    Okamoto, Kazuhisa [Nagoya University, Department of Physics, Nagoya (Japan); Akamatsu, Yukinao [Nagoya University, Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI), Nagoya (Japan); Osaka University, Department of Physics, Toyonaka (Japan); Stony Brook University, Department of Physics and Astronomy, Stony Brook, NY (United States); Nonaka, Chiho [Nagoya University, Department of Physics, Nagoya (Japan); Nagoya University, Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI), Nagoya (Japan); Duke University, Department of Physics, Durham, NC (United States)

    2016-10-15

    We construct a new Godunov type relativistic hydrodynamics code in Milne coordinates, using a Riemann solver based on the two-shock approximation which is stable under the existence of large shock waves. We check the correctness of the numerical algorithm by comparing numerical calculations and analytical solutions in various problems, such as shock tubes, expansion of matter into the vacuum, the Landau-Khalatnikov solution, and propagation of fluctuations around Bjorken flow and Gubser flow. We investigate the energy and momentum conservation property of our code in a test problem of longitudinal hydrodynamic expansion with an initial condition for high-energy heavy-ion collisions. We also discuss numerical viscosity in the test problems of expansion of matter into the vacuum and conservation properties. Furthermore, we discuss how the numerical stability is affected by the source terms of relativistic numerical hydrodynamics in Milne coordinates. (orig.)

  11. A new relativistic hydrodynamics code for high-energy heavy-ion collisions

    CERN Document Server

    Okamoto, Kazuhisa; Nonaka, Chiho

    2016-01-01

    We construct a new Godunov type relativistic hydrodynamics code in Milne coordinates, using a Riemann solver based on the two-shock approximation, which is stable under the existence of large shock waves. We check the correctness of the numerical algorithm by comparing numerical calculations and analytical solutions in various problems, such as shock tubes, expansion of matter into the vacuum, the Landau-Khalatnikov solution, and propagation of fluctuations around Bjorken flow and Gubser flow. We investigate the energy and momentum conservation property of our code in a test problem of longitudinal hydrodynamic expansion with an initial condition for high-energy heavy-ion collisions. We also discuss numerical viscosity in the test problems of expansion of matter into the vacuum and conservation properties. Furthermore, we discuss how the numerical stability is affected by the source terms of relativistic numerical hydrodynamics in Milne coordinates.
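    The conservation property tested above has a simple finite-volume origin: each interface flux is added to one cell and subtracted from its neighbour, so the sum over cells telescopes and the total is conserved to machine precision. A hypothetical 1-D scalar sketch (first-order upwind, unit advection speed), not the Milne-coordinate code itself:

```python
def fv_step(u, flux, dt, dx):
    """One conservative finite-volume update on a periodic 1-D grid:
    u[i] -= dt/dx * (F[i+1/2] - F[i-1/2]).  F[i] below is the flux at
    the left face of cell i, taken upwind for a right-moving signal."""
    n = len(u)
    F = [flux(u[i - 1]) for i in range(n)]
    return [u[i] - dt / dx * (F[(i + 1) % n] - F[i]) for i in range(n)]
```

Because every interface flux appears once with each sign, the cell sum is unchanged step after step, which is exactly the discrete conservation such test problems verify.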

  12. Assessing Foundation Insulation Strategies for the 2012 International Energy Conservation Code in Cold Climate New Home Construction

    Energy Technology Data Exchange (ETDEWEB)

    VonThoma, E. [Univ. of Minnesota, St. Paul, MN (United States); Ojczyk, C. [Univ. of Minnesota, St. Paul, MN (United States); Mosiman, G. [Univ. of Minnesota, St. Paul, MN (United States)

    2013-04-01

    While the International Energy Conservation Code 2012 (IECC 2012) has been adopted at the national level, only two cold climate states have adopted it as their new home energy code. Understanding the resistance to adoption is important for helping more states accept the code and engage deep energy strategies nationwide. This three-part assessment by the NorthernSTAR Building America Partnership focused on foundation insulation R-values for cold climates and their design, construction, and performance implications. In Part 1, a literature review and attendance at stakeholder meetings held in Minnesota were used to assess general stakeholder interest and concerns regarding the proposed code changes. In Part 2, drawings of robust foundation insulation systems were presented at one Minnesota stakeholder meeting to address critical issues and concerns in adopting best practice strategies. In Part 3, a sampling of builders participated in telephone interviews to gain baseline knowledge of the insulation systems used to meet the current energy code and of how the same builders propose to meet the newly proposed code.

  13. Assessing Foundation Insulation Strategies for the 2012 International Energy Conservation Code in Cold Climate New Home Construction

    Energy Technology Data Exchange (ETDEWEB)

    VonThoma, E.; Ojczyk, C.; Mosiman, G.

    2013-04-01

    While the International Energy Conservation Code 2012 (IECC 2012) has been adopted at the national level, only two cold climate states have adopted it as their new home energy code. Understanding the resistance to adoption is important for helping more states accept the code and engage deep energy strategies nationwide. This three-part assessment by the NorthernSTAR Building America Partnership focused on foundation insulation R-values for cold climates and their design, construction, and performance implications. In Part 1, a literature review and attendance at stakeholder meetings held in Minnesota were used to assess general stakeholder interest and concerns regarding the proposed code changes. In Part 2, drawings of robust foundation insulation systems were presented at one Minnesota stakeholder meeting to address critical issues and concerns in adopting best practice strategies. In Part 3, a sampling of builders participated in telephone interviews to gain baseline knowledge of the insulation systems used to meet the current energy code and of how the same builders propose to meet the newly proposed code.

  14. A highly specific coding system for structural chromosomal alterations.

    Science.gov (United States)

    Martínez-Frías, M L; Martínez-Fernández, M L

    2013-04-01

    The Spanish Collaborative Study of Congenital Malformations (ECEMC, from its name in Spanish) has developed a very simple and highly specific coding system for structural chromosomal alterations. Such a coding system is of particular value at present due to the dramatic increase in the diagnosis of submicroscopic chromosomal deletions and duplications through molecular techniques. In summary, our new coding system allows the characterization of: (a) the type of structural anomaly; (b) the chromosome affected; (c) whether the alteration affects the short and/or the long arm; and (d) whether it is a non-pure dicentric, a non-pure isochromosome, or whether it affects several chromosomes. We show the distribution of 276 newborn patients with these types of chromosomal alterations using their corresponding codes according to our system. We consider that our approach may be useful not only for other registries, but also for laboratories performing these studies as a way to store the results of their case series. Therefore, the aim of this article is to describe this coding system and to offer the opportunity for this coding to be applied by others. Moreover, as this is a SYSTEM, rather than a fixed code, it can be implemented with the necessary modifications to cover the specific objectives of each program. Copyright © 2013 Wiley Periodicals, Inc.

  15. Furrow Dike Water Conservation Practices in the Texas High Plains

    OpenAIRE

    Wistrand, Glen L.

    1984-01-01

    Furrow diking can prevent irrigation and rainfall runoff, conserve energy use, prevent soil loss, and allow producers to reclaim land otherwise unusable, depending on the soil, climate, and crops grown in a given area. The initial investment in this technique may be recovered within the first season. This study analyzes the effects of diking on water and soil conservation, crop yields, costs, and energy use in farming, using examples of farms in the Texas High Plains area.

  16. Furrow Dike Water Conservation Practices in the Texas High Plains

    OpenAIRE

    Wistrand, Glen L.

    1984-01-01

    Furrow diking can prevent irrigation and rainfall runoff, conserve energy use, prevent soil loss, and allow producers to reclaim land otherwise unusable, depending on the soil, climate, and crops grown in a given area. The initial investment in this technique may be recovered within the first season. This study analyzes the effects of diking on water and soil conservation, crop yields, costs, and energy use in farming, using examples of farms in the Texas High Plains area.

  17. Recent developments in standardization of high efficiency video coding (HEVC)

    Science.gov (United States)

    Sullivan, Gary J.; Ohm, Jens-Rainer

    2010-08-01

    This paper reports on recent developments in video coding standardization, particularly focusing on the Call for Proposals (CfP) on video coding technology made jointly in January 2010 by ITU-T VCEG and ISO/IEC MPEG and the April 2010 responses to that Call. The new standardization initiative is referred to as High Efficiency Video Coding (HEVC) and its development has been undertaken by a new Joint Collaborative Team on Video Coding (JCT-VC) formed by the two organizations. The HEVC standard is intended to provide significantly better compression capability than the existing AVC (ITU-T H.264 | ISO/IEC MPEG-4 Part 10) standard. The results of the CfP are summarized, and the first steps towards the definition of the HEVC standard are described.

  18. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    Science.gov (United States)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O’Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  19. High-throughput sequencing, characterization and detection of new and conserved cucumber miRNAs.

    Directory of Open Access Journals (Sweden)

    Germán Martínez

    Full Text Available MicroRNAs (miRNAs) are a class of endogenous small non-coding RNAs involved in the post-transcriptional regulation of gene expression. In plants, a great number of conserved and specific miRNAs, mainly arising from model species, have been identified to date. However, less is known about the diversity of these regulatory RNAs in plant species of agricultural and/or horticultural importance. Here we report a combined approach of bioinformatics prediction, high-throughput sequencing data and molecular methods to analyze miRNA populations in cucumber (Cucumis sativus) plants. A set of 19 conserved and 6 known but non-conserved miRNA families were found in our cucumber small RNA dataset. We also identified 7 miRNAs (3 with their miRNA* strand), not previously described, that are candidates to be cucumber-specific. To validate their description, these new C. sativus miRNAs were detected by northern blot hybridization. Additionally, potential targets for most conserved and new miRNAs were identified in the cucumber genome. In summary, in this study we have identified, for the first time, conserved, known non-conserved and new miRNAs arising from an agronomically important species such as C. sativus. The detection of this complex population of regulatory small RNAs suggests that, similarly to what is observed in other plant species, cucumber miRNAs may play an important role in diverse biological and metabolic processes.

  20. The genetic code and its optimization for kinetic energy conservation in polypeptide chains.

    Science.gov (United States)

    Guilloux, Antonin; Jestin, Jean-Luc

    2012-08-01

    Why is the genetic code the way it is? Concepts from fields as diverse as molecular evolution, classical chemistry, biochemistry and metabolism have been used to define selection pressures most likely to be involved in the shaping of the genetic code. Here minimization of kinetic energy disturbances during protein evolution by mutation allows an optimization of the genetic code to be highlighted. The quadratic forms corresponding to the kinetic energy term are considered over the field of rational numbers. Arguments are given to support the introduction of notions from basic number theory within this context. The observations found to be consistent with this minimization are statistically significant. The genetic code may well have been optimized according to energetic criteria so as to improve folding and dynamic properties of polypeptide chains. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
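    The paper's quadratic-form analysis over the rationals is not reproduced here, but the underlying claim, that the standard genetic code is unusually conservative under single-nucleotide mutations compared with random codon assignments, can be sketched with a simple statistic. In this illustration the fraction of synonymous point changes stands in for the kinetic-energy disturbance criterion:

```python
import random

# Compare the standard genetic code against random reassignments of
# amino acids to sense codons. Synonymous single-nucleotide changes
# cause no disturbance at all, so their fraction is a crude proxy for
# mutational robustness (an illustration, not the paper's statistic).

BASES = "TCAG"
TABLE = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODONS = [a + b + c for a in BASES for b in BASES for c in BASES]
STANDARD = dict(zip(CODONS, TABLE))

def synonymous_fraction(code):
    syn = tot = 0
    for c in CODONS:
        if code[c] == "*":
            continue
        for pos in range(3):
            for b in BASES:
                m = c[:pos] + b + c[pos + 1:]
                if m == c or code[m] == "*":
                    continue
                tot += 1
                syn += code[m] == code[c]
    return syn / tot

std = synonymous_fraction(STANDARD)

# random codes: shuffle the same multiset of amino acids over sense codons
rng = random.Random(0)
sense = [c for c in CODONS if STANDARD[c] != "*"]
rand = []
for _ in range(50):
    letters = [STANDARD[c] for c in sense]
    rng.shuffle(letters)
    code = dict(zip(sense, letters))
    code.update({c: "*" for c in CODONS if STANDARD[c] == "*"})
    rand.append(synonymous_fraction(code))

print(round(std, 3), round(sum(rand) / len(rand), 3))
```

    The standard code's synonymous fraction (about a quarter of all sense-to-sense point mutations) far exceeds that of random assignments, the same style of argument the paper makes with an energy-based cost.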

  1. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive-approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was generated on a blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed locally mass flux conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triple-point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that cross-sectional computation, with a locally mass flux conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.

  2. Parallel Beam Dynamics Code Development for High Intensity Cyclotron

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    1 Parallel PIC algorithm: The self-field solver is the key part of a high-intensity beam dynamics PIC code, which usually adopts the P-M (Particle-Mesh) method to solve for the space charge. The P-M method is composed of four major

  3. MAPCLASS a code to optimize high order aberrations

    CERN Document Server

    Tomás, R

    2006-01-01

    MAPCLASS is a code written in PYTHON conceived to optimize the non-linear aberrations of the Final Focus System of CLIC. MAPCLASS calls MADX-PTC to obtain the map coefficients and uses optimization algorithms like the Simplex to compensate the high order aberrations.
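    A hedged sketch of the kind of search MAPCLASS performs: a downhill simplex (Nelder-Mead) drives hypothetical high-order map coefficients toward zero as functions of two invented sextupole strengths. MAPCLASS itself obtains real map coefficients from MADX-PTC; the merit function and knobs below are assumptions for illustration only.

```python
import numpy as np

# Toy merit function standing in for beam-size growth from aberrations:
# two mock map coefficients, linear in two assumed sextupole knobs.
def merit(k):
    c1 = 3.0 + 2.0 * k[0] + 0.5 * k[1]    # mock 3rd-order coefficient
    c2 = -1.0 + 0.5 * k[0] + 2.0 * k[1]   # mock 4th-order coefficient
    return c1 ** 2 + c2 ** 2

def nelder_mead(f, x0, step=1.0, iters=200):
    # basic downhill simplex: reflect / expand / contract / shrink
    n = len(x0)
    pts = [np.array(x0, float)]
    pts += [np.array(x0, float) + step * np.eye(n)[i] for i in range(n)]
    for _ in range(iters):
        pts.sort(key=f)
        centroid = np.mean(pts[:-1], axis=0)
        refl = centroid + (centroid - pts[-1])     # reflect worst point
        if f(refl) < f(pts[0]):
            exp = centroid + 2.0 * (centroid - pts[-1])
            pts[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(pts[-2]):
            pts[-1] = refl
        else:
            contr = centroid + 0.5 * (pts[-1] - centroid)
            if f(contr) < f(pts[-1]):
                pts[-1] = contr
            else:                                   # shrink toward best
                pts = [pts[0] + 0.5 * (p - pts[0]) for p in pts]
    return min(pts, key=f)

best = nelder_mead(merit, [0.0, 0.0])
print(merit(best))   # driven close to zero
```

    The simplex needs only merit-function evaluations, no derivatives, which is why it suits optimization on top of a black-box tracking code.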

  4. Analysis of isoplanatic high resolution stellar fields by Starfinder code

    CERN Document Server

    Diolaiti, E; Bonaccini, D; Close, L M; Currie, D; Parmeggiani, G

    2000-01-01

    We describe a new code for the deep analysis of stellar fields, designed for Adaptive Optics Nyquist-sampled images with high and low Strehl ratio. The Point Spread Function is extracted directly from the image frame, to take into account the actual structure of the instrumental response and the atmospheric effects. The code is written in IDL language and organized in the form of a self-contained widget-based application, provided with a series of tools for data visualization and analysis. A description of the method and some applications to AO data are presented.

  5. High speed coding for velocity by archerfish retinal ganglion cells

    Directory of Open Access Journals (Sweden)

    Kretschmer Viola

    2012-06-01

    Full Text Available Abstract. Background: Archerfish show very short behavioural latencies in response to falling prey. This raises the question of which response parameters of retinal ganglion cells to moving stimuli are best suited for fast coding of stimulus speed and direction. Results: We compared stimulus reconstruction quality based on the ganglion cell response parameters latency, first interspike interval, and rate. For reconstruction of moving stimuli, using latency was superior to using the other response parameters. This was true for absolute latency, with respect to stimulus onset, as well as for relative latency, with respect to population response onset. Iteratively increasing the number of cells used for reconstruction decreased the calculated error close to zero. Conclusions: Latency is the fastest response parameter available to the brain. Therefore, latency coding is best suited for high-speed coding of moving objects. The quantitative data of this study are in good accordance with previously published behavioural response latencies.
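    A toy decoding sketch under an assumed response model (our illustration, not the study's analysis): if each cell's first-spike latency falls off inversely with stimulus speed, averaging latencies over a small population already yields an accurate speed estimate from the very first spikes, before any rate count is available.

```python
import numpy as np

# Hypothetical latency model: latency = a / speed + Gaussian noise.
# Decoding inverts the mean latency across the population.
rng = np.random.default_rng(1)
a, n_cells, true_speed = 100.0, 20, 4.0   # ms*(deg/s), cells, deg/s (assumed)

latencies = a / true_speed + rng.normal(0.0, 2.0, n_cells)   # ms, per cell
speed_est = a / latencies.mean()
print(true_speed, round(float(speed_est), 2))
```

    Increasing n_cells shrinks the estimator's error roughly as 1/sqrt(n), consistent with the report that reconstruction error falls toward zero as cells are added.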

  6. An evolutionary model for protein-coding regions with conserved RNA structure

    DEFF Research Database (Denmark)

    Pedersen, Jakob Skou; Forsberg, Roald; Meyer, Irmtraud Margret

    2004-01-01

    components of traditional phylogenetic models. We applied this to a data set of full-genome sequences from the hepatitis C virus where five RNA structures are mapped within the coding region. This allowed us to partition the effects of selection on different structural elements and to test various hypotheses...... concerning the relation of these effects. Of particular interest, we found evidence of a functional role of loop and bulge regions, as these were shown to evolve according to a different and more constrained selective regime than the nonpairing regions outside the RNA structures. Other potential applications...... of the model include comparative RNA structure prediction in coding regions and RNA virus phylogenetics....

  7. Residential buildings: Energy conservation (energy savings design code). Il Patrimonio residenziale pubblico [The public residential building stock]

    Energy Technology Data Exchange (ETDEWEB)

    Los, S.; Pulizer, N.; Agnoletto, L.; Buggin, A.

    1991-01-01

    The energy savings design code presented in this paper was based on the energy performance of the basic types of residential buildings commonly found in Italy and on the numerous combinations of energy savings measures hypothesized for them. The calculation algorithm carries out two distinct operations: the quantification of seasonal fuel consumption and of the cost of the proposed interventions. The code takes into account parameters defining: climatic data; building geometry, surface area, and orientation (insolation, etc.); thermal insulation, including the thermal/physical properties of the other construction materials; thermal comfort conditions; and the type and performance of conventional heating equipment components, including active and passive architectural systems and their relative control systems.
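    The two operations described, seasonal fuel consumption and intervention cost, can be sketched with an assumed steady-state degree-day balance. The formulas and every number below are illustrative assumptions, not the code's actual algorithm or data:

```python
# Seasonal heating demand from an envelope UA value and heating
# degree-days, plus a simple payback estimate for a hypothetical
# insulation intervention (all values assumed for illustration).

def seasonal_fuel_kwh(ua_w_per_k, degree_days_c, efficiency):
    # Q = UA * HDD * 24 h, converted to kWh and divided by plant efficiency
    return ua_w_per_k * degree_days_c * 24.0 / 1000.0 / efficiency

before = seasonal_fuel_kwh(300.0, 2400.0, 0.85)   # assumed uninsulated envelope
after = seasonal_fuel_kwh(180.0, 2400.0, 0.85)    # assumed insulated envelope
saving_kwh = before - after
payback_years = 4000.0 / (saving_kwh * 0.10)      # assumed cost and fuel tariff
print(round(before), round(after), round(payback_years, 1))
```

    A real implementation would replace the single UA value with the per-surface insulation, orientation, and plant-performance parameters the abstract lists.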

  8. HIGH-PERFORMANCE SIMPLE-ENCODING GENERATOR-BASED SYSTEMATIC IRREGULAR LDPC CODES AND RESULTED PRODUCT CODES

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Low-Density Parity-Check (LDPC) codes are one of the most exciting topics in the coding theory community. They are of great importance in both theory and practical communications over noisy channels. The main advantage of LDPC codes is their relatively lower decoding complexity compared with turbo codes, while the disadvantage is their higher encoding complexity. In this paper, a new approach is first proposed to construct high performance irregular systematic LDPC codes based on a sparse generator matrix, which can significantly reduce the encoding complexity under the same decoding complexity as that of regular or irregular LDPC codes defined by a traditional sparse parity-check matrix. Then, the proposed generator-based systematic irregular LDPC codes are adopted as constituent block codes in rows and columns to design a new kind of product code family, which can also be interpreted as irregular LDPC codes characterized by a graph and thus decoded iteratively. Finally, the performance of the generator-based LDPC codes and the resultant product codes is investigated over an Additive White Gaussian Noise (AWGN) channel and also compared with that of conventional LDPC codes under the same conditions of decoding complexity and channel noise.
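    A minimal sketch of the systematic, generator-based encoding idea on a toy 4x8 code over GF(2) with an invented P matrix (not a real optimized irregular LDPC construction): with G = [I | P] and sparse P, encoding costs only as many XORs as P has nonzeros, and the matching parity-check matrix H = [P^T | I] annihilates every codeword.

```python
import numpy as np

# Toy sparse "P" part of a systematic generator G = [I | P] over GF(2).
P = np.array([[1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 1],
              [1, 1, 1, 0]], dtype=np.uint8)

def encode(msg):
    # codeword = [message | message @ P mod 2]; cost ~ nonzeros of P,
    # which is the complexity advantage of a sparse generator
    return np.concatenate([msg, msg @ P % 2])

msg = np.array([1, 0, 1, 1], dtype=np.uint8)
cw = encode(msg)

# the corresponding parity-check matrix H = [P^T | I] checks codewords
H = np.concatenate([P.T, np.eye(4, dtype=np.uint8)], axis=1)
print(cw, H @ cw % 2)   # syndrome is all zeros for a valid codeword
```

    The same H defines the Tanner graph on which iterative (message-passing) decoding runs, which is how the paper's product codes are decoded.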

  9. Conservation agriculture in high tunnels: soil health and profit enhancement

    Science.gov (United States)

    In 2013, through the USDA’s Evans-Allen capacity grant, the high tunnel became an on-farm research laboratory for conservation agriculture. Dr. Manuel R. Reyes, Professor and his research team from the North Carolina Agriculture and Technology State University (NCATSU), Greensboro, North Carolina (1...

  10. Phylum-Level Conservation of Regulatory Information in Nematodes despite Extensive Non-coding Sequence Divergence

    Science.gov (United States)

    Gordon, Kacy L.; Arthur, Robert K.; Ruvinsky, Ilya

    2015-01-01

    Gene regulatory information guides development and shapes the course of evolution. To test conservation of gene regulation within the phylum Nematoda, we compared the functions of putative cis-regulatory sequences of four sets of orthologs (unc-47, unc-25, mec-3 and elt-2) from distantly-related nematode species. These species, Caenorhabditis elegans, its congeneric C. briggsae, and three parasitic species Meloidogyne hapla, Brugia malayi, and Trichinella spiralis, represent four of the five major clades in the phylum Nematoda. Despite the great phylogenetic distances sampled and the extensive sequence divergence of nematode genomes, all but one of the regulatory elements we tested are able to drive at least a subset of the expected gene expression patterns. We show that functionally conserved cis-regulatory elements have no more extended sequence similarity to their C. elegans orthologs than would be expected by chance, but they do harbor motifs that are important for proper expression of the C. elegans genes. These motifs are too short to be distinguished from the background level of sequence similarity, and while identical in sequence they are not conserved in orientation or position. Functional tests reveal that some of these motifs contribute to proper expression. Our results suggest that conserved regulatory circuitry can persist despite considerable turnover within cis elements. PMID:26020930
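    The observation that short motifs stay identical in sequence while moving in position and flipping in orientation suggests scanning both strands at every offset. A minimal sketch with a hypothetical motif and two invented "enhancer" fragments:

```python
# Both-strand motif scan: search the forward sequence for the motif and
# for its reverse complement, reporting (offset, strand) for each hit.
# Motif and fragments below are invented for illustration.

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_motif(seq, motif):
    hits = []
    for m, strand in ((motif, "+"), (revcomp(motif), "-")):
        start = seq.find(m)
        while start != -1:
            hits.append((start, strand))
            start = seq.find(m, start + 1)
    return hits

sp1 = "TTTGATAAGCCC"   # motif on the + strand
sp2 = "GGGCTTATCTTT"   # same motif on the - strand, different context
print(find_motif(sp1, "GATAAG"), find_motif(sp2, "GATAAG"))
```

    Both fragments carry the same motif, yet a naive same-strand, same-position alignment would miss the second occurrence, the situation the study describes for orthologous cis elements.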

  11. Phylum-Level Conservation of Regulatory Information in Nematodes despite Extensive Non-coding Sequence Divergence.

    Directory of Open Access Journals (Sweden)

    Kacy L Gordon

    2015-05-01

    Full Text Available Gene regulatory information guides development and shapes the course of evolution. To test conservation of gene regulation within the phylum Nematoda, we compared the functions of putative cis-regulatory sequences of four sets of orthologs (unc-47, unc-25, mec-3 and elt-2) from distantly-related nematode species. These species, Caenorhabditis elegans, its congeneric C. briggsae, and three parasitic species Meloidogyne hapla, Brugia malayi, and Trichinella spiralis, represent four of the five major clades in the phylum Nematoda. Despite the great phylogenetic distances sampled and the extensive sequence divergence of nematode genomes, all but one of the regulatory elements we tested are able to drive at least a subset of the expected gene expression patterns. We show that functionally conserved cis-regulatory elements have no more extended sequence similarity to their C. elegans orthologs than would be expected by chance, but they do harbor motifs that are important for proper expression of the C. elegans genes. These motifs are too short to be distinguished from the background level of sequence similarity, and while identical in sequence they are not conserved in orientation or position. Functional tests reveal that some of these motifs contribute to proper expression. Our results suggest that conserved regulatory circuitry can persist despite considerable turnover within cis elements.

  12. Generation of monoclonal antibodies against highly conserved antigens.

    Directory of Open Access Journals (Sweden)

    Hongzhe Zhou

    Full Text Available BACKGROUND: Therapeutic antibody development is one of the fastest growing areas of the pharmaceutical industry. Generating high-quality monoclonal antibodies against a given therapeutic target is crucial for the success of drug development. However, due to immune tolerance, some proteins that are highly conserved between mice and humans are not very immunogenic in mice, making it difficult to generate antibodies using a conventional approach. METHODOLOGY/PRINCIPAL FINDINGS: In this report, the impaired immune tolerance of NZB/W mice was exploited to generate monoclonal antibodies against highly conserved or self-antigens. Using two highly conserved human antigens (MIF and HMGB1) and one mouse self-antigen (TNF-alpha) as examples, we demonstrate here that multiple clones of high-affinity, highly specific antibodies with desired biological activities can be generated, using the NZB/W mouse as the immunization host and a T cell-specific tag fused to a recombinant antigen to stimulate the immune system. CONCLUSIONS/SIGNIFICANCE: We developed an efficient and universal method for generating surrogate or therapeutic antibodies against "difficult antigens" to facilitate the development of therapeutic antibodies.

  13. Turbo-like codes design for high speed decoding

    CERN Document Server

    Abbasfar, Aliazam

    2007-01-01

    Turbo code concepts are explained in simple language. Turbo codes and LDPC codes are viewed in a unified manner as turbo-like codes. Implementation and hardware complexity is a major focus. The book presents a novel class of powerful and practical turbo-like codes and includes an advanced theoretical framework for professionals.

  14. Energy Conservation Tests of a Coupled Kinetic-kinetic Plasma-neutral Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Stotler, D. P.; Chang, C. S.; Ku, S. H.; Lang, J.; Park, G.

    2012-08-29

    A Monte Carlo neutral transport routine, based on DEGAS2, has been coupled to the guiding center ion-electron-neutral neoclassical PIC code XGC0 to provide a realistic treatment of neutral atoms and molecules in the tokamak edge plasma. The DEGAS2 routine allows detailed atomic physics and plasma-material interaction processes to be incorporated into these simulations. The spatial profile of the neutral particle source used in the DEGAS2 routine is determined from the fluxes of XGC0 ions to the material surfaces. The kinetic-kinetic plasma-neutral transport capability is demonstrated with example pedestal fueling simulations.

  15. High energy particle transport code NMTC/JAM

    Energy Technology Data Exchange (ETDEWEB)

    Niita, Koji [Research Organization for Information Science and Technology, Tokai, Ibaraki (Japan); Meigo, Shin-ichiro; Takada, Hiroshi; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    We have developed a high energy particle transport code NMTC/JAM, which is an upgraded version of NMTC/JAERI97. The applicable energy range of NMTC/JAM is extended in principle up to 200 GeV for nucleons and mesons by introducing the high energy nuclear reaction code JAM for the intra-nuclear cascade part. For the evaporation and fission process, we have also implemented a new model, GEM, by which light nucleus production from the excited residual nucleus can be described. In accordance with the extended applicable energy range, we have upgraded the nucleon-nucleus non-elastic, elastic and differential elastic cross section data by employing new systematics. In addition, particle transport in a magnetic field has been implemented for beam transport calculations. In this upgrade, some new tally functions have been added and the input data format has been made much more user-friendly. Due to the implementation of these new calculation functions and utilities, NMTC/JAM enables us to carry out reliable neutronics studies of large-scale target systems with complex geometry more accurately and easily than before. This report serves as a user manual for the code. (author)
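    Charged-particle stepping in a magnetic field, as mentioned above for beam transport, is commonly done with the Boris rotation. A hedged sketch (our illustration, not the NMTC/JAM implementation): the rotation is norm-preserving, so the particle's speed is conserved to round-off over long pushes.

```python
import numpy as np

# Standard Boris push in a uniform magnetic field (no electric field).
# The velocity update is a pure rotation, so |v| is exactly conserved.

def boris_step(x, v, qm, B, dt):
    t = qm * np.asarray(B) * dt / 2.0          # half-step rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)           # rotated velocity
    return x + v_new * dt, v_new

B = [0.0, 0.0, 1.0]                            # uniform field along z
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
speed0 = np.linalg.norm(v)
for _ in range(1000):
    x, v = boris_step(x, v, qm=1.0, B=B, dt=0.01)
print(abs(np.linalg.norm(v) - speed0))         # ~1e-15, not secularly growing
```

    This energy-conserving property is what makes rotation-based pushers the default for magnetic beam-transport sections.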

  16. A high burnup model developed for the DIONISIO code

    Energy Technology Data Exchange (ETDEWEB)

    Soba, A. [U.A. Combustibles Nucleares, Comisión Nacional de Energía Atómica, Avenida del Libertador 8250, 1429 Buenos Aires (Argentina); Denis, A., E-mail: denis@cnea.gov.ar [U.A. Combustibles Nucleares, Comisión Nacional de Energía Atómica, Avenida del Libertador 8250, 1429 Buenos Aires (Argentina); Romero, L. [U.A. Reactores Nucleares, Comisión Nacional de Energía Atómica, Avenida del Libertador 8250, 1429 Buenos Aires (Argentina); Villarino, E.; Sardella, F. [Departamento Ingeniería Nuclear, INVAP SE, Comandante Luis Piedra Buena 4950, 8430 San Carlos de Bariloche, Río Negro (Argentina)

    2013-02-15

    A group of subroutines, designed to extend the application range of the fuel performance code DIONISIO to high burn up, has recently been included in the code. The new calculation tools, which are tuned for UO{sub 2} fuels in LWR conditions, predict the radial distribution of power density, burnup, and concentration of diverse nuclides within the pellet. The balance equations of all the isotopes involved in the fission process are solved in a simplified manner, and the one-group effective cross sections of all of them are obtained as functions of the radial position in the pellet, burnup, and enrichment in {sup 235}U. In this work, the subroutines are described and the results of the simulations performed with DIONISIO are presented. The good agreement with the data provided in the FUMEX II/III NEA data bank can be easily recognized.

  17. A high burnup model developed for the DIONISIO code

    Science.gov (United States)

    Soba, A.; Denis, A.; Romero, L.; Villarino, E.; Sardella, F.

    2013-02-01

    A group of subroutines, designed to extend the application range of the fuel performance code DIONISIO to high burn up, has recently been included in the code. The new calculation tools, which are tuned for UO2 fuels in LWR conditions, predict the radial distribution of power density, burnup, and concentration of diverse nuclides within the pellet. The balance equations of all the isotopes involved in the fission process are solved in a simplified manner, and the one-group effective cross sections of all of them are obtained as functions of the radial position in the pellet, burnup, and enrichment in 235U. In this work, the subroutines are described and the results of the simulations performed with DIONISIO are presented. The good agreement with the data provided in the FUMEX II/III NEA data bank can be easily recognized.
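    The balance equations solved in DIONISIO couple many nuclides across the pellet radius; a single-isotope sketch with assumed one-group data shows the basic depletion step such a model generalizes:

```python
import math

# Illustrative one-isotope balance (all values assumed, far simpler than
# DIONISIO's multi-nuclide radial model): depletion of 235U under a
# constant one-group flux, dN/dt = -sigma * phi * N.

sigma = 600e-24       # one-group absorption cross section, cm^2 (assumed)
phi = 3.0e13          # neutron flux, n/(cm^2 s) (assumed)
n0 = 1.0e21           # initial 235U number density, atoms/cm^3 (assumed)

dt, steps = 86400.0, 365               # one year in daily steps
n = n0
for _ in range(steps):
    n *= math.exp(-sigma * phi * dt)   # exact integration over each step

exact = n0 * math.exp(-sigma * phi * dt * steps)
print(n / n0)   # fraction of the initial 235U remaining after one year
```

    In the real code the cross sections themselves become functions of radial position, burnup, and enrichment, which is precisely what the new subroutines provide.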

  18. High explosive programmed burn in the FLAG code

    Energy Technology Data Exchange (ETDEWEB)

    Mandell, D.; Burton, D.; Lund, C.

    1998-02-01

    The models used to calculate the programmed-burn high-explosive lighting times in two and three dimensions in the FLAG code are described. FLAG uses an unstructured polyhedral grid. The calculations were compared to exact solutions for a square in two dimensions and for a cube in three dimensions. The maximum error was 3.95 percent in two dimensions and 4.84 percent in three dimensions. The high explosive lighting time model described has the advantage that only one cell at a time needs to be considered.
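    A minimal programmed-burn sketch (our illustration, not the FLAG models): with a single detonation point and constant detonation velocity D, the exact lighting time of a cell is its distance from the initiation point divided by D. A naive estimate that only propagates between 8-connected neighboring cell centers overestimates times off the grid axes, illustrating why lighting-time models on grids carry percent-level errors of the kind quoted above.

```python
import heapq
import math

D, n = 8.0, 21                                # mm/us and cells per side (assumed)
pts = [(i, j) for i in range(n) for j in range(n)]
exact = {p: math.hypot(*p) / D for p in pts}  # exact time = distance / D

# Dijkstra-like sweep from the corner over 8-connected neighbors
t = {p: math.inf for p in pts}
t[(0, 0)] = 0.0
heap = [(0.0, (0, 0))]
while heap:
    tc, (i, j) = heapq.heappop(heap)
    if tc > t[(i, j)]:
        continue
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            q = (i + di, j + dj)
            if q in t:
                nt = tc + math.hypot(di, dj) / D
                if nt < t[q]:
                    t[q] = nt
                    heapq.heappush(heap, (nt, q))

err = max(abs(t[p] - exact[p]) / exact[p] for p in pts if p != (0, 0))
print(round(100 * err, 2))   # maximum relative error, percent (~8% here)
```

    The 8-connected sweep is exact along axes and diagonals but off by up to about 8 percent near 22.5-degree rays; better lighting-time models reduce this, which is consistent with the smaller errors reported for FLAG.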

  19. High Temperature Gas Reactors: Assessment of Applicable Codes and Standards

    Energy Technology Data Exchange (ETDEWEB)

    McDowell, Bruce K.; Nickolaus, James R.; Mitchell, Mark R.; Swearingen, Gary L.; Pugh, Ray

    2011-10-31

    Current interest expressed by industry in HTGR plants, particularly modular plants with power up to about 600 MW(e) per unit, has prompted NRC to task PNNL with assessing the currently available literature related to codes and standards applicable to HTGR plants and the operating history of past and present HTGR plants, and with evaluating the proposed designs of the RPV and associated piping for future plants. Considering these topics in the order they are arranged in the text, first the operational histories of five shut-down and two currently operating HTGR plants are reviewed, leading the authors to conclude that while small, simple prototype HTGR plants operated reliably, some of the larger plants, particularly Fort St. Vrain, had poor availability. Safety and radiological performance of these plants has been considerably better than that of LWR plants. Petroleum processing plants provide some applicable experience with materials similar to those proposed for HTGR piping and vessels. At least one currently operating plant, HTR-10, has performed and documented a leak-before-break analysis that appears to be applicable to proposed future US HTGR designs. Current codes and standards cover some HTGR materials, but not all materials are covered to the high temperatures envisioned for HTGR use. Codes and standards, particularly ASME Codes, are under development for proposed future US HTGR designs. A 'roadmap' document has been prepared for ASME Code development; a new subsection to Section III of the ASME Code, ASME BPVC III-5, is scheduled to be published in October 2011. The question of terminology for the cross-duct structure between the RPV and power conversion vessel is discussed, considering the differences in regulatory requirements that apply depending on whether this structure is designated as a 'vessel' or as a 'pipe'. We conclude that designing this component as a 'pipe' is the more appropriate choice, but that the ASME BPVC

  20. The Number, Organization, and Size of Polymorphic Membrane Protein Coding Sequences as well as the Most Conserved Pmp Protein Differ within and across Chlamydia Species.

    Science.gov (United States)

    Van Lent, Sarah; Creasy, Heather Huot; Myers, Garry S A; Vanrompay, Daisy

    2016-01-01

    Variation is a central trait of the polymorphic membrane protein (Pmp) family. The number of pmp coding sequences differs between Chlamydia species, but it is unknown whether the number of pmp coding sequences is constant within a Chlamydia species. The level of conservation of the Pmp proteins has previously only been determined for Chlamydia trachomatis. As different Pmp proteins might be indispensable for the pathogenesis of different Chlamydia species, this study investigated the conservation of Pmp proteins both within and across C. trachomatis, C. pneumoniae, C. abortus, and C. psittaci. The pmp coding sequences were annotated in 16 C. trachomatis, 6 C. pneumoniae, 2 C. abortus, and 16 C. psittaci genomes. The number and organization of polymorphic membrane protein coding sequences differed within and across the analyzed Chlamydia species. The length of the coding sequences of pmpA, pmpB, and pmpH was conserved among all analyzed genomes, while the length of pmpE/F and pmpG, and remarkably also of the subtype pmpD, differed among the analyzed genomes. PmpD, PmpA, PmpH, and PmpA were the most conserved Pmps in C. trachomatis, C. pneumoniae, C. abortus, and C. psittaci, respectively. PmpB was the most conserved Pmp across the 4 analyzed Chlamydia species.

  1. Structure-aided prediction of mammalian transcription factor complexes in conserved non-coding elements

    KAUST Repository

    Guturu, H.

    2013-11-11

    Mapping the DNA-binding preferences of transcription factor (TF) complexes is critical for deciphering the functions of cis-regulatory elements. Here, we developed a computational method that compares co-occurring motif spacings in conserved versus unconserved regions of the human genome to detect evolutionarily constrained binding sites of rigid TF complexes. Structural data were used to estimate TF complex physical plausibility, explore overlapping motif arrangements seldom tackled by non-structure-aware methods, and generate and analyse three-dimensional models of the predicted complexes bound to DNA. Using this approach, we predicted 422 physically realistic TF complex motifs at 18% false discovery rate, the majority of which (326, 77%) contain some sequence overlap between binding sites. The set of mostly novel complexes is enriched in known composite motifs, predictive of binding site configurations in TF-TF-DNA crystal structures, and supported by ChIP-seq datasets. Structural modelling revealed three cooperativity mechanisms: direct protein-protein interactions, potentially indirect interactions and 'through-DNA' interactions. Indeed, 38% of the predicted complexes were found to contain four or more bases in which TF pairs appear to synergize through overlapping binding to the same DNA base pairs in opposite grooves or strands. Our TF complex and associated binding site predictions are available as a web resource at http://bejerano.stanford.edu/complex.
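The core signal the method exploits — motif pairs recurring at a fixed spacing in conserved regions far more often than in background — can be sketched as follows. This is an illustrative toy, not the published pipeline: `motif_hits` does exact string matching as a stand-in for PWM scanning, and the sequences, spacing convention, and enrichment threshold are all assumptions.

```python
from collections import Counter

def motif_hits(seq, motif):
    """Start positions of exact motif matches (toy stand-in for PWM scanning)."""
    return [i for i in range(len(seq) - len(motif) + 1)
            if seq[i:i + len(motif)] == motif]

def spacing_counts(seqs, motif_a, motif_b):
    """Count start-to-start spacings of co-occurring motif pairs."""
    counts = Counter()
    for seq in seqs:
        for a in motif_hits(seq, motif_a):
            for b in motif_hits(seq, motif_b):
                counts[b - a] += 1
    return counts

def enriched_spacings(conserved, background, motif_a, motif_b, min_ratio=3.0):
    """Spacings over-represented in conserved regions versus background
    (pseudocounted ratio; a real method would use a proper null model)."""
    cons = spacing_counts(conserved, motif_a, motif_b)
    back = spacing_counts(background, motif_a, motif_b)
    return {d: n for d, n in cons.items() if n / (back[d] + 1) >= min_ratio}
```

A rigid complex shows up as a single sharply enriched spacing; flexible co-binding spreads its counts over many spacings and is filtered out.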

  2. ABCE1 is a highly conserved RNA silencing suppressor.

    Directory of Open Access Journals (Sweden)

    Kairi Kärblane

    Full Text Available ATP-binding cassette sub-family E member 1 (ABCE1) is a highly conserved protein among eukaryotes and archaea. Recent studies have identified ABCE1 as a ribosome-recycling factor important for translation termination in mammalian cells, yeast and also archaea. Here we report another conserved function of ABCE1. We have previously described AtRLI2, the homolog of ABCE1 in the plant Arabidopsis thaliana, as an endogenous suppressor of RNA silencing. In this study we show that this function is conserved: human ABCE1 is able to suppress RNA silencing in Nicotiana benthamiana plants, in mammalian HEK293 cells and in the worm Caenorhabditis elegans. Using co-immunoprecipitation and mass spectrometry, we found a number of potential ABCE1-interacting proteins that might support its function as an endogenous suppressor of RNA interference. The interactor candidates are associated with epigenetic regulation, transcription, RNA processing and mRNA surveillance. In addition, one of the identified proteins is translin, which together with its binding partner TRAX supports RNA interference.

  3. Evolutionarily divergent spliceosomal snRNAs and a conserved non-coding RNA processing motif in Giardia lamblia

    Science.gov (United States)

    Hudson, Andrew J.; Moore, Ashley N.; Elniski, David; Joseph, Joella; Yee, Janet; Russell, Anthony G.

    2012-01-01

    Non-coding RNAs (ncRNAs) have diverse essential biological functions in all organisms, and in eukaryotes, two such classes of ncRNAs are the small nucleolar (sno) and small nuclear (sn) RNAs. In this study, we have identified and characterized a collection of sno and snRNAs in Giardia lamblia, by exploiting our discovery of a conserved 12 nt RNA processing sequence motif found in the 3′ end regions of a large number of G. lamblia ncRNA genes. RNA end mapping and other experiments indicate the motif serves to mediate ncRNA 3′ end formation from mono- and di-cistronic RNA precursor transcripts. Remarkably, we find the motif is also utilized in the processing pathway of all four previously identified trans-spliced G. lamblia introns, revealing a common RNA processing pathway for ncRNAs and trans-spliced introns in this organism. Motif sequence conservation then allowed for the bioinformatic and experimental identification of additional G. lamblia ncRNAs, including new U1 and U6 spliceosomal snRNA candidates. The U6 snRNA candidate was then used as a tool to identify novel U2 and U4 snRNAs, based on predicted phylogenetically conserved snRNA–snRNA base-pairing interactions, from a set of previously identified G. lamblia ncRNAs without assigned function. The Giardia snRNAs retain the core features of spliceosomal snRNAs but are sufficiently evolutionarily divergent to explain the difficulties in their identification. Most intriguingly, all of these snRNAs show structural features diagnostic of U2-dependent/major and U12-dependent/minor spliceosomal snRNAs. PMID:23019220

  4. A watermarking scheme for High Efficiency Video Coding (HEVC).

    Science.gov (United States)

    Swati, Salahuddin; Hayat, Khizar; Shahid, Zafar

    2014-01-01

    This paper presents a high payload watermarking scheme for High Efficiency Video Coding (HEVC). HEVC is an emerging video compression standard that provides better compression performance than its predecessor, H.264/AVC. Considering that HEVC will be used in a variety of applications in the future, the proposed algorithm has a high potential of utilization in applications involving broadcast and hiding of metadata. The watermark is embedded into the Quantized Transform Coefficients (QTCs) during the encoding process. Later, during the decoding process, the embedded message can be detected and extracted completely. The experimental results show that the proposed algorithm does not significantly affect the video quality, nor does it escalate the bitrate.
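The general idea — writing message bits into quantized transform coefficients at encode time and reading them back at decode time — can be sketched with least-significant-bit embedding. This is a minimal illustration under stated assumptions, not the paper's algorithm: the eligibility rule (skipping coefficients of magnitude below 2 so no coefficient is zeroed out) and the LSB substitution are both simplifications chosen here.

```python
def embed_watermark(qtcs, bits):
    """Embed watermark bits into the magnitude LSBs of quantized transform
    coefficients (a simplified stand-in for in-loop HEVC embedding)."""
    out, it = list(qtcs), iter(bits)
    for i, c in enumerate(out):
        if abs(c) < 2:
            continue  # skip zeros/ones so embedding never erases a coefficient
        try:
            bit = next(it)
        except StopIteration:
            break  # message fully embedded
        sign = -1 if c < 0 else 1
        out[i] = sign * ((abs(c) & ~1) | bit)  # magnitude LSB := watermark bit
    return out

def extract_watermark(qtcs, n_bits):
    """Recover the first n_bits from coefficients eligible for embedding."""
    return [abs(c) & 1 for c in qtcs if abs(c) >= 2][:n_bits]
```

Because each coefficient changes by at most one quantization step, the perturbation stays small, which is why such schemes barely affect quality or bitrate.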

  5. High performance single-error-correcting quantum codes for amplitude damping

    CERN Document Server

    Shor, Peter W; Smolin, John A; Zeng, Bei

    2009-01-01

    We construct families of high performance quantum amplitude damping codes. All of our codes are nonadditive and most modestly outperform the best possible additive codes in terms of encoded dimension. One family is built from nonlinear error-correcting codes for classical asymmetric channels, with which we systematically construct quantum amplitude damping codes with parameters better than any prior construction known for any block length n > 7 except n=2^r-1. We generalize this construction to employ classical codes over GF(3) with which we numerically obtain better performing codes up to length 14. Because the resulting codes are of the codeword stabilized (CWS) type, easy encoding and decoding circuits are available.

  6. High efficiency video coding (HEVC) algorithms and architectures

    CERN Document Server

    Budagavi, Madhukar; Sullivan, Gary

    2014-01-01

    This book provides developers, engineers, researchers and students with detailed knowledge about the High Efficiency Video Coding (HEVC) standard. HEVC is the successor to the widely successful H.264/AVC video compression standard, and it provides around twice as much compression as H.264/AVC for the same level of quality. The applications for HEVC will not only cover the space of the well-known current uses and capabilities of digital video – they will also include the deployment of new services and the delivery of enhanced video quality, such as ultra-high-definition television (UHDTV) and video with higher dynamic range, wider range of representable color, and greater representation precision than what is typically found today. HEVC is the next major generation of video coding design – a flexible, reliable and robust solution that will support the next decade of video applications and ease the burden of video on world-wide network traffic. This book provides a detailed explanation of the various parts ...

  7. A highly conserved pericentromeric domain in human and gorilla chromosomes.

    Science.gov (United States)

    Pita, M; Gosálvez, J; Gosálvez, A; Nieddu, M; López-Fernández, C; Mezzanotte, R

    2009-01-01

    Significant similarity between human and gorilla genomes has been found in all chromosome arms, but not in centromeres, using whole-comparative genomic hybridization (W-CGH). In human chromosomes, centromeric regions, generally containing highly repetitive DNAs, are characterized by the presence of specific human DNA sequences and an absence of homology with gorilla DNA sequences. The only exception is the pericentromeric area of human chromosome 9, which, in addition to a large block of human DNA, also contains a region of homology with gorilla DNA sequences; the localization of these sequences coincides with that of human satellite III. Since highly repetitive DNAs are known for their high mutation frequency, we hypothesized that the chromosome 9 pericentromeric DNA conserved in human chromosomes and deriving from the gorilla genome may thus play some important functional role.

  8. Antibody Recognition of a Highly Conserved Influenza Virus Epitope

    Energy Technology Data Exchange (ETDEWEB)

    Ekiert, Damian C.; Bhabha, Gira; Elsliger, Marc-André; Friesen, Robert H.E.; Jongeneelen, Mandy; Throsby, Mark; Goudsmit, Jaap; Wilson, Ian A.; Scripps; Crucell

    2009-05-21

    Influenza virus presents an important and persistent threat to public health worldwide, and current vaccines provide immunity to viral isolates similar to the vaccine strain. High-affinity antibodies against a conserved epitope could provide immunity to the diverse influenza subtypes and protection against future pandemic viruses. Cocrystal structures were determined at 2.2 and 2.7 angstrom resolutions for broadly neutralizing human antibody CR6261 Fab in complexes with the major surface antigen (hemagglutinin, HA) from viruses responsible for the 1918 H1N1 influenza pandemic and a recent lethal case of H5N1 avian influenza. In contrast to other structurally characterized influenza antibodies, CR6261 recognizes a highly conserved helical region in the membrane-proximal stem of HA1 and HA2. The antibody neutralizes the virus by blocking conformational rearrangements associated with membrane fusion. The CR6261 epitope identified here should accelerate the design and implementation of improved vaccines that can elicit CR6261-like antibodies, as well as antibody-based therapies for the treatment of influenza.

  9. Taking High Conservation Value from Forests to Freshwaters

    Science.gov (United States)

    Abell, Robin; Morgan, Siân K.; Morgan, Alexis J.

    2015-07-01

    The high conservation value (HCV) concept, originally developed by the Forest Stewardship Council, has been widely incorporated outside the forestry sector into companies' supply chain assessments and responsible purchasing policies, financial institutions' investment policies, and numerous voluntary commodity standards. Many, if not most, of these newer applications relate to production practices that are likely to affect freshwater systems directly or indirectly, yet there is little guidance as to whether or how HCV can be applied to water bodies. We focus this paper on commodity standards and begin by exploring how prominent standards currently address both HCVs and freshwaters. We then highlight freshwater features of high conservation importance and examine how well those features are captured by the existing HCV framework. We propose a new set of freshwater 'elements' for each of the six values and suggest an approach for identifying HCV Areas that takes out-of-fence line impacts into account, thereby spatially extending the scope of existing methods to define HCVs. We argue that virtually any non-marine HCV assessment, regardless of the production sector, should be expanded to include freshwater values, and we suggest how to put those recommendations into practice.

  10. A high-speed BCI based on code modulation VEP

    Science.gov (United States)

    Bin, Guangyu; Gao, Xiaorong; Wang, Yijun; Li, Yun; Hong, Bo; Gao, Shangkai

    2011-04-01

    Recently, electroencephalogram-based brain-computer interfaces (BCIs) have attracted much attention in the fields of neural engineering and rehabilitation due to their noninvasiveness. However, the low communication speed of current BCI systems greatly limits their practical application. In this paper, we present a high-speed BCI based on code modulation of visual evoked potentials (c-VEP). Thirty-two target stimuli were modulated by a time-shifted binary pseudorandom sequence. A multichannel identification method based on canonical correlation analysis (CCA) was used for target identification. The online system achieved an average information transfer rate (ITR) of 108 ± 12 bits min-1 on five subjects with a maximum ITR of 123 bits min-1 for a single subject.
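The identification principle can be sketched in miniature: every target flickers with the same pseudorandom sequence at a different circular lag, so the recorded response is matched against each lagged template and the best-matching lag names the target. This sketch uses a plain single-channel Pearson correlation as a stand-in for the paper's multichannel CCA, and the sequence and lags are illustrative assumptions.

```python
def circular_shift(seq, k):
    """Stimulation sequence for the target at lag k (targets differ only by lag)."""
    return seq[k:] + seq[:k]

def identify_target(response, base_seq, shifts):
    """Return the lag whose shifted template best correlates with the response.
    Plain correlation stands in for the multichannel CCA used in the paper."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        dx = sum((a - mx) ** 2 for a in x) ** 0.5
        dy = sum((b - my) ** 2 for b in y) ** 0.5
        return num / (dx * dy) if dx and dy else 0.0
    return max(shifts, key=lambda k: corr(response, circular_shift(base_seq, k)))
```

With a sequence whose autocorrelation is low away from lag zero (an m-sequence in practice), the correct lag dominates even for short, noisy recordings, which is what makes c-VEP spellers fast.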

  11. Dynamic Epigenetic Control of Highly Conserved Noncoding Elements

    KAUST Repository

    Seridi, Loqmane

    2014-10-07

    Background Many noncoding genomic loci have remained constant over long evolutionary periods, suggesting that they are exposed to strong selective pressures. The molecular functions of these elements have been partially elucidated, but the fundamental reason for their extreme conservation is still unknown. Results To gain new insights into the extreme selection of highly conserved noncoding elements (HCNEs), we used a systematic analysis of multi-omic data to study the epigenetic regulation of such elements during the development of Drosophila melanogaster. At the sequence level, HCNEs are GC-rich and have a characteristic oligomeric composition. They have higher levels of stable nucleosome occupancy than their flanking regions, and lower levels of mononucleosomes and H3.3, suggesting that these regions reside in compact chromatin. Furthermore, these regions showed remarkable modulations in histone modification and the expression levels of adjacent genes during development. Although HCNEs are primarily initiated late in replication, about 10% were related to early replication origins. Finally, HCNEs showed strong enrichment within lamina-associated domains. Conclusion HCNEs have distinct and protective sequence properties, undergo dynamic epigenetic regulation, and appear to be associated with the structural components of the chromatin, replication origins, and nuclear matrix. These observations indicate that such elements are likely to have essential cellular functions, and offer insights into their epigenetic properties.

  12. 7 CFR 760.821 - Compliance with highly erodible land and wetland conservation.

    Science.gov (United States)

    2010-01-01

    ... Disaster Program § 760.821 Compliance with highly erodible land and wetland conservation. (a) The highly erodible land and wetland conservation provisions of part 12 of this title apply to the receipt of disaster... participants must be in compliance with the highly erodible land and wetland conservation compliance...

  13. Highly Optimized Code Generation for Stencil Codes with Computation Reuse for GPUs

    Institute of Scientific and Technical Information of China (English)

    Wen-Jing Ma; Kan Gao; Guo-Ping Long

    2016-01-01

    Computation reuse is known as an effective optimization technique. However, due to the complexity of modern GPU architectures, there is yet not enough understanding regarding the intriguing implications of the interplay of computation reuse and hardware specifics on application performance. In this paper, we propose an automatic code generator for a class of stencil codes with inherent computation reuse on GPUs. For such applications, the proper reuse of intermediate results, combined with careful register and on-chip local memory usage, has profound implications on performance. Current state of the art does not address this problem in depth, partially due to the lack of a good program representation that can expose all potential computation reuse. In this paper, we leverage the computation overlap graph (COG), a simple representation of data dependence and data reuse with “element view”, to expose potential reuse opportunities. Using COG, we propose a portable code generation and tuning framework for GPUs. Compared with current state-of-the-art code generators, our experimental results show up to 56.7% performance improvement on modern GPUs such as NVIDIA C2050.
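Computation reuse in stencils can be shown in miniature with a sliding-window sum: the naive version recomputes every window from scratch, while the reuse version derives each window from the previous one with one add and one subtract. This scalar sketch only illustrates the reuse idea the abstract builds on, not the COG representation or GPU code generation.

```python
def stencil_naive(a, r):
    """(2r+1)-point sum stencil; each output recomputes its full window."""
    n = len(a)
    return [sum(a[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def stencil_reuse(a, r):
    """Same stencil, but each window is derived from its predecessor by one
    add and one subtract -- the computation-reuse idea in miniature."""
    n = len(a)
    out = [0] * n
    window = sum(a[:min(n, r + 1)])  # window for i = 0
    out[0] = window
    for i in range(1, n):
        if i + r < n:
            window += a[i + r]        # element entering the window
        if i - r - 1 >= 0:
            window -= a[i - r - 1]    # element leaving the window
        out[i] = window
    return out
```

The naive version does O(r) work per output; the reuse version does O(1). On a GPU the analogous saving is bought with registers and on-chip memory, which is exactly the trade-off the paper's tuning framework navigates.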

  14. Differences in evolutionary pressure acting within highly conserved ortholog groups

    Directory of Open Access Journals (Sweden)

    Aravind L

    2008-07-01

    Full Text Available Abstract Background In highly conserved widely distributed ortholog groups, the main evolutionary force is assumed to be purifying selection that enforces sequence conservation, with most divergence occurring by accumulation of neutral substitutions. Using a set of ortholog groups from prokaryotes, with a single representative in each studied organism, we asked the question if this evolutionary pressure is acting similarly on different subgroups of orthologs defined as major lineages (e.g. Proteobacteria or Firmicutes). Results Using correlations in entropy measures as a proxy for evolutionary pressure, we observed two distinct behaviors within our ortholog collection. The first subset of ortholog groups, called here informational, consisted mostly of proteins associated with information processing (i.e. translation, transcription, DNA replication) and the second, the non-informational ortholog groups, consisted mostly of proteins involved in metabolic pathways. The evolutionary pressure acting on non-informational proteins is more uniform relative to their informational counterparts. The non-informational proteins show a higher level of correlation between entropy profiles and more uniformity across subgroups. Conclusion The low correlation of entropy profiles in the informational ortholog groups suggests that the evolutionary pressure acting on the informational ortholog groups is not uniform across the different clades considered in this study. This might suggest "fine-tuning" of informational proteins in each lineage leading to lineage-specific differences in selection. This, in turn, could make these proteins less exchangeable between lineages. In contrast, the uniformity of the selective pressure acting on the non-informational groups might allow the exchange of the genetic material via lateral gene transfer.
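The entropy-profile machinery behind this comparison is compact enough to sketch: compute the Shannon entropy of each alignment column, then correlate the resulting per-position profiles between lineage subgroups. The toy alignment below is illustrative; the paper's actual measures and alignments are not reproduced here.

```python
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of one alignment column."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_profile(alignment):
    """Per-position entropy for an ungapped alignment (equal-length sequences)."""
    return [column_entropy(col) for col in zip(*alignment)]

def pearson(x, y):
    """Pearson correlation between two entropy profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0
```

A high correlation between two subgroups' profiles means the same positions are variable in both lineages, i.e. the selective pressure is acting uniformly; low correlation is the signature the authors read as lineage-specific fine-tuning.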

  15. Similarities between Students Receiving Dress Code Violations and Discipline Referrals at Newport Junior High School

    Science.gov (United States)

    Nicholson, Nikki

    2007-01-01

    Background: Looking at dress code violations and demographics surrounding kids breaking the rules. Purpose: To see if there is a connection between dress code violations and discipline referrals. Setting: Jr. High School; Study Sample: Students with dress code violations for one week; Intervention: N/A; Research Design: Correlational; and Control…

  16. High-Order Space-Time Methods for Conservation Laws

    Science.gov (United States)

    Huynh, H. T.

    2013-01-01

    Current high-order methods such as discontinuous Galerkin and/or flux reconstruction can provide effective discretization for the spatial derivatives. Together with a time discretization, such methods result in either too small a time step size in the case of an explicit scheme or a very large system in the case of an implicit one. To tackle these problems, two new high-order space-time schemes for conservation laws are introduced: the first is explicit and the second, implicit. The explicit method here, also called the moment scheme, achieves a Courant-Friedrichs-Lewy (CFL) condition of 1 for the case of one spatial dimension regardless of the degree of the polynomial approximation. (For standard explicit methods, if the spatial approximation is of degree p, then the time step sizes are typically proportional to 1/p^2.) Fourier analyses for the one and two-dimensional cases are carried out. The property of super accuracy (or super convergence) is discussed. The implicit method is a simplified but optimal version of the discontinuous Galerkin scheme applied to time. It reduces to a collocation implicit Runge-Kutta (RK) method for ordinary differential equations (ODE) called Radau IIA. The explicit and implicit schemes are closely related since they employ the same intermediate time levels, and the former can serve as a key building block in an iterative procedure for the latter. A limiting technique for the piecewise linear scheme is also discussed. The technique can suppress oscillations near a discontinuity while preserving accuracy near extrema. Preliminary numerical results are shown.
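What a CFL condition of 1 buys can be seen already in the simplest explicit scheme for a conservation law. The sketch below is first-order upwind for linear advection u_t + a u_x = 0 (not the paper's moment scheme): with cfl = a·dt/dx, stability requires cfl ≤ 1, and at cfl = 1 the update degenerates into an exact one-cell shift of the solution.

```python
def upwind_step(u, cfl):
    """One step of first-order upwind for u_t + a u_x = 0 (a > 0, periodic
    boundary), with cfl = a*dt/dx. At cfl = 1 the update is an exact shift."""
    n = len(u)
    return [u[i] - cfl * (u[i] - u[(i - 1) % n]) for i in range(n)]
```

High-order methods normally pay for spatial accuracy with a shrinking stable time step (the 1/p^2 scaling quoted in the abstract); keeping CFL = 1 independent of p is what makes the moment scheme attractive.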

  17. Benchmark of Different Electromagnetic Codes for the High Frequency Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Kai Tian, Haipeng Wang, Frank Marhauser, Guangfeng Cheng, Chuandong Zhou

    2009-05-01

    In this paper, we present benchmarking results for high-class 3D electromagnetic (EM) codes in designing RF cavities today. These codes include Omega3P [1], VORPAL [2], CST Microwave Studio [3], Ansoft HFSS [4], and ANSYS [5]. Two spherical cavities are selected as the benchmark models. We have compared not only the accuracy of resonant frequencies, but also that of surface EM fields, which are critical for superconducting RF cavities. By removing degenerate modes, we calculate all the resonant modes up to 10 GHz with similar mesh densities, so that the geometry approximation and field interpolation error related to the wavelength can be observed.

  18. Conservation laws of high-order nonlinear PDEs and the variational conservation laws in the class with mixed derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Narain, R; Kara, A H, E-mail: Abdul.Kara@wits.ac.z [School of Mathematics, University of the Witwatersrand, Wits 2050, Johannesburg (South Africa)

    2010-02-26

    The construction of conserved vectors using Noether's theorem via a knowledge of a Lagrangian (or via the recently developed concept of partial Lagrangians) is well known. The formulas to determine these for higher order flows are somewhat cumbersome but peculiar and become more so as the order increases. We carry out these for a class of high-order partial differential equations from mathematical physics and then consider some specific ones with mixed derivatives. In the latter set of examples, our main focus is that the resultant conserved flows display some previously unknown interesting 'divergence properties' owing to the presence of the mixed derivatives. Overall, we consider a large class of equations of interest and construct some new conservation laws.
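As a minimal illustration of the Noether construction the abstract applies (chosen here for brevity; it is not one of the paper's mixed-derivative examples), consider the wave equation and its time-translation symmetry:

```latex
% Lagrangian and its Euler-Lagrange equation (the wave equation)
L = \tfrac{1}{2}u_t^2 - \tfrac{1}{2}u_x^2, \qquad u_{tt} - u_{xx} = 0.
% Time-translation symmetry X = \partial_t yields the conserved vector
T^t = \tfrac{1}{2}\left(u_t^2 + u_x^2\right), \qquad T^x = -\,u_t u_x,
% which satisfies the conservation law on solutions:
D_t T^t + D_x T^x = u_t\left(u_{tt} - u_{xx}\right) = 0.
```

Here T^t is the energy density and T^x the energy flux. For the higher-order and mixed-derivative equations treated in the paper, the same recipe applies but the formulas for the conserved vector acquire the additional, more cumbersome terms the authors refer to.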

  19. Context-adaptive binary arithmetic coding with precise probability estimation and complexity scalability for high-efficiency video coding

    Science.gov (United States)

    Karwowski, Damian; Domański, Marek

    2016-01-01

    An improved context-based adaptive binary arithmetic coding (CABAC) is presented. The idea for the improvement is to use a more accurate mechanism for estimation of symbol probabilities in the standard CABAC algorithm. The authors' proposal of such a mechanism is based on the context-tree weighting technique. In the framework of a high-efficiency video coding (HEVC) video encoder, the improved CABAC allows 0.7% to 4.5% bitrate saving compared to the original CABAC algorithm. The application of the proposed algorithm marginally affects the complexity of HEVC video encoder, but the complexity of video decoder increases by 32% to 38%. In order to decrease the complexity of video decoding, a new tool has been proposed for the improved CABAC that enables scaling of the decoder complexity. Experiments show that this tool gives 5% to 7.5% reduction of the decoding time while still maintaining high efficiency in the data compression.
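The kind of per-context probability model underlying context-tree weighting can be sketched with the Krichevsky-Trofimov estimator, its classical building block. This is a sketch of the statistical idea only, not the standardized CABAC state machine nor the authors' exact mechanism.

```python
class KTEstimator:
    """Krichevsky-Trofimov sequential estimator: the per-context
    symbol-probability model at the heart of context-tree weighting."""

    def __init__(self):
        self.zeros = 0
        self.ones = 0

    def p_one(self):
        # Bayesian estimate with a (1/2, 1/2) prior; never exactly 0 or 1,
        # which keeps arithmetic-coding intervals well defined.
        return (self.ones + 0.5) / (self.zeros + self.ones + 1.0)

    def update(self, bit):
        # Sharpen the estimate after coding each binary symbol.
        if bit:
            self.ones += 1
        else:
            self.zeros += 1
```

Standard CABAC approximates this adaptation with a small finite-state table; replacing it with a more precise estimator is what yields the bitrate savings quoted above, at the cost of extra decoder work.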

  20. One Packet Suffices - Highly Efficient Packetized Network Coding With Finite Memory

    CERN Document Server

    Haeupler, Bernhard

    2011-01-01

    Random Linear Network Coding (RLNC) has emerged as a powerful tool for robust high-throughput multicast. Projection analysis - a recently introduced technique - shows that the distributed packetized RLNC protocol achieves (order) optimal and perfectly pipelined information dissemination in many settings. In the original approach to RLNC intermediate nodes code together all available information. This requires intermediate nodes to keep considerable data available for coding. Moreover, it results in a coding complexity that grows linearly with the size of this data. While this has been identified as a problem, approaches that combine queuing theory and network coding have heretofore not provided a succinct representation of the memory needs of network coding at intermediate nodes. This paper shows the surprising result that, in all settings with a continuous stream of data, network coding continues to perform optimally even if only one packet per node is kept in active memory and used for computations. This l...
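The one-packet idea can be sketched over GF(2): a node stores a single coded packet, folds each arrival into it with a random coin flip (a random GF(2) coefficient), and forwards only the stored packet. This is an illustrative sketch of the finite-memory principle, with hypothetical names, not the paper's full protocol or analysis.

```python
import random

def combine(p, q):
    """XOR two equal-length packets (linear combination over GF(2))."""
    return bytes(a ^ b for a, b in zip(p, q))

class FiniteMemoryNode:
    """Keeps exactly one coded packet in active memory; each arrival is folded
    in with probability 1/2, and the stored packet is what gets forwarded."""

    def __init__(self, size, rng=random.random):
        self.packet = bytes(size)  # all-zero packet = the empty combination
        self.rng = rng             # injectable for deterministic testing

    def receive(self, incoming):
        if self.rng() < 0.5:  # random GF(2) coefficient for the new packet
            self.packet = combine(self.packet, incoming)

    def forward(self):
        return self.packet
```

Coding cost per arrival is a single XOR regardless of how much data has flowed through the node, which is the complexity saving over coding together all buffered information.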

  1. Electromagnetic PIC simulation with highly enhanced energy conservation

    CERN Document Server

    Yazdanpanah, J

    2011-01-01

    We have obtained an electromagnetic PIC (EM-PIC) algorithm based on time-space-extended particle in cell model. In this model particles are shaped objects extended over time and space around Lagrangian markers. Sources carried by these particles are weighted completely into centers and faces of time-space cells of simulation-domain. Weighting is obtained by implication of conservation of charge of shaped particles. By solving Maxwell's equations over source free zones of simulation grid we reduce solution of these equations to finding field values at nods of this grid. Major source of error in this model (and albeit other PIC models) is identified to be mismatching of particle marker location and location of its assigned sources in time and space. Relation of leapfrog scheme for integration of equations of motion with this discrepancy is investigated by evaluation of violation of energy conservation. We come in conclusion that instead of leapfrog we should integrate equations of motion simultaneously. Though ...

  2. Hydroxylation of a conserved tRNA modification establishes non-universal genetic code in echinoderm mitochondria.

    Science.gov (United States)

    Nagao, Asuteka; Ohara, Mitsuhiro; Miyauchi, Kenjyo; Yokobori, Shin-Ichi; Yamagishi, Akihiko; Watanabe, Kimitsuna; Suzuki, Tsutomu

    2017-09-01

    The genetic code is not frozen but still evolving, which can result in the acquisition of 'dialectal' codons that deviate from the universal genetic code. RNA modifications in the anticodon region of tRNAs play a critical role in establishing such non-universal genetic codes. In echinoderm mitochondria, the AAA codon specifies asparagine instead of lysine. By analyzing mitochondrial (mt-) tRNA(Lys) isolated from the sea urchin (Mesocentrotus nudus), we discovered a novel modified nucleoside, hydroxy-N(6)-threonylcarbamoyladenosine (ht(6)A), 3' adjacent to the anticodon (position 37). Biochemical analysis revealed that ht(6)A37 has the ability to prevent mt-tRNA(Lys) from misreading AAA as lysine, thereby indicating that hydroxylation of N(6)-threonylcarbamoyladenosine (t(6)A) contributes to the establishment of the non-universal genetic code in echinoderm mitochondria.
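The codon reassignment described above amounts to a one-entry change in the translation table. The sketch below shows only the AAA deviation stated in the abstract; the echinoderm mitochondrial code has a handful of other deviations from the universal code that are omitted here, and the small codon table is illustrative rather than complete.

```python
# Standard-code assignments for the codons used in this example.
STANDARD = {"AUG": "Met", "AAA": "Lys", "AAU": "Asn", "GGC": "Gly"}

# Echinoderm mitochondrial reassignment from the abstract: AAA specifies
# asparagine, not lysine (enforced by the ht6A37 tRNA modification).
ECHINODERM_MITO = dict(STANDARD, AAA="Asn")

def translate(mrna, table):
    """Translate an mRNA (length a multiple of 3) with the given codon table."""
    return [table[mrna[i:i + 3]] for i in range(0, len(mrna), 3)]
```

The same transcript thus yields different proteins under the two codes, which is why the hydroxylated anticodon-adjacent modification that blocks misreading of AAA as lysine is essential in this lineage.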

  3. Rewriting the Genetic Code.

    Science.gov (United States)

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  4. Thermal-hydraulic code selection for modular high temperature gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Komen, E.M.J.; Bogaard, J.P.A. van den

    1995-06-01

    In order to study the transient thermal-hydraulic system behaviour of modular high temperature gas-cooled reactors, the thermal-hydraulic computer codes RELAP5, MELCOR, THATCH, MORECA, and VSOP are considered at the Netherlands Energy Research Foundation ECN. This report presents the selection of the most appropriate codes. To cover the range of relevant accidents, a suite of three codes is recommended for analyses of HTR-M and MHTGR reactors. (orig.).

  5. Facile and High-Throughput Synthesis of Functional Microparticles with Quick Response Codes.

    Science.gov (United States)

    Ramirez, Lisa Marie S; He, Muhan; Mailloux, Shay; George, Justin; Wang, Jun

    2016-06-01

    Encoded microparticles are in high demand in multiplexed assays and labeling. However, the current methods for the synthesis and coding of microparticles either lack robustness and reliability, or possess limited coding capacity. Here, a massive coding of dissociated elements (MiCODE) technology based on an innovative chemically reactive off-stoichiometry thiol-allyl photocurable polymer and standard lithography to produce a large number of quick response (QR) code microparticles is introduced. The coding process is performed by photobleaching the QR code patterns on microparticles when fluorophores are incorporated into the prepolymer formulation. The fabricated encoded microparticles can be released from a substrate without changing their features. Excess thiol functionality on the microparticle surface allows for grafting of amine groups and further DNA probes. A multiplexed assay is demonstrated using the DNA-grafted QR code microparticles. The MiCODE technology is further characterized by showing the incorporation of BODIPY-maleimide (BDP-M) and Nile Red fluorophores for coding and the use of microcontact printing for immobilizing DNA probes on microparticle surfaces. This versatile technology leverages mature lithography facilities for fabrication and thus is amenable to scale-up in the future, with potential applications in bioassays and in labeling consumer products.

  6. High-Speed Turbo-TCM-Coded Orthogonal Frequency-Division Multiplexing Ultra-Wideband Systems

    Directory of Open Access Journals (Sweden)

    2006-01-01

    Full Text Available One of the UWB proposals in the IEEE P802.15 WPAN project is to use a multiband orthogonal frequency-division multiplexing (OFDM) system and punctured convolutional codes for UWB channels supporting a data rate up to 480 Mbps. In this paper, we improve the proposed system using turbo TCM with QAM constellation for higher data rate transmission. We construct punctured parity-concatenated trellis codes, in which a TCM code is used as the inner code and a simple parity-check code is employed as the outer code. The result shows that the system can offer a much higher spectral efficiency, for example, 1.2 Gbps, which is 2.5 times higher than the proposed system. We identify several essential requirements to achieve the high rate transmission, for example, frequency and time diversity and multilevel error protection. Results are confirmed by density evolution.

  7. Low Complexity Encoder of High Rate Irregular QC-LDPC Codes for Partial Response Channels

    Directory of Open Access Journals (Sweden)

    IMTAWIL, V.

    2011-11-01

    Full Text Available High rate irregular QC-LDPC codes based on circulant permutation matrices, for efficient encoder implementation, are proposed in this article. The structure of the code is an approximate lower triangular matrix. In addition, we present two novel efficient encoding techniques for generating redundant bits. The complexity of the encoder implementation depends on the number of parity bits of the code for the one-stage encoding and the length of the code for the two-stage encoding. The advantage of both encoding techniques is that few XOR-gates are used in the encoder implementation. Simulation results on partial response channels also show that the BER performance of the proposed code has gain over other QC-LDPC codes.
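
    Both encoding techniques above reduce parity generation to sparse XOR networks. A minimal sketch of the one-stage idea over GF(2), assuming (unlike the real circulant-based codes) a small toy parity-check matrix H whose parity columns form an exactly lower triangular block with a unit diagonal:

```python
def encode_lower_triangular(H, msg):
    """One-stage encoding sketch: solve H @ [msg | parity]^T = 0 over GF(2)
    by back-substitution, assuming the parity part of H is lower
    triangular with a unit diagonal."""
    m, k = len(H), len(msg)
    parity = []
    for i in range(m):
        s = 0
        for j in range(k):                # message contribution (XORs)
            s ^= H[i][j] & msg[j]
        for j in range(len(parity)):      # previously solved parity bits
            s ^= H[i][k + j] & parity[j]
        parity.append(s)                  # unit diagonal => p_i = s
    return msg + parity
```

    Each parity bit costs only as many XOR gates as its row has nonzero entries, which is the source of the low encoder complexity claimed in the abstract.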

  9. Assessment of selected conservation measures for high-temperature process industries

    Energy Technology Data Exchange (ETDEWEB)

    Kusik, C L; Parameswaran, K; Nadkarni, R; O'Neill, J K; Malhotra, S; Hyde, R; Kinneberg, D; Fox, L; Rossetti, M

    1981-01-01

    Energy conservation projects involving high-temperature processes in various stages of development are assessed to quantify their energy conservation potential; to determine their present status of development; to identify their research and development needs and estimate the associated costs; and to determine the most effective role for the Federal government in developing these technologies. The program analyzed 25 energy conserving processes in the iron and steel, aluminium, copper, magnesium, cement, and glassmaking industries. A preliminary list of other potential energy conservation projects in these industries is also presented in the appendix. (MCW)

  11. 7 CFR 1430.225 - Violations of highly erodible land and wetland conservation provisions.

    Science.gov (United States)

    2010-01-01

    ... wetland conservation provisions. The provisions of part 12 of this title apply to this part. ... 7 CFR 1430.225 (Title 7: Agriculture, revised as of 2010-01-01): Violations of highly erodible land and wetland conservation provisions. Regulations of the Department of...

  12. 7 CFR 1412.68 - Compliance with highly erodible land and wetland conservation provisions.

    Science.gov (United States)

    2010-01-01

    ... and wetland conservation provisions. The provisions of part 12 of this title apply to this part. ... 7 CFR 1412.68 (Title 7: Agriculture, revised as of 2010-01-01): Compliance with highly erodible land and wetland conservation provisions. Regulations of the Department of...

  13. The human HNRPD locus maps to 4q21 and encodes a highly conserved protein.

    Science.gov (United States)

    Dempsey, L A; Li, M J; DePace, A; Bray-Ward, P; Maizels, N

    1998-05-01

    The hnRNP D protein interacts with nucleic acids both in vivo and in vitro. Like many other proteins that interact with RNA, it contains RBD (or "RRM") domains and arg-gly-gly (RGG) motifs. We have examined the organization and localization of the human and murine genes that encode the hnRNP D protein. Comparison of the predicted sequences of the hnRNP D proteins in human and mouse shows that they are 96.9% identical (98.9% similar). This very high level of conservation suggests a critical function for hnRNP D. Sequence analysis of the human HNRPD gene shows that the protein is encoded by eight exons and that two additional exons specify sequences in the 3' UTR. Use of two of the coding exons is determined by alternative splicing of the HNRPD mRNA. The human HNRPD gene maps to 4q21. The mouse Hnrpd gene maps to the F region of chromosome 3, which is syntenic with the human 4q21 region.
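
    The identity and similarity figures quoted above come from straightforward pairwise comparison of aligned sequences. A minimal sketch (toy version: equal-length aligned sequences, no gap handling or similarity scoring):

```python
def percent_identity(a: str, b: str) -> float:
    """Percent identity of two equal-length aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)
```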

  14. Least Reliable Bits Coding (LRBC) for high data rate satellite communications

    Science.gov (United States)

    Vanderaar, Mark; Wagner, Paul; Budinger, James

    1992-02-01

    An analysis and discussion of a bandwidth-efficient multi-level/multi-stage block coded modulation technique called Least Reliable Bits Coding (LRBC) is presented. LRBC uses simple multi-level component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Further, soft-decision multi-stage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Using analytical expressions and tight performance bounds, it is shown that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of Binary Phase Shift Keying (BPSK). Bit error rates (BER) vs. channel bit energy with Additive White Gaussian Noise (AWGN) are given for a set of LRB Reed-Solomon (RS) encoded 8PSK modulation formats with an ensemble rate of 8/9. All formats exhibit a spectral efficiency of 2.67 = (log2(8))(8/9) information bps/Hz. Bit-by-bit coded and uncoded error probabilities with soft-decision information are determined. These are traded off against code rate to determine parameters that achieve good performance. The relative simplicity of Galois field algebra vs. the Viterbi algorithm and the availability of high-speed commercial Very Large Scale Integration (VLSI) for block codes indicate that LRBC using block codes is a desirable method for high data rate implementations.
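
    The quoted 2.67 bps/Hz follows directly from the constellation size and the ensemble code rate; as a quick check:

```python
from math import log2

def spectral_efficiency(constellation_size: int, code_rate: float) -> float:
    """Information bits per channel symbol: log2(M) * r."""
    return log2(constellation_size) * code_rate

eta = spectral_efficiency(8, 8 / 9)   # 8PSK at ensemble rate 8/9
print(round(eta, 2))                  # 2.67
```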

  15. Telomeric expression sites are highly conserved in Trypanosoma brucei.

    Directory of Open Access Journals (Sweden)

    Christiane Hertz-Fowler

    Full Text Available Subtelomeric regions are often under-represented in genome sequences of eukaryotes. One of the best known examples of the use of telomere proximity for adaptive purposes is the bloodstream expression sites (BESs) of the African trypanosome Trypanosoma brucei. To enhance our understanding of BES structure and function in host adaptation and immune evasion, the BES repertoire from the Lister 427 strain of T. brucei was independently tagged and sequenced. BESs are polymorphic in size and structure but reveal a surprisingly conserved architecture in the context of extensive recombination. Very small BESs do exist and many functioning BESs do not contain the full complement of expression site associated genes (ESAGs). The consequences of duplicated or missing ESAGs, including ESAG9, a newly named ESAG12, and additional variant surface glycoprotein genes (VSGs), were evaluated by functional assays after BESs were tagged with a drug-resistance gene. Phylogenetic analysis of constituent ESAG families suggests that BESs are sequence mosaics and that extensive recombination has shaped the evolution of the BES repertoire. This work opens important perspectives in understanding the molecular mechanisms of antigenic variation, a widely used strategy for immune evasion in pathogens, and telomere biology.

  16. Multi-Layer Extension of the High Efficiency Video Coding (HEVC) Standard

    Institute of Scientific and Technical Information of China (English)

    Ming Li; Ping Wu

    2016-01-01

    Multi-layer extension is based on the single-layer design of the High Efficiency Video Coding (HEVC) standard and is employed as the common structure for the scalability and multi-view video coding extensions of HEVC. In this paper, an overview of the multi-layer extension is presented. The concepts and advantages of the multi-layer extension are briefly described. High-level syntax (HLS) for the multi-layer extension and several new designs are also detailed.

  17. 2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries

    Science.gov (United States)

    Colby, Jennifer

    2015-01-01

    This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…

  18. Highly conserved elements discovered in vertebrates are present in non-syntenic loci of tunicates, act as enhancers and can be transcribed during development

    Science.gov (United States)

    Sanges, Remo; Hadzhiev, Yavor; Gueroult-Bellone, Marion; Roure, Agnes; Ferg, Marco; Meola, Nicola; Amore, Gabriele; Basu, Swaraj; Brown, Euan R.; De Simone, Marco; Petrera, Francesca; Licastro, Danilo; Strähle, Uwe; Banfi, Sandro; Lemaire, Patrick; Birney, Ewan; Müller, Ferenc; Stupka, Elia

    2013-01-01

    Co-option of cis-regulatory modules has been suggested as a mechanism for the evolution of expression sites during development. However, the extent and mechanisms involved in mobilization of cis-regulatory modules remains elusive. To trace the history of non-coding elements, which may represent candidate ancestral cis-regulatory modules affirmed during chordate evolution, we have searched for conserved elements in tunicate and vertebrate (Olfactores) genomes. We identified, for the first time, 183 non-coding sequences that are highly conserved between the two groups. Our results show that all but one element are conserved in non-syntenic regions between vertebrate and tunicate genomes, while being syntenic among vertebrates. Nevertheless, in all the groups, they are significantly associated with transcription factors showing specific functions fundamental to animal development, such as multicellular organism development and sequence-specific DNA binding. The majority of these regions map onto ultraconserved elements and we demonstrate that they can act as functional enhancers within the organism of origin, as well as in cross-transgenesis experiments, and that they are transcribed in extant species of Olfactores. We refer to the elements as ‘Olfactores conserved non-coding elements’. PMID:23393190

  19. Texas High Plains Initiative for Strategic and Innovative Irrigation Management and Conservation

    National Research Council Canada - National Science Library

    Weinheimer, Justin; Johnson, Phillip; Mitchell, Donna; Johnson, Jeff; Kellison, Rick

    2013-01-01

    The strategic management of irrigation applications to improve water‐use efficiency and meet economic objectives has been identified as a key factor in the conservation of water resources in the Texas High Plains region...

  20. A high-speed full-field profilometry with coded laser strips projection

    Science.gov (United States)

    Zhang, Guanliang; Zhou, Xiang; Jin, Rui; Xu, Changda; Li, Dong

    2017-06-01

    Line-structured-light measurement needs an accurate mechanical movement device and a high-frame-rate camera, which are difficult to realize. We propose a high-speed full-field profilometry to overcome these difficulties, using coded laser strips projected by a MEMS scanning mirror. The mirror can take the place of the mechanical movement device thanks to its high speed and accuracy. In addition, a method combining gray codes and color codes is used to decrease the number of projected frames, retaining the advantages of line-structured-light measurement. In the experiment, we use a laser MEMS scanner and two color cameras. The laser MEMS scanner projects the coded stripes, and the two color cameras collect the modulated patterns on the measured object. The color cameras form a stereo vision system so that the three-dimensional data are reconstructed by triangulation.
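
    Gray coding is attractive for stripe indexing because consecutive stripe indices differ in exactly one projected bit pattern, which confines decoding errors at stripe boundaries to one index. The standard binary-reflected Gray code (a generic sketch, not necessarily the authors' exact coding scheme):

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Invert the Gray code by cumulative XOR of shifted copies."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```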

  1. A Compressible High-Order Unstructured Spectral Difference Code for Stratified Convection in Rotating Spherical Shells

    CERN Document Server

    Wang, Junfeng; Miesch, Mark S

    2015-01-01

    We present a novel and powerful Compressible High-ORder Unstructured Spectral-difference (CHORUS) code for simulating thermal convection and related fluid dynamics in the interiors of stars and planets. The computational geometries are treated as rotating spherical shells filled with stratified gas. The hydrodynamic equations are discretized by a robust and efficient high-order Spectral Difference Method (SDM) on unstructured meshes. The computational stencil of the spectral difference method is compact and advantageous for parallel processing. CHORUS demonstrates excellent parallel performance for all test cases reported in this paper, scaling up to 12,000 cores on the Yellowstone High-Performance Computing cluster at NCAR. The code is verified by defining two benchmark cases for global convection in Jupiter and the Sun. CHORUS results are compared with results from the ASH code and good agreement is found. The CHORUS code creates new opportunities for simulating such varied phenomena as multi-scale solar co...

  2. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  3. High-throughput, kingdom-wide prediction and annotation of bacterial non-coding RNAs.

    Directory of Open Access Journals (Sweden)

    Jonathan Livny

    Full Text Available BACKGROUND: Diverse bacterial genomes encode numerous small non-coding RNAs (sRNAs) that regulate myriad biological processes. While bioinformatic algorithms have proven effective in identifying sRNA-encoding loci, the lack of tools and infrastructure with which to execute these computationally demanding algorithms has limited their utilization. Genome-wide predictions of sRNA-encoding genes have been conducted in less than 3% of all sequenced bacterial strains. This relative paucity of genome-wide sRNA prediction has left critical gaps in current annotations of bacterial genomes and has limited examination of larger issues in sRNA biology, such as sRNA evolution. METHODOLOGY/PRINCIPAL FINDINGS: We have developed and deployed SIPHT, a high-throughput computational tool that utilizes workflow management and distributed computing to effectively conduct kingdom-wide predictions and annotations of intergenic sRNA-encoding genes. Candidate sRNA-encoding loci are identified based on the presence of putative Rho-independent terminators downstream of conserved intergenic sequences, and each locus is annotated for several features, including conservation in other species, association with one of several transcription factor binding sites and homology to any of over 300 previously identified sRNAs and cis-regulatory RNA elements. Using SIPHT, we conducted searches for putative sRNA-encoding genes in all 932 bacterial replicons in the NCBI database. These searches yielded nearly 60% of previously confirmed sRNAs, hundreds of previously annotated cis-encoded regulatory RNA elements such as riboswitches, and over 45,000 novel candidate intergenic loci. CONCLUSIONS/SIGNIFICANCE: Candidate loci were identified across all branches of the bacterial evolutionary tree, suggesting a central and ubiquitous role for RNA-mediated regulation among bacterial species. Annotation of candidate loci by SIPHT provides clues
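
    SIPHT's candidate criterion (a putative Rho-independent terminator downstream of a conserved intergenic sequence) can be caricatured as a scan for an inverted repeat followed by a poly-U tract. A deliberately crude sketch with made-up thresholds, nowhere near the real predictor's scoring:

```python
def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def has_rho_independent_terminator(seq, stem=6, loop_max=8, u_run=4):
    """Crude scan for a hairpin (perfect inverted repeat of length `stem`
    with a short loop) immediately followed by a poly-U run (T in DNA),
    the hallmark used to flag candidate sRNA loci."""
    for i in range(len(seq) - 2 * stem - u_run):
        left = seq[i:i + stem]
        for gap in range(3, loop_max + 1):
            j = i + stem + gap
            if seq[j:j + stem] == revcomp(left):
                if seq[j + stem:j + stem + u_run] == "T" * u_run:
                    return True
    return False
```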

  4. Identification and characterization of novel and conserved microRNAs in radish (Raphanus sativus L.) using high-throughput sequencing.

    Science.gov (United States)

    Xu, Liang; Wang, Yan; Xu, Yuanyuan; Wang, Liangju; Zhai, Lulu; Zhu, Xianwen; Gong, Yiqin; Ye, Shan; Liu, Liwang

    2013-03-01

    MicroRNAs (miRNAs) are endogenous, non-coding, small RNAs that play significant regulatory roles in plant growth, development, and biotic and abiotic stress responses. To date, a great number of conserved and species-specific miRNAs have been identified in many important plant species such as Arabidopsis, rice and poplar. However, little is known about identification of miRNAs and their target genes in radish (Raphanus sativus L.). In the present study, a small RNA library from radish root was constructed and sequenced using high-throughput Solexa sequencing. Through sequence alignment and secondary structure prediction, a total of 545 conserved miRNA families as well as 15 novel (with their miRNA* strand) and 64 potentially novel miRNAs were identified. Quantitative real-time PCR (qRT-PCR) analysis confirmed that both conserved and novel miRNAs were expressed in radish, and some of them were preferentially expressed in certain tissues. A total of 196 potential target genes were predicted for 42 novel radish miRNAs. Gene ontology (GO) analysis showed that most of the targets were involved in plant growth, development, metabolism and stress responses. This study represents the first large-scale identification and characterization of radish miRNAs and their potential target genes. These results could lead to the further identification of radish miRNAs and enhance our understanding of radish miRNA regulatory mechanisms in diverse biological and metabolic processes.
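
    Computational target prediction of the kind used here typically begins with seed complementarity (miRNA nucleotides 2-8). A minimal first-pass filter (illustrative only, with a toy miRNA sequence; real pipelines also score pairing energy and conservation):

```python
def revcomp_rna(seq: str) -> str:
    """Reverse complement of an RNA sequence."""
    return seq.translate(str.maketrans("ACGU", "UGCA"))[::-1]

def seed_match_sites(mirna: str, target_mrna: str):
    """0-based positions in the target mRNA that are perfectly
    complementary to the miRNA seed region (nucleotides 2-8)."""
    probe = revcomp_rna(mirna[1:8])
    return [i for i in range(len(target_mrna) - len(probe) + 1)
            if target_mrna[i:i + len(probe)] == probe]

sites = seed_match_sites("UAGCUUAUCAG", "GGGAUAAGCUCCC")  # toy sequences
```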

  5. High-confidence coding and noncoding transcriptome maps

    Science.gov (United States)

    2017-01-01

    The advent of high-throughput RNA sequencing (RNA-seq) has led to the discovery of unprecedentedly immense transcriptomes encoded by eukaryotic genomes. However, the transcriptome maps are still incomplete partly because they were mostly reconstructed based on RNA-seq reads that lack their orientations (known as unstranded reads) and certain boundary information. Methods to expand the usability of unstranded RNA-seq data by predetermining the orientation of the reads and precisely determining the boundaries of assembled transcripts could significantly benefit the quality of the resulting transcriptome maps. Here, we present a high-performing transcriptome assembly pipeline, called CAFE, that significantly improves the original assemblies, respectively assembled with stranded and/or unstranded RNA-seq data, by orienting unstranded reads using the maximum likelihood estimation and by integrating information about transcription start sites and cleavage and polyadenylation sites. Applying large-scale transcriptomic data comprising 230 billion RNA-seq reads from the ENCODE, Human BodyMap 2.0, The Cancer Genome Atlas, and GTEx projects, CAFE enabled us to predict the directions of about 220 billion unstranded reads, which led to the construction of more accurate transcriptome maps, comparable to the manually curated map, and a comprehensive lncRNA catalog that includes thousands of novel lncRNAs. Our pipeline should not only help to build comprehensive, precise transcriptome maps from complex genomes but also to expand the universe of noncoding genomes. PMID:28396519
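
    The read-orientation step can be caricatured as a per-locus maximum likelihood vote: each orientation-informative signal (e.g. a splice-site motif or an overlapping stranded read) is treated as correct with probability 1 − ε, and the strand with the higher likelihood wins. This is a toy model, not CAFE's actual estimator:

```python
from math import log

def orient_by_mle(plus_votes: int, minus_votes: int, err: float = 0.01) -> str:
    """Assign a strand by maximum likelihood, treating each vote as
    correct with probability 1 - err. Returns '+', '-', or '.' on a tie."""
    ll_plus = plus_votes * log(1 - err) + minus_votes * log(err)
    ll_minus = minus_votes * log(1 - err) + plus_votes * log(err)
    if ll_plus > ll_minus:
        return "+"
    if ll_minus > ll_plus:
        return "-"
    return "."
```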

  6. High-capacity quantum Fibonacci coding for key distribution

    Science.gov (United States)

    Simon, David S.; Lawrence, Nate; Trevino, Jacob; Dal Negro, Luca; Sergienko, Alexander V.

    2013-03-01

    Quantum cryptography and quantum key distribution (QKD) have been the most successful applications of quantum information processing, highlighting the unique capability of quantum mechanics, through the no-cloning theorem, to securely share encryption keys between two parties. Here, we present an approach to high-capacity, high-efficiency QKD by exploiting cross-disciplinary ideas from quantum information theory and the theory of light scattering of aperiodic photonic media. We propose a unique type of entangled-photon source, as well as a physical mechanism for efficiently sharing keys. The key-sharing protocol combines entanglement with the mathematical properties of a recursive sequence to allow a realization of the physical conditions necessary for implementation of the no-cloning principle for QKD, while the source produces entangled photons whose orbital angular momenta (OAM) are in a superposition of Fibonacci numbers. The source is used to implement a particular physical realization of the protocol by randomly encoding the Fibonacci sequence onto entangled OAM states, allowing secure generation of long keys from few photons. Unlike in polarization-based protocols, reference frame alignment is unnecessary, while the required experimental setup is simpler than other OAM-based protocols capable of achieving the same capacity and its complexity grows less rapidly with increasing range of OAM used.
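
    The protocol hinges on the Fibonacci recurrence F(n) = F(n-1) + F(n-2), which supplies the set of OAM values the source superposes. Generating candidate values (the starting terms and indexing here are illustrative, not taken from the paper):

```python
def fibonacci(n: int):
    """First n terms of a Fibonacci sequence, used here as candidate
    OAM values for the encoding."""
    fibs = [1, 2]
    while len(fibs) < n:
        fibs.append(fibs[-1] + fibs[-2])
    return fibs[:n]
```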

  7. A high-resolution code for large eddy simulation of incompressible turbulent boundary layer flows

    KAUST Repository

    Cheng, Wan

    2014-03-01

    We describe a framework for large eddy simulation (LES) of incompressible turbulent boundary layers over a flat plate. This framework uses a fractional-step method with fourth-order finite difference on a staggered mesh. We present several laminar examples to establish the fourth-order accuracy and energy conservation property of the code. Furthermore, we implement a recycling method to generate turbulent inflow. We use the stretched spiral vortex subgrid-scale model and virtual wall model to simulate the turbulent boundary layer flow. We find that the case with Reθ ≈ 2.5 × 10⁵ agrees well with available experimental measurements of wall friction, streamwise velocity profiles and turbulent intensities. We demonstrate that for cases with extremely large Reynolds numbers (Reθ = 10¹²), the present LES can reasonably predict the flow with a coarse mesh. The parallel implementation of the LES code demonstrates reasonable scaling on O(10³) cores. © 2013 Elsevier Ltd.
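
    The fourth-order accuracy claim is easy to verify for the standard five-point central first-derivative stencil (a generic stencil, not necessarily the staggered-mesh operators used in the code): halving h should cut the error by about 2⁴ = 16.

```python
import math

def d1_fourth_order(f, x, h):
    """Fourth-order central difference approximation of f'(x)."""
    return (-f(x + 2 * h) + 8 * f(x + h) - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)

err = lambda h: abs(d1_fourth_order(math.sin, 1.0, h) - math.cos(1.0))
print(err(0.02) / err(0.01))   # ~16, confirming O(h^4) convergence
```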

  8. Homologous high-throughput expression and purification of highly conserved E. coli proteins

    Directory of Open Access Journals (Sweden)

    Duchmann Rainer

    2007-06-01

    Full Text Available Abstract Background Genetic factors and a dysregulated immune response towards commensal bacteria contribute to the pathogenesis of Inflammatory Bowel Disease (IBD). Animal models demonstrated that the normal intestinal flora is crucial for the development of intestinal inflammation. However, due to the complexity of the intestinal flora, it has been difficult to design experiments for detection of proinflammatory bacterial antigen(s) involved in the pathogenesis of the disease. Several studies indicated a potential association of E. coli with IBD. In addition, T cell clones of IBD patients were shown to cross-react towards antigens from different enteric bacterial species and thus likely responded to conserved bacterial antigens. We therefore chose highly conserved E. coli proteins as candidate antigens for abnormal T cell responses in IBD and used high-throughput techniques for cloning, expression and purification under native conditions of a set of 271 conserved E. coli proteins for downstream immunologic studies. Results As a standardized procedure, genes were PCR amplified and cloned into the expression vector pQTEV2 in order to express proteins N-terminally fused to a seven-histidine-tag. Initial small-scale expression and purification under native conditions by metal chelate affinity chromatography indicated that the vast majority of target proteins were purified in high yields. Targets that revealed low yields after purification, probably due to weak solubility, were shuttled into Gateway (Invitrogen) destination vectors in order to enhance solubility by N-terminal fusion of maltose binding protein (MBP), N-utilizing substance A (NusA), or glutathione S-transferase (GST) to the target protein. In addition, recombinant proteins were treated with polymyxin B coated magnetic beads in order to remove lipopolysaccharide (LPS). 
Thus, 73% of the targeted proteins could be expressed and purified in large-scale to give soluble proteins in the range of 500

  9. High conservation of a 5' element required for RNA editing of a C target in chloroplast psbE transcripts.

    Science.gov (United States)

    Hayes, Michael L; Hanson, Maureen R

    2008-09-01

    C-to-U editing modifies 30-40 distinct nucleotides within higher-plant chloroplast transcripts. Many C targets are located at the same position in homologous genes from different plants; these either could have emerged independently or could share a common origin. The 5' sequence GCCGUU, required for editing of C214 in tobacco psbE in vitro, is one of the few identified editing cis-elements. We investigated psbE sequences from many plant species to determine in what lineage(s) editing of psbE C214 emerged and whether the cis-element identified in tobacco is conserved in plants with a C214. The GCCGUU sequence is present at a high frequency in plants that carry a C214 in psbE. However, Sciadopitys verticillata (Pinophyta) edits C214 despite the presence of nucleotide differences compared to the conserved cis-element. The C214 site in psbE genes is represented in members of four branches of spermatophytes but not in gnetophytes, resulting in the parsimonious prediction that editing of psbE C214 was present in the ancestor of spermatophytes. Extracts from chloroplasts from a species that has a difference in the motif and lacks the C target are incapable of editing tobacco psbE C214 substrates, implying that the critical trans-acting protein factors were not retained without a C target. Because noncoding sequences are less constrained than coding regions, we analyzed sequences 5' to two C editing targets located within coding regions to search for possible editing-related conserved elements. Putative editing cis-elements were uncovered in the 5' UTRs near editing sites psbL C2 and ndhD C2.
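
    Because the GCCGUU element sits immediately 5' of the edited C, screening homologous sequences for it reduces to a fixed-offset motif comparison. A toy check (mismatch-tolerant scoring, as the S. verticillata case would require, is omitted):

```python
def has_editing_cis_element(seq: str, c_index: int, motif: str = "GCCGUU") -> bool:
    """True if `motif` lies immediately 5' of the target C at `c_index`
    (toy check modeled on the GCCGUU element upstream of psbE C214)."""
    start = c_index - len(motif)
    return start >= 0 and seq[start:c_index] == motif
```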

  10. Adaptive uniform grayscale coded aperture design for high dynamic range compressive spectral imaging

    Science.gov (United States)

    Diaz, Nelson; Rueda, Hoover; Arguello, Henry

    2016-05-01

    Imaging spectroscopy is an important area with many applications in surveillance, agriculture and medicine. The disadvantage of conventional spectroscopy techniques is that they collect the whole datacube. In contrast, compressive spectral imaging systems capture snapshot compressive projections, which are the input of reconstruction algorithms to yield the underlying datacube. Common compressive spectral imagers use coded apertures to perform the coded projections. The coded apertures are the key elements in these imagers since they define the sensing matrix of the system. The proper design of the coded aperture entries leads to good quality in the reconstruction. In addition, the compressive measurements are prone to saturation due to the limited dynamic range of the sensor, hence the design of coded apertures must consider saturation. The saturation errors in compressive measurements are unbounded, and compressive sensing recovery algorithms only provide solutions for noise that is bounded, or bounded with high probability. In this paper, the design of uniform adaptive grayscale coded apertures (UAGCA) is proposed to improve the dynamic range of the estimated spectral images by reducing the saturation levels. The saturation is attenuated between snapshots using an adaptive filter which updates the entries of the grayscale coded aperture based on the previous snapshots. The coded apertures are optimized in terms of transmittance and number of grayscale levels. The advantage of the proposed method is the efficient use of the dynamic range of the image sensor. Extensive simulations show improvements in the image reconstruction of the proposed method compared with grayscale coded apertures (UGCA) and adaptive block-unblock coded apertures (ABCA) of up to 10 dB.
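
    The adaptive step between snapshots can be sketched as: wherever the previous measurement saturated, lower the corresponding coded-aperture transmittance, then re-quantize to the allowed grayscale levels. All parameter names and values below are illustrative, not from the paper:

```python
def update_aperture(transmittance, measurement, sat_level, step=0.25, levels=4):
    """Lower the transmittance wherever the previous snapshot saturated,
    then quantize to `levels` grayscale values (toy adaptive filter)."""
    out = []
    for t, m in zip(transmittance, measurement):
        if m >= sat_level:                      # saturated pixel: attenuate
            t = max(0.0, t - step)
        out.append(round(t * (levels - 1)) / (levels - 1))
    return out
```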

  11. Comparative analyses of six solanaceous transcriptomes reveal a high degree of sequence conservation and species-specific transcripts

    Directory of Open Access Journals (Sweden)

    Ouyang Shu

    2005-09-01

    Full Text Available Abstract Background The Solanaceae is a family of closely related species with diverse phenotypes that have been exploited for agronomic purposes. Previous studies involving a small number of genes suggested sequence conservation across the Solanaceae. The availability of large collections of Expressed Sequence Tags (ESTs) for the Solanaceae now provides the opportunity to assess sequence conservation and divergence on a genomic scale. Results All available ESTs and Expressed Transcripts (ETs; 449,224 sequences in total) for six Solanaceae species (potato, tomato, pepper, petunia, tobacco and Nicotiana benthamiana) were clustered and assembled into gene indices. Examination of gene ontologies revealed that the transcripts within the gene indices encode a similar suite of biological processes. Although the ESTs and ETs were derived from a variety of tissues, 55–81% of the sequences had significant similarity at the nucleotide level with sequences among the six species. Putative orthologs could be identified for 28–58% of the sequences. This high degree of sequence conservation was supported by expression profiling using heterologous hybridizations to potato cDNA arrays that showed similar expression patterns in mature leaves for all six solanaceous species. 16–19% of the transcripts within the six Solanaceae gene indices did not have matches among Solanaceae, Arabidopsis, rice or 21 other plant gene indices. Conclusion Results from this genome-scale analysis confirmed a high level of sequence conservation at the nucleotide level of the coding sequence among Solanaceae. Additionally, the results indicated that part of the Solanaceae transcriptome is likely to be unique for each species.

  12. Primary structure and promoter analysis of leghemoglobin genes of the stem-nodulated tropical legume Sesbania rostrata: conserved coding sequences, cis-elements and trans-acting factors

    DEFF Research Database (Denmark)

    Metz, B A; Welters, P; Hoffmann, H J;

    1988-01-01

    The primary structure of a leghemoglobin (lb) gene from the stem-nodulated, tropical legume Sesbania rostrata and two lb gene promoter regions was analysed. The S. rostrata lb gene structure and Lb amino acid composition were found to be highly conserved with previously described lb genes and Lb...

  13. Primary structure and promoter analysis of leghemoglobin genes of the stem-nodulated tropical legume Sesbania rostrata: conserved coding sequences, cis-elements and trans-acting factors

    DEFF Research Database (Denmark)

    Metz, B A; Welters, P; Hoffmann, H J

    1988-01-01

    proteins. Distinct DNA elements were identified in the S. rostrata lb promoter regions, which share a high degree of homology with cis-active regulatory elements found in the soybean (Glycine max) lbc3 promoter. One conserved DNA element was found to interact specifically with an apparently universal...

  14. SIGACE Code for Generating High-Temperature ACE Files; Validation and Benchmarking

    Science.gov (United States)

    Sharma, Amit R.; Ganesan, S.; Trkov, A.

    2005-05-01

    A code named SIGACE has been developed as a tool for MCNP users within the scope of a research contract awarded by the Nuclear Data Section of the International Atomic Energy Agency (IAEA) (Ref: 302-F4-IND-11566 B5-IND-29641). A new recipe has been evolved for generating high-temperature ACE files for use with the MCNP code. Under this scheme the low-temperature ACE file is first converted to an ENDF formatted file using the ACELST code and then Doppler broadened, essentially limited to the data in the resolved resonance region, to any desired higher temperature using SIGMA1. The SIGACE code then generates a high-temperature ACE file for use with the MCNP code. A thinning routine has also been introduced in the SIGACE code for reducing the size of the ACE files. The SIGACE code and the recipe for generating ACE files at higher temperatures have been applied to the SEFOR fast reactor benchmark problem (a sodium-cooled fast reactor benchmark described in the ENDF-202/BNL-19302, 1974 document). The calculated Doppler coefficient is in good agreement with the experimental value. A similar calculation using ACE files generated directly with the NJOY system also agrees with our SIGACE computed results. The SIGACE code and the recipe are further applied to study the numerical benchmark configuration of selected idealized PWR pin cell configurations with five different fuel enrichments as reported by Mosteller and Eisenhart. The SIGACE code, which has been tested with several FENDL/MC files, will be available free of cost, upon request, from the Nuclear Data Section of the IAEA.
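The Doppler-broadening step described above can be illustrated with a toy calculation: a Lorentzian resonance convolved with a Gaussian whose width grows with temperature. SIGMA1 uses the exact broadening kernel, so this is only a qualitative NumPy sketch with made-up numbers:

```python
import numpy as np

E = np.linspace(-50.0, 50.0, 2001)             # energy grid around a resonance at 0 (arbitrary units)
dE = E[1] - E[0]
gamma = 1.0                                    # resonance half-width
sigma0 = 1.0 / (1.0 + (E / gamma) ** 2)        # "cold" Lorentzian cross-section shape

def broaden(xs, width):
    """Convolve with a unit-area Gaussian of the given Doppler width."""
    kernel = np.exp(-(E / width) ** 2)
    kernel /= kernel.sum() * dE
    return np.convolve(xs, kernel, mode="same") * dE

hot = broaden(sigma0, width=3.0)               # higher temperature: larger Doppler width
```

The broadened resonance is lower and wider than the cold one while its area is (nearly) preserved, which is the qualitative effect the high-temperature ACE files must capture.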

  15. Code Division Multiplexing Using AI Based Custom Constellation Scheme - Efficient Modulation for High Data rate Transmission

    Directory of Open Access Journals (Sweden)

    K. Seshadri Sastry

    2010-06-01

    Full Text Available To achieve high bit rates in transmission over wireless channels, frequency reuse is an encouraging concept. Rather than dividing the allocated frequency spectrum into narrowband channels, one for each user, information is transmitted over a very wide frequency spectrum using the same carrier frequency within the same frequency band. In this paper we propose a code division multiplexing scheme in which the custom QAM modulator itself is used as the code. The proposed system is simulated and tested in Matlab 7.4.
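A minimal sketch of the QAM building block the abstract relies on: a generic 16-QAM mapper and nearest-level slicer in NumPy. The paper's custom constellation and its use as a spreading code are not reproduced here; this only shows the modulate/demodulate roundtrip surviving mild channel noise:

```python
import numpy as np

# Plain (non-Gray) 16-QAM mapping on a 4x4 grid of I/Q levels.
LEVELS = np.array([-3, -1, 1, 3])

def qam16_mod(bits):
    """Map groups of 4 bits to 16-QAM symbols (2 bits -> I level, 2 bits -> Q level)."""
    b = np.asarray(bits).reshape(-1, 4)
    i = LEVELS[b[:, 0] * 2 + b[:, 1]]
    q = LEVELS[b[:, 2] * 2 + b[:, 3]]
    return i + 1j * q

def qam16_demod(symbols):
    """Slice each received symbol to the nearest I/Q levels and unmap to bits."""
    out = []
    for s in np.atleast_1d(symbols):
        ii = int(np.abs(LEVELS - s.real).argmin())
        qq = int(np.abs(LEVELS - s.imag).argmin())
        out += [ii >> 1, ii & 1, qq >> 1, qq & 1]
    return np.array(out)

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 64)
noisy = qam16_mod(bits) + 0.1 * (rng.normal(size=16) + 1j * rng.normal(size=16))
recovered = qam16_demod(noisy)
```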

  16. FOTOMCAp: a new quasi-automatic code for high-precision photometry

    CERN Document Server

    Petrucci, Romina

    2016-01-01

    The search for Earth-like planets using the transit technique has encouraged the development of strategies to obtain light curves of increasingly high precision. In this context we developed the FOTOMCAp program, an IRAF quasi-automatic code which employs the aperture correction method and makes it possible to obtain high-precision light curves. In this contribution we describe how this code works and show results obtained for planetary transit light curves.

  17. FOTOMCAp: a new quasi-automatic code for high-precision photometry

    Science.gov (United States)

    Petrucci, R.; Jofré, J. E.

    2016-08-01

    The search for Earth-like planets using the transit technique has encouraged the development of strategies to obtain light curves with increasing precision. In this context we developed the FOTOMCAp program, an IRAF quasi-automatic code which employs the aperture correction method and makes it possible to obtain high-precision light curves. In this contribution we describe how this code works and show results obtained for planetary transit light curves.

  18. Analysis and compensation for code Doppler effect of BDS II signal under high dynamics

    Science.gov (United States)

    Ouyang, Xiaofeng; Zeng, Fangling

    2016-01-01

    In high-dynamic circumstances, the acquisition of BDS (BeiDou Navigation Satellite System) signals is affected by pseudo-code Doppler. The pseudo-code frequency shift is more prominent and complex now that BOC modulation has been adopted in BDS-II, but it is not yet addressed by current compensation algorithms. In addition, the most frequently used code Doppler compensation approach is to modify the sampling rate or the local bit rate, which not only increases the complexity of acquisition and tracking, but is also barely realizable in a hardware receiver, where the local frequency is difficult to modify. Therefore, this paper proposes a code Doppler compensation method based on a double estimator receiver, which simultaneously controls the NCO delay of the code tracking loop and the subcarrier tracking loop to compensate for the pseudo-code frequency shift. The simulation and test are implemented with the BDS-II BOC signal. The test results demonstrate that the proposed algorithm can effectively compensate for the pseudo-code Doppler of BOC signals and achieves a detection probability 3 dB higher than the uncompensated case when the false alarm rate is below 0.01 and the coherent integration time is 1 ms.
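The size of the pseudo-code Doppler being compensated follows from scaling the carrier Doppler by the ratio of chip rate to carrier frequency. A sketch with illustrative B1I-like values (the exact signal parameters used in the paper may differ):

```python
# Code-Doppler scaling: the chipping-rate shift equals the carrier Doppler
# multiplied by (chip rate / carrier frequency). Values below are illustrative.

CARRIER_HZ = 1.561098e9      # BDS B1I carrier frequency (Hz)
CHIP_RATE_HZ = 2.046e6       # B1I chipping rate (chips/s)

def code_doppler(carrier_doppler_hz):
    """Pseudo-code Doppler induced on the chip rate by a given carrier Doppler."""
    return carrier_doppler_hz * CHIP_RATE_HZ / CARRIER_HZ

# Under high dynamics a carrier Doppler of +/-50 kHz is plausible:
shift = code_doppler(50e3)   # chips/s of code-rate error to compensate
```

At 50 kHz of carrier Doppler the code rate is off by roughly 65 chips/s, which over a long coherent integration smears the correlation peak unless it is compensated, e.g. by steering the code/subcarrier NCOs as in the abstract.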

  19. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
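The kind of iterative parameter estimation PEST and PEST++ automate can be sketched at toy scale. The following is a generic damped Gauss-Newton fit of a two-parameter model, not the actual PEST++ algorithms or interfaces:

```python
import numpy as np

# Toy Gauss-Newton parameter estimation: fit p = (a, b) in y = a*exp(-b*t)
# to noiseless synthetic observations.

def model(p, t):
    return p[0] * np.exp(-p[1] * t)

def jacobian(p, t):
    e = np.exp(-p[1] * t)
    return np.column_stack([e, -p[0] * t * e])   # d(model)/da, d(model)/db

t = np.linspace(0, 4, 30)
p_true = np.array([2.0, 0.7])
obs = model(p_true, t)

p = np.array([1.0, 0.2])              # initial guess
for _ in range(20):
    r = obs - model(p, t)             # residuals
    J = jacobian(p, t)
    # Small Levenberg-style damping keeps the normal equations well conditioned.
    step = np.linalg.solve(J.T @ J + 1e-8 * np.eye(2), J.T @ r)
    p = p + step
```

Real environmental models differ mainly in scale: thousands of parameters, expensive forward runs, and the need for regularization and parallel run management, which is what motivates a framework like PEST++.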

  20. High sequence conservation among cucumber mosaic virus isolates from lily.

    Science.gov (United States)

    Chen, Y K; Derks, A F; Langeveld, S; Goldbach, R; Prins, M

    2001-08-01

    To classify Cucumber mosaic virus (CMV) isolates from ornamental crops of different geographical areas, the isolates were characterized by comparing the nucleotide sequences of RNA 4 and the encoded coat proteins. Both subgroups were represented among the ornamental-infecting CMV isolates. CMV isolates from Alstroemeria and crocus were classified as subgroup II isolates, whereas 8 other isolates, from lily, gladiolus, amaranthus, larkspur, and lisianthus, were identified as subgroup I members. In general, nucleotide sequence comparisons correlated well with geographic distribution, with one notable exception: the analyzed nucleotide sequences of 5 lily isolates showed remarkably high homology despite their different origins.

  1. High-order Lagrangian cell-centered conservative scheme on unstructured meshes

    Institute of Scientific and Technical Information of China (English)

    葛全文

    2014-01-01

    A high-order Lagrangian cell-centered conservative gas dynamics scheme is presented on unstructured meshes. A high-order piecewise pressure of the cell is introduced. With the high-order piecewise pressure of the cell, the high-order spatial discretization fluxes are constructed. The time discretization of the spatial fluxes is performed by means of Taylor expansions of the spatial discretization fluxes. The vertex velocities are evaluated in a consistent manner due to an original solver located at the nodes by means of momentum conservation. Many numerical tests are presented to demonstrate the robustness and accuracy of the scheme.
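The conservation property at the heart of such schemes is easiest to see in a much simpler setting. A first-order 1D finite-volume update with periodic boundaries (nothing like the paper's high-order Lagrangian scheme, but sharing the flux-form structure) conserves the total integral to round-off, because each interface flux is added to one cell and subtracted from its neighbour:

```python
import numpy as np

n, a = 100, 1.0                       # cells, advection speed
dx, dt = 1.0 / n, 0.5 / (n * a)       # CFL number = 0.5
x = (np.arange(n) + 0.5) * dx
u = np.exp(-200 * (x - 0.5) ** 2)     # initial profile (cell averages)

mass0 = u.sum() * dx
for _ in range(200):
    f_right = a * u                   # upwind flux at interface i+1/2 (a > 0)
    f_left = np.roll(f_right, 1)      # the same flux, seen as i-1/2 by cell i
    u = u - (dt / dx) * (f_right - f_left)
mass = u.sum() * dx
```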

  2. Achieving conservation when opportunity costs are high: optimizing reserve design in Alberta's oil sands region.

    Directory of Open Access Journals (Sweden)

    Richard R Schneider

    Full Text Available Recent studies have shown that conservation gains can be achieved when the spatial distributions of biological benefits and economic costs are incorporated in the conservation planning process. Using Alberta, Canada, as a case study we apply these techniques in the context of coarse-filter reserve design. Because targets for ecosystem representation and other coarse-filter design elements are difficult to define objectively we use a trade-off analysis to systematically explore the relationship between conservation targets and economic opportunity costs. We use the Marxan conservation planning software to generate reserve designs at each level of conservation target to ensure that our quantification of conservation and economic outcomes represents the optimal allocation of resources in each case. Opportunity cost is most affected by the ecological representation target and this relationship is nonlinear. Although petroleum resources are present throughout most of Alberta, and include highly valuable oil sands deposits, our analysis indicates that over 30% of public lands could be protected while maintaining access to more than 97% of the value of the region's resources. Our case study demonstrates that optimal resource allocation can be usefully employed to support strategic decision making in the context of land-use planning, even when conservation targets are not well defined.
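The reserve-design problem in the abstract can be caricatured with a greedy cost-effectiveness heuristic. Marxan itself uses simulated annealing, and the planning units, costs, and representation targets below are randomly generated for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
n_units, n_features = 50, 5
amount = rng.random((n_units, n_features))     # feature amount in each planning unit
cost = 1.0 + 10.0 * rng.random(n_units)        # opportunity cost of protecting each unit
target = 0.3 * amount.sum(axis=0)              # represent 30% of each feature

selected = np.zeros(n_units, dtype=bool)
held = np.zeros(n_features)
while np.any(held < target):
    shortfall = np.maximum(target - held, 0.0)
    # Only the part of a unit's contribution that reduces a shortfall counts.
    gain = np.minimum(amount, shortfall).sum(axis=1)
    gain[selected] = 0.0
    best = int(np.argmax(gain / cost))         # best remaining benefit per unit cost
    selected[best] = True
    held += amount[best]

total_cost = cost[selected].sum()
```

Sweeping the target fraction and re-running yields exactly the kind of target-versus-opportunity-cost trade-off curve the authors explore, albeit with a weaker optimizer.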

  3. Achieving conservation when opportunity costs are high: optimizing reserve design in Alberta's oil sands region.

    Science.gov (United States)

    Schneider, Richard R; Hauer, Grant; Farr, Dan; Adamowicz, W L; Boutin, Stan

    2011-01-01

    Recent studies have shown that conservation gains can be achieved when the spatial distributions of biological benefits and economic costs are incorporated in the conservation planning process. Using Alberta, Canada, as a case study we apply these techniques in the context of coarse-filter reserve design. Because targets for ecosystem representation and other coarse-filter design elements are difficult to define objectively we use a trade-off analysis to systematically explore the relationship between conservation targets and economic opportunity costs. We use the Marxan conservation planning software to generate reserve designs at each level of conservation target to ensure that our quantification of conservation and economic outcomes represents the optimal allocation of resources in each case. Opportunity cost is most affected by the ecological representation target and this relationship is nonlinear. Although petroleum resources are present throughout most of Alberta, and include highly valuable oil sands deposits, our analysis indicates that over 30% of public lands could be protected while maintaining access to more than 97% of the value of the region's resources. Our case study demonstrates that optimal resource allocation can be usefully employed to support strategic decision making in the context of land-use planning, even when conservation targets are not well defined.

  4. A Character Segmentation Proposal for High-Speed Visual Monitoring of Expiration Codes on Beverage Cans

    Directory of Open Access Journals (Sweden)

    José C. Rodríguez-Rodríguez

    2016-04-01

    Full Text Available Expiration date labels are ubiquitous in the food industry. With the passage of time, almost any food becomes unhealthy, even when well preserved. The expiration date is estimated based on the type and manufacture/packaging time of that particular food unit. This date is then printed on the container so it is available to the end user at the time of consumption. MONICOD (MONItoring of CODes), an industrial validator of expiration codes, allows the expiration code printed on a drink can to be read. This verification occurs immediately after printing. MONICOD faces difficulties due to the high printing rate (35 cans per second) and problematic lighting caused by the metallic surface on which the code is printed. This article describes a solution that allows MONICOD to extract shapes and presents quantitative results for speed and quality.

  5. A Character Segmentation Proposal for High-Speed Visual Monitoring of Expiration Codes on Beverage Cans.

    Science.gov (United States)

    Rodríguez-Rodríguez, José C; Quesada-Arencibia, Alexis; Moreno-Díaz, Roberto; García, Carmelo R

    2016-04-13

    Expiration date labels are ubiquitous in the food industry. With the passage of time, almost any food becomes unhealthy, even when well preserved. The expiration date is estimated based on the type and manufacture/packaging time of that particular food unit. This date is then printed on the container so it is available to the end user at the time of consumption. MONICOD (MONItoring of CODes), an industrial validator of expiration codes, allows the expiration code printed on a drink can to be read. This verification occurs immediately after printing. MONICOD faces difficulties due to the high printing rate (35 cans per second) and problematic lighting caused by the metallic surface on which the code is printed. This article describes a solution that allows MONICOD to extract shapes and presents quantitative results for speed and quality.

  6. Developing a Coding Scheme to Analyse Creativity in Highly-constrained Design Activities

    DEFF Research Database (Denmark)

    Dekoninck, Elies; Yue, Huang; Howard, Thomas J.;

    2010-01-01

    This work is part of a larger project which aims to investigate the nature of creativity and the effectiveness of creativity tools in highly-constrained design tasks. This paper presents the research where a coding scheme was developed and tested with a designer-researcher who conducted two rounds of design and analysis on a highly constrained design task. The paper shows how design changes can be coded using a scheme based on creative ‘modes of change’. The coding scheme can show the way a designer moves around the design space, and particularly the strategies that are used by a creative designer to skip from one ‘train of solutions’ to new avenues. The coding scheme can be made more robust by: ensuring design change is always coded relative to a reference design; tightening up definitions of ‘system’, ‘element’ and ‘function’; and using a matrix to develop a more complete set of codes. A much...

  7. MassCode liquid arrays as a tool for multiplexed high-throughput genetic profiling.

    Directory of Open Access Journals (Sweden)

    Gregory S Richmond

    Full Text Available Multiplexed detection assays that analyze a modest number of nucleic acid targets over large sample sets are emerging as the preferred testing approach in such applications as routine pathogen typing, outbreak monitoring, and diagnostics. However, very few DNA testing platforms have proven to offer a solution for mid-plexed analysis that is high-throughput, sensitive, and with a low cost per test. In this work, an enhanced genotyping method based on MassCode technology was devised and integrated as part of a high-throughput mid-plexing analytical system that facilitates robust qualitative differential detection of DNA targets. Samples are first analyzed using MassCode PCR (MC-PCR performed with an array of primer sets encoded with unique mass tags. Lambda exonuclease and an array of MassCode probes are then contacted with MC-PCR products for further interrogation and target sequences are specifically identified. Primer and probe hybridizations occur in homogeneous solution, a clear advantage over micro- or nanoparticle suspension arrays. The two cognate tags coupled to resultant MassCode hybrids are detected in an automated process using a benchtop single quadrupole mass spectrometer. The prospective value of using MassCode probe arrays for multiplexed bioanalysis was demonstrated after developing a 14plex proof of concept assay designed to subtype a select panel of Salmonella enterica serogroups and serovars. This MassCode system is very flexible and test panels can be customized to include more, less, or different markers.

  8. High-Capacity Quantum Secure Direct Communication Based on Quantum Hyperdense Coding with Hyperentanglement

    Institute of Scientific and Technical Information of China (English)

    WANG Tie-Jun; LI Tao; DU Fang-Fang; DENG Fu-Guo

    2011-01-01

    We present a quantum hyperdense coding protocol with hyperentanglement in the polarization and spatial-mode degrees of freedom of photons, and then give the details of a quantum secure direct communication (QSDC) protocol based on this quantum hyperdense coding protocol. This QSDC protocol has the advantage of a higher capacity than quantum communication protocols based on a qubit system. Compared with the QSDC protocol based on superdense coding with d-dimensional systems, this QSDC protocol is more feasible, as the preparation of a high-dimension quantum system is more difficult than that of a two-level quantum system at present.
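The qubit baseline the abstract compares against is standard two-qubit superdense coding, which can be verified directly with a few lines of linear algebra. This sketch does not implement the hyperentangled protocol, only the textbook scheme in which Alice sends 2 classical bits with one qubit of a shared Bell pair:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # Bell state |00>+|11>
encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

def encode(bits):
    """Alice applies a Pauli to her (first) qubit of the shared Bell pair."""
    return np.kron(encodings[bits], I) @ phi_plus

# The four encoded states form the orthogonal Bell basis, so Bob can decode
# with a joint projective measurement.
basis = {b: encode(b) for b in encodings}

def decode(state):
    return max(basis, key=lambda b: abs(np.vdot(basis[b], state)) ** 2)

decoded = [decode(encode(b)) for b in encodings]
```

Hyperdense coding extends the same idea to two degrees of freedom at once, which is where the higher capacity claimed in the abstract comes from.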

  9. Regulation of Coding and Non-coding Genes : New insights obtained through analysis of high-throughput sequencing data

    NARCIS (Netherlands)

    K. Rooijers (Koos)

    2016-01-01

    markdownabstractThe genetic code of a cell is kept in its DNA. However, a vast number of functions of a cell are carried out by proteins. Through gene expression the genetic code can be expressed and give rise to proteins. The expression of genes into proteins follows two steps: transcription of DNA

  10. Regulation of Coding and Non-coding Genes : New insights obtained through analysis of high-throughput sequencing data

    NARCIS (Netherlands)

    K. Rooijers (Koos)

    2016-01-01

    markdownabstractThe genetic code of a cell is kept in its DNA. However, a vast number of functions of a cell are carried out by proteins. Through gene expression the genetic code can be expressed and give rise to proteins. The expression of genes into proteins follows two steps: transcription of

  11. Consequences of suppressing natural vegetation in drainage areas for freshwater ecosystem conservation: considerations on the new "Brazilian forest code"

    Directory of Open Access Journals (Sweden)

    Marcelo Henrique Ongaro Pinheiro

    2015-06-01

    Full Text Available The input of particulate and dissolved organic matter (POM and DOM, respectively) from terrestrial ecosystem drainage basins is an important energy and nutrient source in limnic food chains. Studies indicated that semi-deciduous seasonal forests located in drainage areas in Brazil have the potential to produce 7.5–10.3 Mg ha−1/year of POM. The global increase in vegetation destruction, such as forests, threatens this allochthonous resource and can have significant impacts on river and lake communities and food chains. Therefore, it is critical that exploitation and occupation protocols are updated to protect the transition areas between terrestrial and limnic ecosystems. This review highlights the existing knowledge of these ecosystem interactions and proposes responsible sustainable methods for converting the vegetation in drainage basins. This was based on Brazilian ecosystem data and the new "Brazilian Forest Code." This study also considers the importance of including flood tracks in permanently protected areas to improve Brazilian legislation and protect hydric resources.

  12. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    Science.gov (United States)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. This report summarizes the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible by the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report is divided into three primary sections. First, we briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to
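Reed-Muller codes at toy scale can be enumerated directly. The sketch below builds the first-order code RM(1, 3), an (8, 4, 4) code, and verifies its parameters by brute force; the paper's (64, 40, 8) subcode of the third-order RM code is far too large to check this way, but it shares the same polynomial-evaluation construction:

```python
import numpy as np
from itertools import product

# RM(1, 3): evaluate the constant function 1 and the three coordinate
# functions x1, x2, x3 at all 8 points of F_2^3 to get the generator rows.
m = 3
pts = np.array(list(product([0, 1], repeat=m)))      # the 8 points of F_2^3
G = np.vstack([np.ones(2 ** m, dtype=int), pts.T])   # rows: 1, x1, x2, x3

# Enumerate all 2^4 codewords and find the minimum nonzero Hamming weight,
# which for a linear code equals the minimum distance.
codewords = {tuple(np.dot(msg, G) % 2) for msg in product([0, 1], repeat=4)}
weights = sorted(sum(c) for c in codewords)

k, n = G.shape
d_min = weights[1]                                    # smallest nonzero weight
```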

  13. High-resolution satellite imagery is an important yet underutilized resource in conservation biology.

    Science.gov (United States)

    Boyle, Sarah A; Kennedy, Christina M; Torres, Julio; Colman, Karen; Pérez-Estigarribia, Pastor E; de la Sancha, Noé U

    2014-01-01

    Technological advances and increasing availability of high-resolution satellite imagery offer the potential for more accurate land cover classifications and pattern analyses, which could greatly improve the detection and quantification of land cover change for conservation. Such remotely-sensed products, however, are often expensive and difficult to acquire, which prohibits or reduces their use. We tested whether imagery of high spatial resolution (≤5 m) differs from lower-resolution imagery (≥30 m) in performance and extent of use for conservation applications. To assess performance, we classified land cover in a heterogeneous region of Interior Atlantic Forest in Paraguay, which has undergone recent and dramatic human-induced habitat loss and fragmentation. We used 4 m multispectral IKONOS and 30 m multispectral Landsat imagery and determined the extent to which resolution influenced the delineation of land cover classes and patch-level metrics. Higher-resolution imagery more accurately delineated cover classes, identified smaller patches, retained patch shape, and detected narrower, linear patches. To assess extent of use, we surveyed three conservation journals (Biological Conservation, Biotropica, Conservation Biology) and found limited application of high-resolution imagery in research, with only 26.8% of land cover studies analyzing satellite imagery, and of these studies only 10.4% used imagery ≤5 m resolution. Our results suggest that high-resolution imagery is warranted yet under-utilized in conservation research, but is needed to adequately monitor and evaluate forest loss and conversion, and to delineate potentially important stepping-stone fragments that may serve as corridors in a human-modified landscape. Greater access to low-cost, multiband, high-resolution satellite imagery would therefore greatly facilitate conservation management and decision-making.

  14. High-Resolution Satellite Imagery Is an Important yet Underutilized Resource in Conservation Biology

    Science.gov (United States)

    Boyle, Sarah A.; Kennedy, Christina M.; Torres, Julio; Colman, Karen; Pérez-Estigarribia, Pastor E.; de la Sancha, Noé U.

    2014-01-01

    Technological advances and increasing availability of high-resolution satellite imagery offer the potential for more accurate land cover classifications and pattern analyses, which could greatly improve the detection and quantification of land cover change for conservation. Such remotely-sensed products, however, are often expensive and difficult to acquire, which prohibits or reduces their use. We tested whether imagery of high spatial resolution (≤5 m) differs from lower-resolution imagery (≥30 m) in performance and extent of use for conservation applications. To assess performance, we classified land cover in a heterogeneous region of Interior Atlantic Forest in Paraguay, which has undergone recent and dramatic human-induced habitat loss and fragmentation. We used 4 m multispectral IKONOS and 30 m multispectral Landsat imagery and determined the extent to which resolution influenced the delineation of land cover classes and patch-level metrics. Higher-resolution imagery more accurately delineated cover classes, identified smaller patches, retained patch shape, and detected narrower, linear patches. To assess extent of use, we surveyed three conservation journals (Biological Conservation, Biotropica, Conservation Biology) and found limited application of high-resolution imagery in research, with only 26.8% of land cover studies analyzing satellite imagery, and of these studies only 10.4% used imagery ≤5 m resolution. Our results suggest that high-resolution imagery is warranted yet under-utilized in conservation research, but is needed to adequately monitor and evaluate forest loss and conversion, and to delineate potentially important stepping-stone fragments that may serve as corridors in a human-modified landscape. Greater access to low-cost, multiband, high-resolution satellite imagery would therefore greatly facilitate conservation management and decision-making. PMID:24466287

  15. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea

    Science.gov (United States)

    Bajaj, Deepak; Saxena, Maneesha S.; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D.; Gowda, C. L. L.; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K.; Parida, Swarup K.

    2015-01-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5′-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. PMID:25504138
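The repeat classes highlighted in the abstract, (CT)n and (GA)n, are simple to scan for with a regular expression. A minimal sketch on a made-up sequence, with an arbitrary threshold of at least 5 repeat units:

```python
import re

def find_microsatellites(seq, motifs=("CT", "GA"), min_units=5):
    """Return (motif, start, repeat_count) for each perfect repeat run."""
    hits = []
    for motif in motifs:
        for m in re.finditer(f"(?:{motif}){{{min_units},}}", seq):
            hits.append((motif, m.start(), len(m.group()) // len(motif)))
    return hits

# Toy 5'-UTR/promoter-like sequence containing a (CT)8 and a (GA)6 run:
urr = "ATGC" + "CT" * 8 + "TTAA" + "GA" * 6 + "GGCC"
hits = find_microsatellites(urr)
```

Genome-scale CNMS discovery adds the hard parts this sketch omits: aligning orthologous non-coding regions across species and keeping only repeats that fall in conserved blocks.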

  16. A Code to Compute High Energy Cosmic Ray Effects on Terrestrial Atmospheric Chemistry

    CERN Document Server

    Krejci, Alex J; Thomas, Brian C

    2008-01-01

    A variety of events such as gamma-ray bursts may expose the Earth to an increased flux of high-energy cosmic rays, with potentially important effects on the biosphere. An atmospheric code, the NASA-Goddard Space Flight Center two-dimensional (latitude, altitude) time-dependent atmospheric model (NGSFC), can be used to study atmospheric chemistry changes. The effect on atmospheric chemistry of astrophysically created high-energy cosmic rays can now be studied using the NGSFC code. A table has been created that, with the NGSFC code, can be used to simulate the effects of high-energy cosmic rays (10 GeV to 1 PeV) ionizing the atmosphere. We discuss the table, its use, weaknesses, and strengths.

  17. Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding.

    Science.gov (United States)

    Cohn, J F; Zlochower, A J; Lien, J; Kanade, T

    1999-01-01

    The face is a rich source of information about human behavior. Available methods for coding facial displays, however, are human-observer dependent, labor intensive, and difficult to standardize. To enable rigorous and efficient quantitative measurement of facial displays, we have developed an automated method of facial display analysis. In this report, we compare the results of this automated system with those of manual FACS (Facial Action Coding System, Ekman & Friesen, 1978a) coding. One hundred university students were videotaped while performing a series of facial displays. The image sequences were coded from videotape by certified FACS coders. Fifteen action units and action unit combinations that occurred a minimum of 25 times were selected for automated analysis. Facial features were automatically tracked in digitized image sequences using a hierarchical algorithm for estimating optical flow. The measurements were normalized for variation in position, orientation, and scale. The image sequences were randomly divided into a training set and a cross-validation set, and discriminant function analyses were conducted on the feature point measurements. In the training set, average agreement with manual FACS coding was 92% or higher for action units in the brow, eye, and mouth regions. In the cross-validation set, average agreement was 91%, 88%, and 81% for action units in the brow, eye, and mouth regions, respectively. Automated face analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.
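The optical-flow machinery underlying such trackers can be reduced to a single Lucas-Kanade step. The sketch below recovers a known subpixel translation of a synthetic Gaussian blob from image gradients; the actual system uses hierarchical (coarse-to-fine) tracking of many individual feature points:

```python
import numpy as np

def blob(dx=0.0, dy=0.0, size=64):
    """Synthetic frame: a Gaussian blob centered near (32, 32), shifted by (dx, dy)."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - 32 - dx) ** 2 + (y - 32 - dy) ** 2) / 50.0)

f1, f2 = blob(), blob(dx=0.3)         # frame 2 is frame 1 shifted 0.3 px in x
Iy, Ix = np.gradient(f1)              # spatial gradients (rows = y, cols = x)
It = f2 - f1                          # temporal difference

# Lucas-Kanade normal equations over the whole window:
#   [sum Ix^2   sum IxIy] [vx]   = -[sum Ix It]
#   [sum IxIy   sum Iy^2] [vy]     -[sum Iy It]
A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
              [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
vx, vy = np.linalg.solve(A, b)
```

The recovered flow is close to the true (0.3, 0) displacement, which is why gradient-based tracking can resolve the subtle, subpixel motions that distinguish facial action units.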

  18. Identification of immunogenic HLA-B7 "Achilles' heel" epitopes within highly conserved regions of HIV

    DEFF Research Database (Denmark)

    De Groot, Anne S; Rivera, Daniel S; McMurry, Julie A;

    2008-01-01

    Genetic polymorphisms in class I human leukocyte antigen molecules (HLA) have been shown to determine susceptibility to HIV infection as well as the rate of progression to AIDS. In particular, the HLA-B7 supertype has been shown to be associated with high viral loads and rapid progression to disease. Using a multiplatform in silico/in vitro approach, we have prospectively identified 45 highly conserved, putative HLA-B7 restricted HIV CTL epitopes and evaluated them in HLA binding and ELISpot assays. All 45 epitopes (100%) bound to HLA-B7 in cell-based HLA binding assays; 28 (62%) bound ... previously described as restricted by B7. The HLA-B7 restricted epitopes discovered using this in silico screening approach are highly conserved across strains and clades of HIV as well as conserved in the HIV genome over the 20 years since HIV-1 isolates were first sequenced. This study demonstrates ...

  19. TECATE - a code for anisotropic thermoelasticity in high-average-power laser technology. Phase 1 final report

    Energy Technology Data Exchange (ETDEWEB)

    Gelinas, R.J.; Doss, S.K.; Carlson, N.N.

    1985-01-01

    This report describes a totally Eulerian code for anisotropic thermoelasticity (code name TECATE) which may be used in evaluations of prospective crystal media for high-average-power lasers. The present TECATE code version computes steady-state distributions of material temperatures, stresses, strains, and displacement fields in 2-D slab geometry. Numerous heat source and coolant boundary condition options are available in the TECATE code for laser design considerations. Anisotropic analogues of plane stress and plane strain evaluations can be executed for any and all crystal symmetry classes. As with all new and/or large physics codes, it is likely that some code imperfections will emerge at some point in time.

  20. APPLICATION OF TURBO CODES IN HIGH-SPEED REAL-TIME CHANNEL

    Institute of Scientific and Technical Information of China (English)

    Zhao Danfeng; Yue Li; Yang Jianhua

    2006-01-01

    The time delay of Turbo codes due to their iterative decoding is the main bottleneck for their application in real-time channels. However, the delay can be greatly shortened through the adoption of a parallel decoding algorithm that divides the received bits into several sub-blocks and processes them in parallel. This letter discusses the applicability of Turbo codes in high-speed real-time channels through the study of a parallel turbo decoding algorithm based on the 3GPP-proposed turbo encoder and interleaver in various channels. Simulation results show that, by choosing an appropriate sub-block length, the time delay can be markedly shortened without degrading performance or increasing hardware complexity, which further indicates the applicability of Turbo codes in high-speed real-time channels.
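
    The sub-block parallelism described above can be sketched as follows. This is a toy illustration: decode_subblock is a placeholder for a real iterative turbo decoder instance, and the frame contents, block count, and thread-based parallelism are our assumptions, not the letter's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def decode_subblock(bits):
    # Stand-in for one decoder instance running its iterations on a sub-block;
    # a real turbo decoder would iterate between two constituent SISO decoders.
    return list(bits)

def parallel_decode(received, n_subblocks=4):
    # Split the received frame into equal sub-blocks and decode concurrently;
    # latency is then set by the slowest sub-block, not the whole frame.
    size = (len(received) + n_subblocks - 1) // n_subblocks
    chunks = [received[i:i + size] for i in range(0, len(received), size)]
    with ThreadPoolExecutor(max_workers=n_subblocks) as pool:
        decoded = pool.map(decode_subblock, chunks)
    return [b for chunk in decoded for b in chunk]

frame = [1, 0, 1, 1, 0, 0, 1, 0] * 4
print(parallel_decode(frame) == frame)  # identity stand-in returns the frame
```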

  1. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    Science.gov (United States)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.
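
    As background, a sketch in our own notation (not taken from the paper) of what "entropy stable" means in this setting:

```latex
% 1-D conservation law with a convex entropy pair (U, F), U'(u) f'(u) = F'(u):
u_t + f(u)_x = 0, \qquad U(u)_t + F(u)_x \le 0 .
% A semi-discrete scheme on a finite domain [x_L, x_R] is entropy stable when
% its total discrete entropy is bounded by the boundary entropy fluxes:
\frac{d}{dt}\sum_i \Delta x_i \, U(u_i) \;\le\; F\big(u(x_L)\big) - F\big(u(x_R)\big).
```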

  2. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  3. Performance evaluation of high rate space–time trellis-coded modulation using Gauss–Chebyshev quadrature technique

    CSIR Research Space (South Africa)

    Sokoya, O

    2008-05-01

    Full Text Available The performance analysis of high rate space–time trellis-coded modulation (HR-STTCM) using the Gauss–Chebyshev quadrature technique is presented. HR-STTCM is an example of space–time codes that combine the idea used in trellis coded modulation (TCM...

  4. COnstrained Data Extrapolation (CODE): A new approach for high definition vascular imaging from low resolution data.

    Science.gov (United States)

    Song, Yang; Hamtaei, Ehsan; Sethi, Sean K; Yang, Guang; Xie, Haibin; Mark Haacke, E

    2017-09-01

    To introduce a new approach to reconstruct high definition vascular images using COnstrained Data Extrapolation (CODE) and to evaluate its capability in estimating vessel area and stenosis. CODE is based on the constraint that the full width half maximum (FWHM) of a vessel can be accurately estimated and, since it represents the best estimate for the width of the object, higher k-space data can be generated from this information. To demonstrate the potential of extracting high definition vessel edges using low resolution data, both simulated and human data were analyzed to better visualize the vessels and to quantify both area and stenosis measurements. The results from CODE using one-fourth of the fully sampled k-space data were compared with a compressed sensing (CS) reconstruction approach using the same total amount of data but spread out between the center of k-space and the outer portions of the original k-space to accelerate data acquisition by a factor of four. For a sufficiently high signal-to-noise ratio (SNR) such as 16 (8), we found that objects as small as 3 voxels in the 25% under-sampled data (6 voxels when zero-filled) could be used by CODE and CS to estimate vessel area, with CODE reconstruction being 200 (30) times faster than CS in the simulated (human) data. CODE was capable of producing sharp sub-voxel edges and accurately estimating stenosis to within 5% for clinically relevant studies of vessels with a width of at least 3 pixels in the low resolution images. Copyright © 2017 Elsevier Inc. All rights reserved.
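
    The core idea, estimating the vessel's FWHM from low-resolution data and then using that width as a constraint to restore sharp edges, can be illustrated with a toy 1-D boxcar "vessel". All sizes, the sampling fraction, and the boxcar model are our assumptions for illustration, not the paper's reconstruction code.

```python
import numpy as np

n = 256
x = np.arange(n) - n // 2
true_width = 12                      # vessel width in pixels
vessel = ((x >= -true_width // 2) & (x < true_width // 2)).astype(float)

# Simulate a low-resolution acquisition: keep only the central k-space samples.
k = np.fft.fftshift(np.fft.fft(vessel))
keep = 64
low_k = np.zeros_like(k)
lo, hi = n // 2 - keep // 2, n // 2 + keep // 2
low_k[lo:hi] = k[lo:hi]
low_res = np.abs(np.fft.ifft(np.fft.ifftshift(low_k)))

# The FWHM of the blurred profile still tracks the true vessel width.
half_max = low_res.max() / 2
fwhm = int(np.sum(low_res >= half_max))

# "Extrapolation" step in this toy: rebuild a sharp-edged boxcar of that width,
# whose analytic transform supplies the missing high-k data.
sharp = ((x >= -fwhm // 2) & (x < fwhm // 2)).astype(float)
print("true width:", true_width, "estimated FWHM:", fwhm)
```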

  5. Analysis of isoplanatic high resolution stellar fields by the StarFinder code

    Science.gov (United States)

    Diolaiti, E.; Bendinelli, O.; Bonaccini, D.; Close, L.; Currie, D.; Parmeggiani, G.

    2000-12-01

    We describe a new code for the deep analysis of stellar fields, designed for Adaptive Optics (AO) Nyquist-sampled images with high and low Strehl ratio. The Point Spread Function (PSF) is extracted directly from the image frame, to take into account the actual structure of the instrumental response and the atmospheric effects. The code is written in IDL language and organized in the form of a self-contained widget-based application, provided with a series of tools for data visualization and analysis. A description of the method and some applications to AO data are presented.

  6. High Pitch Delay Resolution Technique for Tonal Language Speech Coding Based on Multi-Pulse Based Code Excited Linear Prediction Algorithm

    Directory of Open Access Journals (Sweden)

    Suphattharachai Chomphan

    2011-01-01

    Full Text Available Problem statement: In spontaneous speech communication, speech coding is an important process that should be taken into account, since the quality of coded speech depends on the efficiency of the speech coding algorithm. In tonal languages, where tone plays an important role in both the naturalness and the intelligibility of speech, tone must be treated appropriately. Approach: This study proposes a modification of the flexible Multi-Pulse based Code Excited Linear Predictive (MP-CELP) coder with multiple bitrates and bitrate scalability for tonal-language speech in multimedia applications. The coder consists of a core coder and bitrate scalable tools. High Pitch Delay Resolution (HPDR) is applied to the adaptive codebook of the core coder to improve tonal-language speech quality. The bitrate scalable tool employs multi-stage excitation coding based on an embedded-coding approach. The multi-pulse excitation codebook at each stage is adaptively produced depending on the excitation signal selected at the previous stage. Results: The experimental results show that the speech quality of the proposed coder is improved over that of the conventional coder without pitch-resolution adaptation. Conclusion: The study provides strong evidence for further applying the proposed technique in speech coding systems and other speech processing technologies.

  7. Bucketing Coding and Information Theory for the Statistical High Dimensional Nearest Neighbor Problem

    CERN Document Server

    Dubiner, Moshe

    2008-01-01

    Consider the problem of finding high dimensional approximate nearest neighbors, where the data is generated by some known probabilistic model. We will investigate a large natural class of algorithms which we call bucketing codes. We will define bucketing information, prove that it bounds the performance of all bucketing codes, and show that the bucketing information bound can be asymptotically attained by randomly constructed bucketing codes. For example, suppose we have n Bernoulli(1/2) very long (length d → ∞) sequences of bits. Let n-2m sequences be completely independent, while the remaining 2m sequences are composed of m independent pairs. The interdependence within each pair is that their bits agree with probability 1/2 < p ≤ 1. Moreover, if one sequence out of each pair belongs to a known set of n^{(2p-1)^{2}-\epsilon} sequences, then pairing can be done using order n comparisons!
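
    The paired-sequence model in the example can be simulated directly. The sketch below is our illustration, with arbitrary d and p; it shows why agreement counts cleanly separate true pairs from unrelated sequences.

```python
import random

random.seed(1)
d, p = 2000, 0.75   # sequence length and within-pair agreement probability

def make_pair(d, p):
    # Bernoulli(1/2) sequence and a partner agreeing with probability p per bit.
    a = [random.getrandbits(1) for _ in range(d)]
    b = [x if random.random() < p else 1 - x for x in a]
    return a, b

def agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

a, b = make_pair(d, p)
c = [random.getrandbits(1) for _ in range(d)]   # unrelated sequence

print(round(agreement(a, b), 2))  # concentrates near p
print(round(agreement(a, c), 2))  # concentrates near 1/2
```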

  8. A comparison of cosmological Boltzmann codes: are we ready for high precision cosmology?

    CERN Document Server

    Seljak, U; White, M; Zaldarriaga, M

    2003-01-01

    We compare three independent, cosmological linear perturbation theory codes to assess the level of agreement between them and to improve upon it by investigating the sources of discrepancy. By eliminating the major sources of numerical instability, the final level of agreement between the codes was improved by an order of magnitude. The relative error is now below 0.1% for the dark matter power spectrum. For the cosmic microwave background anisotropies the agreement is below the sampling variance up to l=3000, with close to 0.1% accuracy reached over most of this range of scales. The same level of agreement is also achieved for the polarization spectrum and the temperature-polarization cross-spectrum. Linear perturbation theory codes are thus well prepared for the present and upcoming high precision cosmological observations.

  9. High Angular Momentum Halo Gas: a Feedback and Code-Independent Prediction of LCDM

    CERN Document Server

    Stewart, Kyle; Oñorbe, Jose; Bullock, James; Joung, M Ryan; Devriendt, Julien; Ceverino, Daniel; Kereš, Dušan; Hopkins, Phil; Faucher-Giguère, Claude-André

    2016-01-01

    We investigate angular momentum acquisition in Milky Way sized galaxies by comparing five high resolution zoom-in simulations, each implementing identical cosmological initial conditions, but utilizing different hydrodynamic codes: Enzo, Art, Ramses, Arepo, and Gizmo-PSPH. Each code implements a distinct set of feedback and star formation prescriptions. We find that while many galaxy and halo properties vary between the different codes (and feedback prescriptions), there is qualitative agreement on the process of angular momentum acquisition in the galaxy's halo. In all simulations, cold filamentary gas accretion to the halo results in ~4 times more specific angular momentum in cold halo gas ($\\lambda_{cold} \\simeq 0.15$) than in the dark matter halo. At z>1, this inflow frequently results in the formation of transient cold flow disks---large co-rotating gaseous structures in the halo of the galaxy that are fueled, aligned, and kinematically connected to filamentary gas infall along the cosmic web. Due to the...

  10. High-speed architecture for the decoding of trellis-coded modulation

    Science.gov (United States)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been an interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture, or by reducing the algorithm itself. Designs employing new architectural techniques are now in existence; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
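
    For context, the Viterbi Algorithm the report builds on can be sketched for a toy rate-1/2, 4-state convolutional code (generators 7 and 5 octal). This is a textbook-style illustration, not the report's TCM decoder or its high-speed architecture.

```python
G = (0b111, 0b101)  # generator polynomials, constraint length 3

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                    # shift register: b, s1, s0
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1
    return out

def viterbi(received):
    # Path metric per state; survivors hold the input bits along the best path.
    metrics, survivors = {0: 0}, {0: []}
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_m, new_s = {}, {}
        for state, m in metrics.items():
            for b in (0, 1):                      # extend each survivor by 0/1
                reg = (b << 2) | state
                expect = [bin(reg & g).count("1") % 2 for g in G]
                cost = m + sum(e != x for e, x in zip(expect, r))
                nxt = reg >> 1
                if nxt not in new_m or cost < new_m[nxt]:
                    new_m[nxt], new_s[nxt] = cost, survivors[state] + [b]
        metrics, survivors = new_m, new_s
    return survivors[min(metrics, key=metrics.get)]

msg = [1, 0, 1, 1, 0, 0, 1]
coded = encode(msg)
coded[3] ^= 1                # inject a single channel bit error
print(viterbi(coded) == msg)  # the error is corrected
```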

  11. Efficient temporal and interlayer parameter prediction for weighted prediction in scalable high efficiency video coding

    Science.gov (United States)

    Tsang, Sik-Ho; Chan, Yui-Lam; Siu, Wan-Chi

    2017-01-01

    Weighted prediction (WP) is an efficient video coding tool introduced with the H.264/AVC video coding standard to compensate for temporal illumination changes in motion estimation and compensation. WP parameters, including a multiplicative weight and an additive offset for each reference frame, must be estimated and transmitted to the decoder in the slice header. These parameters add extra bits to the coded video bitstream. High efficiency video coding (HEVC) provides WP parameter prediction to reduce this overhead. WP parameter prediction is therefore crucial to research works and applications related to WP. Prior art improved WP parameter prediction through implicit prediction of image characteristics and derivation of parameters. By exploiting both temporal and interlayer redundancies, we propose three WP parameter prediction algorithms (enhanced implicit WP parameter prediction, enhanced direct WP parameter derivation, and interlayer WP parameter prediction) to further improve the coding efficiency of HEVC. Results show that our proposed algorithms can achieve up to 5.83% and 5.23% bitrate reduction compared to conventional scalable HEVC in the base layer for SNR scalability and 2× spatial scalability, respectively.
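
    A simplified form of the weighted prediction formula itself (a multiplicative weight with a log2 denominator plus an additive offset, applied per reference sample) looks like this. The sketch follows the spirit of H.264/HEVC uni-directional WP but is not the normative specification text.

```python
def weighted_pred(ref_sample, w, offset, log2_denom, bit_depth=8):
    # pred = clip( ((w * ref + rounding) >> log2_denom) + offset )
    rounding = 1 << (log2_denom - 1) if log2_denom > 0 else 0
    p = ((w * ref_sample + rounding) >> log2_denom) + offset
    return max(0, min((1 << bit_depth) - 1, p))

# Compensate a global brightening: weight 80/64 = 1.25, offset +3.
print(weighted_pred(100, w=80, offset=3, log2_denom=6))  # → 128
```

    The weight and offset printed per reference frame in the slice header are exactly the w, offset, and log2_denom values such a formula consumes, which is why predicting them well saves header bits.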

  12. High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique

    Science.gov (United States)

    Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan

    2015-10-01

    In this paper, we propose a high performance optical encryption (OE) scheme based on computational ghost imaging (GI) with QR code and compressive sensing (CS) techniques, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, serve as a secret key shared with her authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. Here, the measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image using GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that authorized users can recover the original image completely, whereas eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. For the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.
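
    The GI encryption round trip can be sketched numerically. The sketch below is our simplification: random intensity patterns stand in for the phase-screen-generated speckle, a plain block image stands in for the QR-coded image, and reconstruction uses basic correlation GI rather than the paper's CS-based recovery.

```python
import numpy as np

rng = np.random.default_rng(7)
n, n_meas = 16, 4000

obj = np.zeros((n, n))
obj[4:12, 6:10] = 1.0                         # stand-in for a QR-coded image

# Alice's key: the shared random patterns. Her transmitted ciphertext is only
# the scalar bucket value per pattern (pattern * object, summed by a detector).
patterns = rng.random((n_meas, n, n))
bucket = patterns.reshape(n_meas, -1) @ obj.ravel()

# Bob's correlation-GI reconstruction: <b * I> - <b><I>.
recon = (bucket[:, None, None] * patterns).mean(0) \
        - bucket.mean() * patterns.mean(0)

corr = np.corrcoef(recon.ravel(), obj.ravel())[0, 1]
print(f"reconstruction correlation with object: {corr:.2f}")
```

    Without the patterns, the bucket values alone reveal essentially nothing about the image, which is the intuition behind using the phase screens as the secret key.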

  13. Multipath sparse coding for scene classification in very high resolution satellite imagery

    Science.gov (United States)

    Fan, Jiayuan; Tan, Hui Li; Lu, Shijian

    2015-10-01

    With the rapid development of various satellite sensors, automatic and advanced scene classification techniques are urgently needed to process the huge amount of satellite image data. Recently, a few research works have begun to apply sparse coding for feature learning in aerial scene classification. However, these previous works use single-layer sparse coding in their systems, and their performance is highly dependent on multiple low-level features, such as the scale-invariant feature transform (SIFT) and saliency. Motivated by the importance of feature learning through multiple layers, we propose a new unsupervised feature learning approach for scene classification on very high resolution satellite imagery. The proposed unsupervised feature learning utilizes a multipath sparse coding architecture in order to capture multiple aspects of discriminative structures within complex satellite scene images. In addition, dense low-level features are extracted from the raw satellite data by using image patches of varying size at different layers, so the approach is not limited to particularly designed feature descriptors, in contrast to related works. The proposed technique has been evaluated on two challenging high-resolution datasets: the UC Merced dataset, containing 21 aerial scene categories at a 1 foot resolution, and the Singapore dataset, containing 5 land-use categories at a 0.5 m spatial resolution. Experimental results show that it outperforms the state-of-the-art that uses single-layer sparse coding. The major contributions of this technique include (1) a new unsupervised feature learning approach to generate feature representations for very high-resolution satellite imagery, (2) the first multipath sparse coding used for scene classification in very high-resolution satellite imagery, and (3) a simple low-level feature descriptor instead of many particularly designed low-level descriptors

  14. Conserved non-coding elements in human genome

    Institute of Scientific and Technical Information of China (English)

    田靖; 赵志虎; 陈惠鹏

    2009-01-01

    Studies in comparative genomics have revealed that about 5% of the human genome is under purifying selection. Coding regions comprise only a small part of this; about 3.5% consists of conserved non-coding elements (CNEs). In humans, the CNEs are functionally important: they may be involved in the establishment and maintenance of chromatin architecture, in transcriptional regulation, and in pre-mRNA processing. They are also related to the ontogeny of mammals and to human diseases. This review outlines the identification, functional significance, and evolutionary origin of CNEs, as well as their effects on human genetic defects.

  15. EVALUATION OF WATER CONSERVATION POLICY ALTERNATIVES FOR THE SOUTHERN HIGH PLAINS OF TEXAS

    OpenAIRE

    Johnson, Jeffrey W.; Johnson, Phillip N.; Segarra, Eduardo; Willis, David B.

    2004-01-01

    Three alternative groundwater conservation policies were examined for their impact on the regional economy of the Southern High Plains of Texas using nonlinear optimization models and an input-output model. Restriction of drawdown of the aquifer was found to be more effective than proposed water use fees.

  17. Prospects for Parity Non-conservation Experiments with Highly Charged Heavy Ions

    OpenAIRE

    Maul, M.; A. Schäfer; Greiner, W.; Indelicato, P.

    1996-01-01

    We discuss the prospects for parity non-conservation experiments with highly charged heavy ions. Energy levels and parity mixing for heavy ions with two to five electrons are calculated. We investigate two-photon-transitions and the possibility to observe interference effects between weak-matrix elements and Stark matrix elements for periodic electric field configurations.

  19. Load balancing in highly parallel processing of Monte Carlo code for particle transport

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Takemiya, Hiroshi [Japan Atomic Energy Research Inst., Tokyo (Japan); Kawasaki, Takuji [Fuji Research Institute Corporation, Tokyo (Japan)

    2001-01-01

    In parallel processing of Monte Carlo (MC) codes for neutron, photon and electron transport problems, particle histories are assigned to processors, making use of the independence of the calculation for each particle. Although the main part of an MC code is easily parallelized by this method, it is necessary, and practically difficult, to optimize the code with respect to load balancing in order to attain a high speedup ratio in highly parallel processing. In fact, the speedup ratio in the case of 128 processors remains at only about one hundred on the test bed used for the performance evaluation. Through parallel processing of the MCNP code, which is widely used in the nuclear field, it is shown that it is difficult to attain high performance by static load balancing, especially in neutron transport problems. A load balancing method that dynamically changes the number of assigned particles, minimizing the sum of the computational and communication costs, overcomes the difficulty, resulting in a reduction of nearly fifteen percent in execution time. (author)
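
    The dynamic assignment idea, choosing how many particle histories to hand out per request so that computation and communication costs balance, can be sketched with a toy cost model. All timing constants and the cost terms here are invented for illustration; they are not the paper's model.

```python
def total_time(batch, n=10**6, procs=128, t_particle=1e-4, t_comm=5e-3):
    # Toy cost model for dynamic batch hand-out:
    compute = n * t_particle / procs                # perfectly shared work
    comm = (-(-n // batch)) * t_comm / procs        # one hand-out per batch
    straggler = batch * t_particle                  # idle wait on the last batch
    return compute + comm + straggler

# Tiny batches drown in communication; huge batches leave processors idle
# at the end. The scheduler picks the batch size minimizing the sum.
best = min(range(1, 5001), key=total_time)
print("best batch size:", best)
```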

  20. The Student Opinions Concerning Freedom of Dress Code Including High Schools Among Others

    OpenAIRE

    Ahmet Akbaba

    2014-01-01

    This study aimed at examining the opinions of high school students concerning the much-debated dress code applied to primary school, middle school, and high school students and thought to affect quality in education, and at revealing the importance of the issue as well as its financial, social, and pedagogical dimensions. The research is a descriptive study in the survey model. A Likert-type questionnaire developed by the researcher was used in the present study – a descriptive 

  1. Acute cholecystitis in high-risk patients: percutaneous cholecystostomy vs conservative treatment

    Energy Technology Data Exchange (ETDEWEB)

    Hatzidakis, Adam A.; Prassopoulos, Panos; Petinarakis, Ioannis; Gourtsoyiannis, Nicholas C. [Department of Radiology, University Hospital of Heraklion, Medical School of Crete, Crete (Greece); Sanidas, Elias; Tsiftsis, Dimitrios [Department of Surgical Oncology, University Hospital of Heraklion, Medical School of Crete (Greece); Chrysos, Emmanuel; Chalkiadakis, Georgios [Department of General Surgery, University Hospital of Heraklion, Medical School of Crete (Greece)

    2002-07-01

    Our objective was to compare the effectiveness of percutaneous cholecystostomy (PC) vs conservative treatment (CO) in high-risk patients with acute cholecystitis. The study was randomized and comprised 123 high-risk patients with acute cholecystitis. All patients fulfilled the ultrasonographic criteria of acute inflammation and had an APACHE II score ≥12. Percutaneous cholecystostomy guided by US or CT was successful in 60 of 63 patients (95.2%) who comprised the PC group. Sixty patients were conservatively treated (CO group). One patient died after unsuccessful PC (1.6%). Resolution of symptoms occurred in 54 of 63 patients (86%). Eleven patients (17.5%) died either of ongoing sepsis (n=6) or severe underlying disease (n=5) within 30 days. Seven patients (11%) were operated on because of persisting symptoms (n=3), catheter dislodgment (n=3), or unsuccessful PC (n=1). Cholecystolithotripsy was performed in 5 patients (8%). Elective surgery was performed in 9 cases (14%). No further treatment was needed in 32 patients (51%). In the CO group, 52 patients (87%) fully recovered and 8 patients (13%) died of ongoing sepsis within 30 days. All successfully treated patients showed clinical improvement during the first 3 days of treatment. Percutaneous cholecystostomy in high-risk patients with acute cholecystitis did not decrease mortality in relation to conservative treatment. Percutaneous cholecystostomy might be suggested to patients not presenting clinical improvement following 3 days of conservative treatment, to critically ill intensive care unit patients, or to candidates for percutaneous cholecystolithotripsy. (orig.)

  2. High-Penetration Photovoltaics Standards and Codes Workshop, Denver, Colorado, May 20, 2010: Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, M.; Kroposki, B.; Basso, T.; Lynn, K.; Herig, C.; Bower, W.

    2010-09-01

    Effectively interconnecting a high penetration of photovoltaic (PV) systems requires careful technical attention to ensuring compatibility with electric power systems. Standards, codes, and implementation have been cited as major impediments to widespread use of PV within electric power systems. On May 20, 2010, in Denver, Colorado, the National Renewable Energy Laboratory, in conjunction with the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), held a workshop to examine the key technical issues and barriers associated with high PV penetration levels, with an emphasis on codes and standards. The workshop built upon the results of the High Penetration of Photovoltaic (PV) Systems into the Distribution Grid workshop held in Ontario, California, on February 24-25, 2009, and upon presentations from a diverse range of stakeholders.

  3. Evaluation of conservative management of high-grade cervical squamous intraepithelial lesion.

    Science.gov (United States)

    Uchimura, Nelson Shozo; Uchimura, Taqueco Teruya; Martins, João Paulo de Oliveira Branco; Assakawa, Fernando; Uchimura, Liza Yurie Teruya

    2012-06-01

    To assess the association between conservative management of high-grade cervical squamous intraepithelial lesions and recurrence rates across age groups. Cross-sectional, retrospective, analytical observational study of 509 women (aged 15 to 76) with abnormal Pap smears attending a public reference center in the city of Maringá, southern Brazil, from 1996 to 2006. Data were collected from medical records, and the variables definitive diagnosis, type of treatment provided, occurrence of high-grade cervical squamous intraepithelial lesions, and recurrence were studied. Pearson's chi-square test and Fisher's exact test were used in the statistical analyses. There were 168 cases of high-grade cervical squamous intraepithelial lesions; of these, 31 were treated with cold-knife conization, 104 with the loop electrosurgical excision procedure, 9 with hysterectomy, and 24 with conservative treatment (i.e., clinical and cytological follow-up or cervical electrocoagulation). A total of 8 (33.3%) women receiving conservative and 10 (6.9%) receiving non-conservative management had recurrent disease, a statistically significant difference (p=0.0009), PR = 4.8 (95%CI 2.11;10.93). Three (30.0%) women among those undergoing clinical and cytological follow-up and five (35.7%) among those submitted to cervical electrocoagulation had recurrent disease within three years, but the difference was not significant (p=0.5611). Recurrence rates in those younger and older than 30 were 13.8% (7 women) and 12.2% (11 women), respectively (p=0.9955). Age is not a predictor of disease recurrence. Conservative treatment is only recommended in exceptional situations due to its high recurrence rates. Careful cytological and colposcopic follow-up is required for three years, during which most recurrences occur.

  4. Predictors of nephrectomy in high grade blunt renal trauma patients treated primarily with conservative intent.

    Science.gov (United States)

    Prasad, Narla Hari; Devraj, Rahul; Chandriah, G Ram; Sagar, S Vidya; Reddy, Ch Ram; Murthy, Pisapati Venkata Lakshmi Narsimha

    2014-04-01

    There is no consensus on the optimal management of high grade renal trauma. Delayed surgery increases the likelihood of secondary hemorrhage and persistent urinary extravasation, whereas immediate surgery results in high renal loss. Hence, the present study was undertaken to evaluate the predictors of nephrectomy and the outcome of high grade (III-V) renal injury treated primarily with conservative intent. The records of 55 patients who were admitted to our institute with varying degrees of blunt renal trauma from January 2005 to December 2012 were retrospectively reviewed. Grade III-V renal injury was defined as high grade blunt renal trauma and was present in 44 patients. The factors analyzed to predict emergency intervention were demographic profile, grade of injury, degree of hemodynamic instability, requirement of blood transfusion, need for intervention, mode of intervention, and duration of intensive care unit stay. The remaining 40 patients with high grade injury (grade III and IV) did not require emergency intervention and underwent a trial of conservative management. Seven of these 40 patients, who were managed conservatively, experienced complications requiring procedural intervention, and three required a delayed nephrectomy. Presence of grade V injuries with hemodynamic instability and requirement of more than 10 packed cell units for resuscitation were predictors of nephrectomy. Predictors of complications were urinary extravasation and hemodynamic instability at presentation. The majority of high grade renal injuries can be successfully managed conservatively. Grade V injuries and the need for more packed cell transfusions during resuscitation predict the need for emergency intervention.

  5. Predictors of nephrectomy in high grade blunt renal trauma patients treated primarily with conservative intent

    Directory of Open Access Journals (Sweden)

    Narla Hari Prasad

    2014-01-01

    Full Text Available Introduction: There is no consensus on the optimal management of high grade renal trauma. Delayed surgery increases the likelihood of secondary hemorrhage and persistent urinary extravasation, whereas immediate surgery results in high renal loss. Hence, the present study was undertaken to evaluate the predictors of nephrectomy and the outcome of high grade (III-V) renal injury treated primarily with conservative intent. Materials and Methods: The records of 55 patients who were admitted to our institute with varying degrees of blunt renal trauma from January 2005 to December 2012 were retrospectively reviewed. Grade III-V renal injury was defined as high grade blunt renal trauma and was present in 44 patients. The factors analyzed to predict emergency intervention were demographic profile, grade of injury, degree of hemodynamic instability, requirement of blood transfusion, need for intervention, mode of intervention, and duration of intensive care unit stay. Results: The remaining 40 patients with high grade injury (grade III and IV) did not require emergency intervention and underwent a trial of conservative management. Seven of these 40 patients, who were managed conservatively, experienced complications requiring procedural intervention, and three required a delayed nephrectomy. Presence of grade V injuries with hemodynamic instability and requirement of more than 10 packed cell units for resuscitation were predictors of nephrectomy. Predictors of complications were urinary extravasation and hemodynamic instability at presentation. Conclusion: The majority of high grade renal injuries can be successfully managed conservatively. Grade V injuries and the need for more packed cell transfusions during resuscitation predict the need for emergency intervention.

  6. Interpreting beta-diversity components over time to conserve metacommunities in highly dynamic ecosystems.

    Science.gov (United States)

    Ruhí, Albert; Datry, Thibault; Sabo, John L

    2017-02-11

    The concept of metacommunity (i.e., a set of local communities linked by dispersal) has gained great popularity among community ecologists. However, metacommunity research mostly addresses questions on spatial patterns of biodiversity at the regional scale, whereas conservation planning requires quantifying temporal variation in those metacommunities and the contributions that individual (local) sites make to regional dynamics. We propose that recent advances in diversity-partitioning methods may allow for a better understanding of metacommunity dynamics and the identification of keystone sites. We used time series of the 2 components of beta diversity (richness and replacement) and the contributions of local sites to these components to examine which sites controlled source-sink dynamics in a highly dynamic model system (an intermittent river). The relative importance of the richness and replacement components of beta diversity fluctuated over time, and sample aggregation led to underestimation of beta diversity by up to 35%. Our literature review revealed that research on intermittent rivers would benefit greatly from examination of beta-diversity components over time. Adequately appraising spatiotemporal variability in community composition and identifying sites that are pivotal for maintaining biodiversity at the landscape scale are key needs for conservation prioritization and planning. Thus, our framework may be used to guide conservation actions in highly dynamic ecosystems when time-series data describing biodiversity across sites connected by dispersal are available. © 2017 Society for Conservation Biology.
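
    The abstract does not give formulas, but a widely used partition of pairwise beta diversity into replacement and richness-difference components (in the style of Podani and Carvalho; the paper's exact method may differ) can be sketched as follows. The species lists are hypothetical:

```python
# Sketch of one common beta-diversity partition (Carvalho et al. style):
# total Jaccard dissimilarity = replacement + richness difference.
# Illustrates the kind of decomposition the abstract refers to.

def beta_components(site1, site2):
    """Partition pairwise beta diversity for two species sets."""
    s1, s2 = set(site1), set(site2)
    a = len(s1 & s2)              # shared species
    b = len(s1 - s2)              # unique to site 1
    c = len(s2 - s1)              # unique to site 2
    tot = a + b + c
    beta_total = (b + c) / tot            # Jaccard dissimilarity
    beta_repl = 2 * min(b, c) / tot       # species replacement (turnover)
    beta_rich = abs(b - c) / tot          # richness difference
    return beta_total, beta_repl, beta_rich

# hypothetical species lists (one letter = one species)
bt, br, bri = beta_components("ABCDE", "ABCFGHI")
print(bt, br, bri)   # the two components sum to the total: bt == br + bri
```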

  7. Genes involved in complex adaptive processes tend to have highly conserved upstream regions in mammalian genomes

    Directory of Open Access Journals (Sweden)

    Kohane Isaac

    2005-11-01

    Full Text Available Abstract Background Recent advances in genome sequencing suggest a remarkable conservation in gene content of mammalian organisms. The similarity in gene repertoire present in different organisms has increased interest in studying regulatory mechanisms of gene expression aimed at elucidating the differences in phenotypes. In particular, a proximal promoter region contains a large number of regulatory elements that control the expression of its downstream gene. Although many studies have focused on identification of these elements, a broader picture on the complexity of transcriptional regulation of different biological processes has not been addressed in mammals. The regulatory complexity may strongly correlate with gene function, as different evolutionary forces must act on the regulatory systems under different biological conditions. We investigate this hypothesis by comparing the conservation of promoters upstream of genes classified in different functional categories. Results By conducting a rank correlation analysis between functional annotation and upstream sequence alignment scores obtained by human-mouse and human-dog comparison, we found a significantly greater conservation of the upstream sequence of genes involved in development, cell communication, neural functions and signaling processes than those involved in more basic processes shared with unicellular organisms such as metabolism and ribosomal function. This observation persists after controlling for G+C content. Considering conservation as a functional signature, we hypothesize a higher density of cis-regulatory elements upstream of genes participating in complex and adaptive processes. Conclusion We identified a class of functions that are associated with either high or low promoter conservation in mammals. 
We detected a significant tendency for complex and adaptive processes to be associated with higher promoter conservation, despite the fact that they have emerged…
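
    As a sketch of the rank correlation analysis described (functional category vs. upstream alignment score), here is a minimal Spearman implementation; the paired scores are hypothetical and no ties are assumed, so the classic formula applies:

```python
# Generic Spearman rank correlation, illustrating the kind of rank
# correlation analysis described (annotation category vs. upstream
# alignment score). No ties assumed.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    assert len(x) == len(y)
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical data: functional-complexity rank vs. conservation score
complexity = [1, 2, 3, 4, 5]
conservation = [0.10, 0.08, 0.35, 0.30, 0.90]
print(spearman(complexity, conservation))   # rho = 0.8 for this toy data
```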

  8. Optimizing the Search for High-z GRBs: The JANUS X-ray Coded Aperture Telescope

    CERN Document Server

    Burrows, D N; Palmer, D; Romano, P; Mangano, V; La Parola, V; Falcone, A D; Roming, P W A

    2011-01-01

    We discuss the optimization of gamma-ray burst (GRB) detectors with a goal of maximizing the detected number of bright high-redshift GRBs, in the context of design studies conducted for the X-ray transient detector on the JANUS mission. We conclude that the optimal energy band for detection of high-z GRBs is below about 30 keV. We considered both lobster-eye and coded aperture designs operating in this energy band. Within the available mass and power constraints, we found that the coded aperture mask was preferred for the detection of high-z bursts with bright enough afterglows to probe galaxies in the era of the Cosmic Dawn. This initial conclusion was confirmed through detailed mission simulations that found that the selected design (an X-ray Coded Aperture Telescope) would detect four times as many bright, high-z GRBs as the lobster-eye design we considered. The JANUS XCAT instrument will detect 48 GRBs with z > 5 and fluence S_x > 3 × 10^-7 erg cm^-2 in a two-year mission.

  9. Crystal structure of AFV3-109, a highly conserved protein from crenarchaeal viruses

    Directory of Open Access Journals (Sweden)

    Quevillon-Cheruel Sophie

    2007-01-01

    Full Text Available Abstract The extraordinary morphologies of viruses infecting hyperthermophilic archaea clearly distinguish them from bacterial and eukaryotic viruses. Moreover, their genomes code for proteins that to a large extent have no related sequences in the extant databases. However, a small pool of genes is shared by overlapping subsets of these viruses, and the most conserved gene, exemplified by ORF109 of the Acidianus Filamentous Virus 3, AFV3, is present on the genomes of members of three viral families, the Lipothrixviridae, Rudiviridae, and "Bicaudaviridae", as well as of the unclassified Sulfolobus Turreted Icosahedral Virus, STIV. We present here the crystal structure of the protein (Mr = 13.1 kD, 109 residues) encoded by the AFV3 ORF 109 in two different crystal forms at 1.5 and 1.3 Å resolution. The structure of AFV3-109 is a five-stranded β-sheet with loops on one side and three helices on the other. It forms a dimer adopting the shape of a cradle that encompasses the best conserved regions of the sequence. No protein with a related fold could be identified except for the ortholog from STIV1, whose structure was deposited at the Protein Data Bank. We could clearly identify a well-bound glycerol inside the cradle, contacting exclusively totally conserved residues. This interaction was confirmed in solution by fluorescence titration. Although the function of AFV3-109 cannot be deduced directly from its structure, structural homology with the STIV1 protein, and the size and charge distribution of the cavity, suggested it could interact with nucleic acids. Fluorescence quenching titrations also showed that AFV3-109 interacts with dsDNA. Genomic sequence analysis revealed bacterial homologs of AFV3-109 as part of putative, previously unidentified prophage sequences in some Firmicutes.

  10. A fully parallel, high precision, N-body code running on hybrid computing platforms

    CERN Document Server

    Capuzzo-Dolcetta, R; Punzo, D

    2012-01-01

    We present a new implementation of the numerical integration of the classical gravitational N-body problem, based on a high-order Hermite integration scheme with block time steps and a direct evaluation of the particle-particle forces. The main innovation of this code (called HiGPUs) is its full parallelization, exploiting both OpenMP and MPI in the use of the multicore Central Processing Units as well as either Compute Unified Device Architecture (CUDA) or OpenCL for the hosted Graphics Processing Units. We tested both the performance and accuracy of the code using up to 256 GPUs in the supercomputer IBM iDataPlex DX360M3 Linux Infiniband Cluster provided by the Italian supercomputing consortium CINECA, for values of N up to 8 million. We were able to follow the evolution of a system of 8 million bodies for a few crossing times, a task previously unreached by direct summation codes. The code is freely available to the scientific community.
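
    A minimal sketch of the fourth-order Hermite predictor-corrector scheme underlying direct-summation codes of this kind, reduced to two bodies in 2D with a shared time step (HiGPUs itself is parallel, 3D, and uses block time steps):

```python
# Minimal sketch of the 4th-order Hermite predictor-corrector scheme used
# by direct-summation N-body codes (pure Python, two bodies, 2D, shared
# time step; the real code is parallel, 3D, with block time steps).
import math

def acc_jerk(pos, vel, mass, G=1.0):
    """Pairwise gravitational acceleration and jerk for all bodies."""
    n = len(mass)
    acc = [[0.0, 0.0] for _ in range(n)]
    jerk = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rx = pos[j][0] - pos[i][0]; ry = pos[j][1] - pos[i][1]
            vx = vel[j][0] - vel[i][0]; vy = vel[j][1] - vel[i][1]
            r2 = rx * rx + ry * ry
            r3 = r2 * math.sqrt(r2)
            rv = (rx * vx + ry * vy) / r2
            acc[i][0] += G * mass[j] * rx / r3
            acc[i][1] += G * mass[j] * ry / r3
            jerk[i][0] += G * mass[j] * (vx - 3 * rv * rx) / r3
            jerk[i][1] += G * mass[j] * (vy - 3 * rv * ry) / r3
    return acc, jerk

def hermite_step(pos, vel, mass, dt):
    a0, j0 = acc_jerk(pos, vel, mass)
    # predictor: Taylor expansion including the jerk
    pp = [[pos[i][k] + vel[i][k]*dt + a0[i][k]*dt**2/2 + j0[i][k]*dt**3/6
           for k in (0, 1)] for i in range(len(mass))]
    vp = [[vel[i][k] + a0[i][k]*dt + j0[i][k]*dt**2/2
           for k in (0, 1)] for i in range(len(mass))]
    a1, j1 = acc_jerk(pp, vp, mass)
    # corrector: Hermite interpolation of the force over the step
    vc = [[vel[i][k] + (a0[i][k] + a1[i][k])*dt/2
           + (j0[i][k] - j1[i][k])*dt**2/12 for k in (0, 1)]
          for i in range(len(mass))]
    pc = [[pos[i][k] + (vel[i][k] + vc[i][k])*dt/2
           + (a0[i][k] - a1[i][k])*dt**2/12 for k in (0, 1)]
          for i in range(len(mass))]
    return pc, vc

def energy(pos, vel, mass, G=1.0):
    ke = sum(0.5*m*(vx*vx + vy*vy) for m, (vx, vy) in zip(mass, vel))
    rx = pos[1][0] - pos[0][0]; ry = pos[1][1] - pos[0][1]
    return ke - G*mass[0]*mass[1]/math.hypot(rx, ry)

# Equal-mass circular binary: separation 1, G = 1, speed sqrt(0.5) each.
mass = [1.0, 1.0]
pos = [[-0.5, 0.0], [0.5, 0.0]]
v = math.sqrt(0.5)
vel = [[0.0, -v], [0.0, v]]

e0 = energy(pos, vel, mass)
for _ in range(1000):
    pos, vel = hermite_step(pos, vel, mass, 0.001)
print(abs(energy(pos, vel, mass)/e0 - 1.0))  # tiny relative energy drift
```

    For this equal-mass circular binary the relative energy drift after 1000 steps stays many orders of magnitude below the initial energy, which is the accuracy property that makes long direct-summation integrations feasible.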

  11. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

    In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model. The learning-based R-D model is proposed to overcome the legacy "chicken-and-egg" dilemma in video coding. Second, a mixed R-D model based cooperative bargaining game theory is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra frame QP and the inter frame adaptive bit ratios are adjusted to give inter frames more bit resources, to maintain smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT based RC method can achieve much better R-D performance, quality smoothness, bit rate accuracy, buffer control results and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limits of the FixedQP method.
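
    The paper's bargaining-game allocation is not reproduced here, but the classical Lagrangian bit allocation it generalizes can be sketched: with a hypothetical power-law R-D model D_i = c_i R_i^(-k), minimizing total distortion under a frame bit budget equalizes the marginal distortion slopes across CTUs. All parameter values below are illustrative:

```python
# Illustration of classical R-D-model-based CTU bit allocation (not the
# paper's MLGT/bargaining method): with D_i = c_i * R_i**(-k), minimizing
# total distortion under a frame budget gives the closed form
# R_i proportional to c_i**(1/(k+1)), i.e. equal marginal slopes.

def allocate_bits(c, k, budget):
    w = [ci ** (1.0 / (k + 1)) for ci in c]
    s = sum(w)
    return [budget * wi / s for wi in w]

c = [4.0, 1.0, 2.0, 8.0]         # hypothetical CTU complexity parameters
k = 1.2                          # hypothetical R-D model exponent
R = allocate_bits(c, k, 60000)   # 60 kbit frame budget

print([round(r) for r in R])
print(sum(R))                    # equals the budget
# at the optimum, marginal slopes k*c_i*R_i**-(k+1) are equal across CTUs
slopes = [k * ci * ri ** -(k + 1) for ci, ri in zip(c, R)]
```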

  12. High-capacity quantum key distribution using Chebyshev-map values corresponding to Lucas numbers coding

    Science.gov (United States)

    Lai, Hong; Orgun, Mehmet A.; Pieprzyk, Josef; Li, Jing; Luo, Mingxing; Xiao, Jinghua; Xiao, Fuyuan

    2016-08-01

    We propose an approach that achieves high-capacity quantum key distribution using Chebyshev-map values corresponding to Lucas numbers coding. In particular, we encode a key with the Chebyshev-map values corresponding to Lucas numbers and then use k-Chebyshev maps to achieve consecutive and flexible key expansion and apply the pre-shared classical information between Alice and Bob and fountain codes for privacy amplification to solve the security of the exchange of classical information via the classical channel. Consequently, our high-capacity protocol does not have the limitations imposed by orbital angular momentum and down-conversion bandwidths, and it meets the requirements for longer distances and lower error rates simultaneously.
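
    Two mathematical ingredients named in the abstract can be illustrated directly: the Lucas sequence and the Chebyshev-map semigroup property T_m(T_n(x)) = T_mn(x), which is what makes consecutive key expansion with k-Chebyshev maps possible. Protocol details are omitted:

```python
# Sketch of the two ingredients named in the abstract: Lucas numbers and
# the Chebyshev-map semigroup property T_m(T_n(x)) = T_{mn}(x).
import math

def lucas(n):
    """First n Lucas numbers: 2, 1, 3, 4, 7, 11, ..."""
    seq = [2, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def chebyshev(k, x):
    """Chebyshev polynomial T_k on [-1, 1] via T_k(x) = cos(k*arccos x)."""
    return math.cos(k * math.acos(x))

print(lucas(7))                       # [2, 1, 3, 4, 7, 11, 18]
x = 0.3
lhs = chebyshev(2, chebyshev(3, x))   # T_2(T_3(x))
rhs = chebyshev(6, x)                 # equals T_6(x) by the semigroup law
print(abs(lhs - rhs))                 # ~0 (floating-point noise)
```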

  13. Quasi Cyclic Low Density Parity Check Code for High SNR Data Transfer

    Directory of Open Access Journals (Sweden)

    M. R. Islam

    2010-06-01

    Full Text Available An improved Quasi Cyclic Low Density Parity Check code (QC-LDPC) is proposed to reduce the complexity of the Low Density Parity Check code (LDPC) while obtaining similar performance. The proposed QC-LDPC presents an improved construction at high SNR with circulant sub-matrices. The proposed construction yields a performance gain of about 1 dB at a 0.0003 bit error rate (BER) and is tested on 4 different decoding algorithms. The proposed QC-LDPC is compared with the existing QC-LDPC, and the simulation results show that the proposed approach outperforms the existing one at high SNR. Simulations are also performed varying the number of horizontal sub-matrices, and the results show that the parity check matrix with smaller horizontal concatenation shows better performance.
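
    The circulant construction can be sketched as follows: an exponent (base) matrix of cyclic-shift values is expanded into a binary parity-check matrix, with -1 denoting an all-zero block. The base matrix below is illustrative, not the paper's design:

```python
# Sketch: building a QC-LDPC parity-check matrix from an exponent (base)
# matrix of circulant-permutation shifts. Entry s >= 0 expands to the
# z x z identity cyclically shifted by s; entry -1 expands to a zero block.
# The base matrix is hypothetical, not the paper's construction.

def expand_qc(base, z):
    """Expand an exponent matrix into a binary parity-check matrix H."""
    rows, cols = len(base), len(base[0])
    H = [[0] * (cols * z) for _ in range(rows * z)]
    for bi, row in enumerate(base):
        for bj, shift in enumerate(row):
            if shift < 0:
                continue
            for r in range(z):
                H[bi * z + r][bj * z + (r + shift) % z] = 1
    return H

base = [[0, 1, 2, -1],
        [2, -1, 0, 1],
        [-1, 2, 1, 0]]           # hypothetical 3x4 exponent matrix
z = 5                            # circulant (lifting) size
H = expand_qc(base, z)

print(len(H), len(H[0]))         # 15 x 20 binary matrix
print(all(sum(row) == 3 for row in H))  # row weight = nonneg entries per base row
```

    Because H is fully determined by the small exponent matrix, storage and encoding complexity drop sharply compared with a random LDPC matrix of the same size, which is the complexity reduction the abstract refers to.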

  14. High-capacity three-party quantum secret sharing with superdense coding

    Institute of Scientific and Technical Information of China (English)

    Gu Bin; Li Chuan-Qi; Xu Fei; Chen Yu-Lin

    2009-01-01

    This paper presents a scheme for high-capacity three-party quantum secret sharing with quantum superdense coding, following some ideas in the work by Liu et al (2002 Phys. Rev. A 65 022304) and the quantum secret sharing scheme by Deng et al (2008 Phys. Lett. A 372 1957). Instead of using two sets of nonorthogonal states, the boss Alice needs only to prepare a sequence of Einstein-Podolsky-Rosen pairs in d dimensions. The two agents Bob and Charlie encode their information with dense coding unitary operations, and security is checked by inserting decoy photons. The scheme has a high capacity and intrinsic efficiency as each pair can carry 2 log2 d bits of information, and almost all the pairs can be used for carrying useful information.

  15. High-capacity quantum key distribution using Chebyshev-map values corresponding to Lucas numbers coding

    Science.gov (United States)

    Lai, Hong; Orgun, Mehmet A.; Pieprzyk, Josef; Li, Jing; Luo, Mingxing; Xiao, Jinghua; Xiao, Fuyuan

    2016-11-01

    We propose an approach that achieves high-capacity quantum key distribution using Chebyshev-map values corresponding to Lucas numbers coding. In particular, we encode a key with the Chebyshev-map values corresponding to Lucas numbers and then use k-Chebyshev maps to achieve consecutive and flexible key expansion and apply the pre-shared classical information between Alice and Bob and fountain codes for privacy amplification to solve the security of the exchange of classical information via the classical channel. Consequently, our high-capacity protocol does not have the limitations imposed by orbital angular momentum and down-conversion bandwidths, and it meets the requirements for longer distances and lower error rates simultaneously.

  16. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    Science.gov (United States)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization specific to a particular application field, its use could also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations, or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements.

  17. Detailed description and user's manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high burnup fuel behavior in normal operation and power transient conditions. In the high burnup region, phenomena occur that differ qualitatively from extrapolations of mid-burnup behavior. To analyze these phenomena, EXBURN-I incorporates new models, such as pellet thermal conductivity change, burnup-dependent FP gas release rate, and cladding oxide layer growth, into the basic structure of the low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, its models, and its materials properties. It also includes a detailed input manual, sample output, etc. (author). 55 refs.

  18. The Fisher Kernel Coding Framework for High Spatial Resolution Scene Classification

    Directory of Open Access Journals (Sweden)

    Bei Zhao

    2016-02-01

    Full Text Available High spatial resolution (HSR) image scene classification is aimed at bridging the semantic gap between low-level features and high-level semantic concepts, which is a challenging task due to the complex distribution of ground objects in HSR images. Scene classification based on the bag-of-visual-words (BOVW) model is one of the most successful ways to acquire the high-level semantic concepts. However, the BOVW model assigns local low-level features to their closest visual words in the “visual vocabulary” (the codebook) obtained by k-means clustering, which discards too many useful details of the low-level features in HSR images. In this paper, a feature coding method under the Fisher kernel (FK) coding framework is introduced to extend the BOVW model by characterizing the low-level features with a gradient vector instead of the count statistics in the BOVW model, which results in a significant decrease in the codebook size and an acceleration of the codebook learning process. By considering the differences in the distributions of the ground objects in different regions of the images, a local FK (LFK) method is proposed for HSR image scene classification. The experimental results show that the proposed scene classification methods under the FK coding framework can greatly reduce the computational cost, and can obtain a better scene classification accuracy than the methods based on the traditional BOVW model.
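
    A minimal sketch of Fisher-kernel encoding with respect to a diagonal-covariance GMM, keeping only the gradients with respect to the means (the full framework also uses weight and variance gradients). The toy GMM and descriptors are hypothetical; in the paper the GMM is learned from low-level HSR image features:

```python
# Sketch: Fisher-vector encoding w.r.t. a diagonal-covariance GMM,
# mean-gradients only. Toy GMM and descriptors are hypothetical.
import math

def gauss(x, mu, sigma):
    """Diagonal Gaussian density."""
    p = 1.0
    for xd, md, sd in zip(x, mu, sigma):
        p *= math.exp(-0.5 * ((xd - md) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    return p

def fisher_vector_means(X, weights, means, sigmas):
    K, D, N = len(weights), len(means[0]), len(X)
    fv = [0.0] * (K * D)
    for x in X:
        dens = [w * gauss(x, m, s) for w, m, s in zip(weights, means, sigmas)]
        tot = sum(dens)
        for k in range(K):
            g = dens[k] / tot                      # soft assignment gamma_k
            for d in range(D):
                fv[k * D + d] += g * (x[d] - means[k][d]) / sigmas[k][d]
    # standard 1/(N*sqrt(pi_k)) normalization, one weight per dimension
    wpd = [wt for wt in weights for _ in range(D)]
    return [v / (N * math.sqrt(wt)) for v, wt in zip(fv, wpd)]

weights = [0.5, 0.5]                 # hypothetical 2-component GMM in 2D
means = [[0.0, 0.0], [10.0, 10.0]]
sigmas = [[1.0, 1.0], [1.0, 1.0]]
X = [[0.0, 0.0], [0.1, -0.1], [10.0, 10.0]]   # toy local descriptors

fv = fisher_vector_means(X, weights, means, sigmas)
print(len(fv))   # K*D = 4 dimensions, vs. K counts in a BOVW histogram
```

    The encoding stores first-order statistics per component rather than mere counts, which is why a much smaller codebook suffices compared with BOVW.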

  19. Assessment of the effects of farming and conservation programs on pesticide deposition in high plains wetlands.

    Science.gov (United States)

    Belden, Jason B; Hanson, Brittany Rae; McMurry, Scott T; Smith, Loren M; Haukos, David A

    2012-03-20

    We examined pesticide contamination in sediments from depressional playa wetlands embedded in the three dominant land-use types in the western High Plains and Rainwater Basin of the United States including cropland, perennial grassland enrolled in conservation programs (e.g., Conservation Reserve Program [CRP]), and native grassland or reference condition. Two hundred and sixty-four playas, selected from the three land-use types, were sampled from Nebraska and Colorado in the north to Texas and New Mexico in the south. Sediments were examined for most of the commonly used agricultural pesticides. Atrazine, acetochlor, metolachlor, and trifluralin were the most commonly detected pesticides in the northern High Plains and Rainwater Basin. Atrazine, metolachlor, trifluralin, and pendimethalin were the most commonly detected pesticides in the southern High Plains. The top 5-10% of playas contained herbicide concentrations high enough to pose a hazard to plants. However, insecticides and fungicides were rarely detected. Pesticide occurrence and concentrations were higher in wetlands surrounded by cropland as compared to native grassland and CRP perennial grasses. The CRP, which is the largest conservation program in the U.S., was protective and had lower pesticide concentrations compared to cropland.

  20. The myofibrillar protein, projectin, is highly conserved across insect evolution except for its PEVK domain.

    Science.gov (United States)

    Ayme-Southgate, Agnes J; Southgate, Richard J; Philipp, Richard A; Sotka, Erik E; Kramp, Catherine

    2008-12-01

    All striated muscles respond to stretch by a delayed increase in tension. This physiological response, known as stretch activation, is, however, predominantly found in vertebrate cardiac muscle and insect asynchronous flight muscles. Stretch activation relies on an elastic third filament system composed of giant proteins known as titin in vertebrates or kettin and projectin in insects. The projectin insect protein functions jointly as a "scaffold and ruler" system during myofibril assembly and as an elastic protein during stretch activation. An evolutionary analysis of the projectin molecule could potentially provide insight into how distinct protein regions may have evolved in response to different evolutionary constraints. We mined candidate genes in representative insect species from Hemiptera to Diptera, from published and novel genome sequence data, and carried out a detailed molecular and phylogenetic analysis. The general domain organization of projectin is highly conserved, as are the protein sequences of its two repeated regions: the immunoglobulin type C and fibronectin type III domains. The conservation in structure and sequence is consistent with the proposed function of projectin as a scaffold and ruler. In contrast, the amino acid sequences of the elastic PEVK domains are noticeably divergent, although their length and overall unusual amino acid makeup are conserved. These patterns suggest that the PEVK region working as an unstructured domain can still maintain its dynamic, and even its three-dimensional, properties, without the need for strict amino acid conservation. Phylogenetic analysis of the projectin proteins also supports a reclassification of the Hymenoptera in relation to Diptera and Coleoptera.

  1. THATCH: A computer code for modelling thermal networks of high- temperature gas-cooled nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kroeger, P.G.; Kennett, R.J.; Colman, J.; Ginsberg, T. (Brookhaven National Lab., Upton, NY (United States))

    1991-10-01

    This report documents the THATCH code, which can be used to model general thermal and flow networks of solids and coolant channels in two-dimensional r-z geometries. The main application of THATCH is to model reactor thermo-hydraulic transients in High-Temperature Gas-Cooled Reactors (HTGRs). The available modules simulate pressurized or depressurized core heatup transients, and heat transfer to general exterior sinks or to specific passive Reactor Cavity Cooling Systems, which can be air- or water-cooled. Graphite oxidation during air or water ingress can be modelled, including the effects of added combustion products on the gas flow and the additional chemical energy release. A point kinetics model is available for analyzing reactivity excursions, for instance due to water ingress, and also for hypothetical no-scram scenarios. For most HTGR transients, which generally range over hours, a user-selected nodalization of the core in r-z geometry is used. However, a separate model of heat transfer in the symmetry element of each fuel element is also available for very rapid transients. This model can be applied coupled to the traditional coarser r-z nodalization. This report describes the mathematical models used in the code and the method of solution. It describes the code and its various sub-elements. Details of the input data and file usage, with file formats, are given for the code, as well as for several preprocessing and postprocessing options. The THATCH model of the currently applicable 350 MW(th) reactor is described. Input data for four sample cases are given, with output available in fiche form. Installation requirements and code limitations, as well as the most common error indications, are listed. 31 refs., 23 figs., 32 tabs.

  2. Energy conservation and high-frequency damping in numerical time integration

    DEFF Research Database (Denmark)

    Krenk, Steen

    2008-01-01

    Momentum and energy conserving time integration procedures are receiving increased interest due to the central role of conservation properties in relation to the problems under investigation. However, most problems in structural dynamics are based on models that are first discretized in space, and this often leads to a fairly large number of high-frequency modes that are not represented well (and occasionally directly erroneously) by the model. It is desirable to cure this problem by devising algorithms that include the possibility of introducing algorithmic energy dissipation of the high-frequency modes. The problem is well known from classic collocation-based algorithms, notably various forms of the Newmark algorithm, where the equation of motion is supplemented by approximate relations between displacement, velocity and acceleration. Here adjustment of the algorithmic parameters can be used…

  4. Energy conservation and high-frequency damping in numerical time-integration

    DEFF Research Database (Denmark)

    Krenk, Steen

    2007-01-01

    Momentum and energy conserving time integration procedures are receiving increased interest due to the central role of conservation properties in relation to the problems under investigation. However, most problems in structural dynamics are based on models that are first discretized in space, and this often leads to a fairly large number of high-frequency modes that are not represented well (and occasionally directly erroneously) by the model. It is desirable to cure this problem by devising algorithms that include the possibility of introducing algorithmic energy dissipation of the high-frequency modes. The problem is well known from classic collocation-based algorithms, notably various forms of the Newmark algorithm, where the equation of motion is supplemented by approximate relations between displacement, velocity and acceleration. Here adjustment of the algorithmic parameters can be used…
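
    The trade-off described above can be seen on a single undamped linear oscillator: the Newmark scheme with gamma = 1/2 (average acceleration) preserves the discrete energy exactly, while gamma > 1/2 introduces algorithmic dissipation (here it damps the single mode present; in a multi-degree-of-freedom model the aim is to damp the high-frequency modes). Parameter values are illustrative:

```python
# SDOF illustration: Newmark with gamma = 1/2, beta = 1/4 (average
# acceleration / trapezoidal rule) conserves the energy of an undamped
# linear oscillator; gamma > 1/2 introduces algorithmic dissipation.

def newmark_sdof(omega, dt, steps, gamma, beta, u0=1.0, v0=0.0):
    """Integrate u'' + omega^2 u = 0; return (u, v) after `steps` steps."""
    u, v = u0, v0
    a = -omega**2 * u                       # initial acceleration
    for _ in range(steps):
        # implicit Newmark update, solved in closed form for the linear SDOF
        u_pred = u + dt*v + dt*dt*(0.5 - beta)*a
        a_new = -omega**2 * u_pred / (1.0 + omega**2 * beta * dt*dt)
        u = u_pred + beta*dt*dt*a_new
        v = v + dt*((1.0 - gamma)*a + gamma*a_new)
        a = a_new
    return u, v

def energy(u, v, omega):
    return 0.5*(v*v + omega*omega*u*u)

omega, dt, steps = 10.0, 0.05, 200
e0 = energy(1.0, 0.0, omega)

# energy-conserving: gamma = 1/2, beta = 1/4 (trapezoidal rule)
u, v = newmark_sdof(omega, dt, steps, gamma=0.5, beta=0.25)
print(energy(u, v, omega) / e0)   # stays at 1 (to round-off)

# algorithmically damped: gamma = 0.6, beta = (gamma + 1/2)**2 / 4
u, v = newmark_sdof(omega, dt, steps, gamma=0.6, beta=0.3025)
print(energy(u, v, omega) / e0)   # decays well below 1
```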

  5. Targeted carbon conservation at national scales with high-resolution monitoring.

    Science.gov (United States)

    Asner, Gregory P; Knapp, David E; Martin, Roberta E; Tupayachi, Raul; Anderson, Christopher B; Mascaro, Joseph; Sinca, Felipe; Chadwick, K Dana; Higgins, Mark; Farfan, William; Llactayo, William; Silman, Miles R

    2014-11-25

    Terrestrial carbon conservation can provide critical environmental, social, and climate benefits. Yet, the geographically complex mosaic of threats to, and opportunities for, conserving carbon in landscapes remain largely unresolved at national scales. Using a new high-resolution carbon mapping approach applied to Perú, a megadiverse country undergoing rapid land use change, we found that at least 0.8 Pg of aboveground carbon stocks are at imminent risk of emission from land use activities. Map-based information on the natural controls over carbon density, as well as current ecosystem threats and protections, revealed three biogeographically explicit strategies that fully offset forthcoming land-use emissions. High-resolution carbon mapping affords targeted interventions to reduce greenhouse gas emissions in rapidly developing tropical nations.

  6. Entropy Viscosity Method for High-Order Approximations of Conservation Laws

    KAUST Repository

    Guermond, J. L.

    2010-09-17

    A stabilization technique for conservation laws is presented. It introduces into the governing equations a nonlinear dissipation function of the residual of the associated entropy equation, bounded from above by a first-order viscous term. Different two-dimensional test cases are simulated (a 2D Burgers problem, the "KPP rotating wave", and the Euler system) using high-order methods: spectral elements or Fourier expansions. Details on the tuning of the parameters controlling the entropy viscosity are given. © 2011 Springer.
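
    A one-dimensional sketch of the entropy-viscosity idea for Burgers' equation u_t + (u^2/2)_x = 0, with entropy E = u^2/2 and entropy flux F = u^3/3: the local viscosity is proportional to the entropy residual but capped by a first-order viscous bound. The coefficients c_e and c_max are illustrative tuning parameters, as in the paper:

```python
# 1D sketch of the entropy-viscosity construction for Burgers' equation:
# viscosity ~ |entropy residual|, capped by a first-order upwind-type bound.
# Coefficients c_e, c_max are illustrative tuning parameters.

def entropy_viscosity(u_new, u_old, dt, h, c_e=1.0, c_max=0.5):
    n = len(u_new)
    E_new = [0.5*a*a for a in u_new]          # entropy E = u^2/2
    E_old = [0.5*a*a for a in u_old]
    F = [a**3/3.0 for a in u_new]             # entropy flux F = u^3/3
    mean = sum(E_new)/n
    norm = max(abs(e - mean) for e in E_new) or 1.0
    umax = max(abs(a) for a in u_new)
    visc = []
    for i in range(n):
        # entropy residual: dE/dt + dF/dx (periodic central difference)
        r = (E_new[i] - E_old[i])/dt + (F[(i+1) % n] - F[(i-1) % n])/(2*h)
        nu_e = c_e * h*h * abs(r)/norm         # residual-based viscosity
        visc.append(min(nu_e, c_max*h*umax))   # first-order cap
    return visc

n, h, dt = 64, 1.0/64, 1e-3
u_old = [1.0 if i*h < 0.5 else 0.0 for i in range(n)]          # step (shock)
u_new = [1.0 if i*h < 0.5 + 0.5*dt else 0.0 for i in range(n)] # shock speed 1/2
nu = entropy_viscosity(u_new, u_old, dt, h)
print(max(nu))   # hits the first-order cap only near the discontinuity
print(nu[5])     # zero in the smooth (constant) region
```

    On a step profile the viscosity saturates at the first-order cap only near the discontinuity and vanishes in the constant regions; this localization is what lets the method stabilize shocks without degrading the high-order accuracy elsewhere.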

  7. Current non-conservation effects in ultra-high energy neutrino interactions

    CERN Document Server

    Fiore, R

    2010-01-01

    The overall hardness scale of ultra-high energy neutrino-nucleon interactions is usually estimated as $Q^2 \sim m_W^2$. The effect of non-conservation of weak currents pushes this scale up to the top quark mass squared and changes the dynamics of the scattering process. The Double Leading Log Approximation provides a simple and numerically accurate formula for the top-bottom contribution to the total cross section.

  8. A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding

    Science.gov (United States)

    Simon, M. K.; Divsalar, D.

    2001-01-01

    Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.

  9. Conservative high-order-accurate finite-difference methods for curvilinear grids

    Science.gov (United States)

    Rai, Man M.; Chakravarthy, Sukumar

    1993-01-01

    Two fourth-order-accurate finite-difference methods for numerically solving hyperbolic systems of conservation equations on smooth curvilinear grids are presented. The first method uses the differential form of the conservation equations; the second method uses the integral form of the conservation equations. Modifications to these schemes, which are required near boundaries to maintain overall high-order accuracy, are discussed. An analysis that demonstrates the stability of the modified schemes is also provided. Modifications to one of the schemes to make it total variation diminishing (TVD) are also discussed. Results that demonstrate the high-order accuracy of both schemes are included in the paper. In particular, a Ringleb-flow computation demonstrates the high-order accuracy and the stability of the boundary and near-boundary procedures. A second computation of supersonic flow over a cylinder demonstrates the shock-capturing capability of the TVD methodology. An important contribution of this paper is the clear demonstration that higher order accuracy leads to increased computational efficiency.
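    The flux-difference (conservation) form that both kinds of scheme share can be sketched on a periodic 1D grid; the fourth-order central interface flux used below is a standard textbook choice standing in for the paper's schemes, which additionally require boundary closures and TVD modifications:

```python
import numpy as np

def conservative_step(u, dt, dx, f=lambda u: 0.5 * u**2):
    """One forward-Euler step of u_t + f(u)_x = 0 in conservation (flux-difference)
    form on a periodic grid, using the classical fourth-order central interface
    flux.  Illustrative sketch only; default f is the Burgers flux."""
    fu = f(u)
    # Fourth-order interface flux: f_{i+1/2} = (7(f_i + f_{i+1}) - (f_{i-1} + f_{i+2})) / 12
    f_half = (7.0 * (fu + np.roll(fu, -1)) - (np.roll(fu, 1) + np.roll(fu, -2))) / 12.0
    # Flux-difference update: whatever leaves cell i enters cell i+1, so sum(u) is conserved.
    return u - dt / dx * (f_half - np.roll(f_half, 1))
```

    Because every interface flux appears once with each sign, the discrete total of u is conserved to machine precision, which is the defining property of this form.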

  10. A High-Throughput Binary Arithmetic Coding Architecture for H.264/AVC CABAC

    Science.gov (United States)

    Liu, Yizhong; Song, Tian; Shimamoto, Takashi

    In this paper, we propose a high-throughput binary arithmetic coding architecture for CABAC (Context Adaptive Binary Arithmetic Coding), which is one of the entropy coding tools used in the H.264/AVC main and high profiles. The full CABAC encoding functions, including binarization, context model selection, arithmetic encoding and bits generation, are implemented in this proposal. The binarization and context model selection are implemented in a proposed binarizer, in which a FIFO is used to pack the binarization results and output 4 bins in one clock. The arithmetic encoding and bits generation are implemented in a four-stage pipeline with an encoding ability of 4 bins/clock. In order to improve the processing speed, the context variable access and update for 4 bins are parallelized and the pipeline path is balanced. Also, because of the outstanding-bits issue, a bits packing and generation strategy for 4-bin parallel processing is proposed. After being implemented in Verilog-HDL and synthesized with Synopsys Design Compiler using 90 nm libraries, this proposal can work at a clock frequency of 250 MHz and takes up about 58K standard cells, 3.2 Kbits of register files and 27.6 Kbits of ROM. A throughput of 1000M bins per second can be achieved in this proposal for HDTV applications.
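    The interval subdivision at the heart of binary arithmetic coding can be sketched with exact rationals; real CABAC instead uses a renormalized 9-bit range, adaptive context models, and the outstanding-bits mechanism mentioned above, none of which are reproduced in this illustrative sketch:

```python
from fractions import Fraction

def encode(bins, p1):
    """Arithmetic-encode a list of binary symbols with a static probability
    P(bin = 1) = p1, using exact rational interval subdivision."""
    low, width = Fraction(0), Fraction(1)
    for b in bins:
        split = width * (1 - p1)          # [low, low + split) codes a 0
        if b:
            low, width = low + split, width - split
        else:
            width = split
    return low + width / 2                # any rational inside the final interval

def decode(code, n, p1):
    """Recover n symbols by repeating the same subdivision on the code point."""
    low, width, out = Fraction(0), Fraction(1), []
    for _ in range(n):
        split = width * (1 - p1)
        if code < low + split:
            out.append(0)
            width = split
        else:
            out.append(1)
            low, width = low + split, width - split
    return out
```

    Skewed probabilities shrink the final interval slowly for likely symbols, which is why arithmetic coding approaches the entropy of the bin stream.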

  11. Single stock dynamics on high-frequency data: from a compressed coding perspective.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    Full Text Available High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS), and then are coupled together to reveal a single stock dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force stimulating both the aggregation of large trading volumes and that of transaction numbers. The state of system-wise synchrony is manifested with very frequent recurrence in the stock dynamics. And this data-driven dynamic mechanism is seen to vary correspondingly as the global market transits in and out of contraction-expansion cycles. These results not only elaborate the stock dynamics of interest to a fuller extent, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially when the current financial market is increasingly powered by high-frequency trading via computer algorithms, rather than by individual investors.

  13. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
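    A small example of the affine relations such a framework manipulates: under an HPF CYCLIC(b) distribution over P processors, the owner and local address of each array element are integer linear functions of the global index. This is an illustrative sketch of the mapping, not the paper's code generator:

```python
def cyclic_owner(i, b, P):
    """Owner/local-address map for an HPF CYCLIC(b) distribution over P
    processors: element i lives on one processor, in one local block, at
    one offset.  These are the integer affine constraints a linear-algebra
    framework manipulates to derive local loop bounds."""
    owner = (i // b) % P           # processor that stores element i
    block = i // (b * P)           # local block index on that processor
    offset = i % b                 # position inside the block
    return owner, block, offset

def local_address(i, b, P):
    """Flatten (block, offset) into the local array index on the owner."""
    owner, block, offset = cyclic_owner(i, b, P)
    return owner, block * b + offset
```

    The map is invertible (i = (block*P + owner)*b + offset), which is exactly what lets a compiler enumerate the local iteration set of each processor with tight loop bounds.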

  14. Analytical Study of High Pitch Delay Resolution Technique for Tonal Speech Coding

    Directory of Open Access Journals (Sweden)

    Suphattharachai Chomphan

    2012-01-01

    Full Text Available Problem statement: In tonal-language speech, since tone plays an important role in both the naturalness and the intelligibility of the speech, it must be treated appropriately in a speech coder algorithm. Approach: This study presents an analytical study of the technique of High Pitch Delay Resolution (HPDR) applied to the adaptive codebook of the core coder of a Multi-Pulse based Code Excited Linear Predictive (MP-CELP) coder. Results: The experimental results show that the speech quality of the MP-CELP speech coder with the HPDR technique is improved above that of the conventional coder. An optimum resolution of pitch delay is also presented. Conclusion: From the analytical study, it has been found that the proposed technique can improve the speech coding quality.

  15. Stitching Codeable Circuits: High School Students' Learning About Circuitry and Coding with Electronic Textiles

    Science.gov (United States)

    Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.

    2017-10-01

    Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating programming sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high school students ages 16-17 years who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability in designing and remixing program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.

  16. A HIGH-PERFORMANCE VLSI ARCHITECTURE OF EBCOT BLOCK CODING IN JPEG2000

    Institute of Scientific and Technical Information of China (English)

    Liu Kai; Wu Chengke; Li Yunsong

    2006-01-01

    The paper presents a new architecture composed of a bit-plane-parallel coder for the Embedded Block Coding with Optimized Truncation (EBCOT) entropy encoder used in JPEG2000. In the architecture, the coding information of each bit plane can be obtained simultaneously and processed in parallel. Compared with other architectures, it has the advantages of high parallelism and no wasted clock cycles for a single point. The experimental results show that it reduces the processing time by about 86% compared with the bit-plane-sequential scheme. A Field Programmable Gate Array (FPGA) prototype chip was designed, and simulation results show that it can process 512×512 gray-scale images at more than 30 frames per second at 52 MHz.

  17. A Two-Dimensional Fem Code for Impedance Calculation in High Frequency Domain

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lanfa; /SLAC; Lee, Lie-Quan; /SLAC; Stupakov, Gennady; /SLAC

    2010-08-25

    A new method, using the parabolic equation (PE), for calculating the high-frequency impedance of a small-angle taper (or collimator) was developed in [1]. One of the most important advantages of the PE approach is that it eliminates the spatial scale of the small wavelength from the problem. As a result, only coarser spatial meshes are needed in calculating the numerical solution of the PE. We have developed a new code based on the Finite Element Method (FEM) which can handle an arbitrary transition profile and speeds up the calculation by orders of magnitude. As a first step, we completed and benchmarked a two-dimensional code. It can be upgraded to three-dimensional geometry.

  18. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

    Science.gov (United States)

    Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

    2004-01-01

    This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

  19. A high-order public domain code for direct numerical simulations of turbulent combustion

    CERN Document Server

    Babkovskaia, N; Brandenburg, A

    2010-01-01

    A high-order scheme for direct numerical simulations of turbulent combustion is discussed. Its implementation in the massively parallel and publicly available Pencil Code is validated with a focus on hydrogen combustion. Ignition delay times (0D) and laminar flame velocities (1D) are calculated and compared with results from the commercially available Chemkin code. The scheme is verified to be fifth order in space. Upon doubling the resolution, a 32-fold increase in the accuracy of the flame front is demonstrated. Finally, turbulent and spherical flame front velocities are also calculated, and the implementation of the non-reflecting so-called Navier-Stokes Characteristic Boundary Condition is validated in all three directions.

  20. A high capacity text steganography scheme based on LZW compression and color coding

    Directory of Open Access Journals (Sweden)

    Aruna Malik

    2017-02-01

    Full Text Available In this paper, the capacity and security issues of text steganography are addressed by employing the LZW compression technique and a color-coding-based approach. The proposed technique uses the forward mail platform to hide the secret data. The algorithm first compresses the secret data and then hides the compressed data in the email addresses and in the cover message of the email. The secret data bits are embedded in the message (or cover text) by coloring it according to a color coding table. Experimental results show that the proposed method not only produces a high embedding capacity but also reduces computational complexity. Moreover, the security of the proposed method is significantly improved by employing stego keys. The superiority of the proposed method has been experimentally verified by comparison with recently developed existing techniques.
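    The compression stage can be sketched with a classic LZW coder (illustrative only; the paper's embedding of the codes into email addresses and color-coded cover text is not shown):

```python
def lzw_compress(text):
    """Classic LZW: emit the dictionary index of the longest known prefix,
    then extend the dictionary with that prefix plus the next character.
    The dictionary starts with all single bytes (codes 0-255)."""
    dictionary = {chr(i): i for i in range(256)}
    w, out = "", []
    for ch in text:
        wc = w + ch
        if wc in dictionary:
            w = wc                              # keep growing the current match
        else:
            out.append(dictionary[w])           # emit code for the longest match
            dictionary[wc] = len(dictionary)    # register the new phrase
            w = ch
    if w:
        out.append(dictionary[w])
    return out
```

    On repetitive secret data the output code list is much shorter than the input, which is what raises the embedding capacity before the color-coding step.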

  1. Irregular and regular LDPC codes with high spectrum efficiency modulation in image transmission over fading channel

    Institute of Scientific and Technical Information of China (English)

    Ma Piming; Yuan Dongfeng

    2005-01-01

    If the degree distribution is chosen carefully, irregular low-density parity-check (LDPC) codes can outperform regular ones. An image transmission system is proposed that combines regular and irregular LDPC codes with 16QAM/64QAM modulation to improve both efficiency and reliability. Simulation results show that LDPC codes are good coding schemes for image communication over fading channels, with low system complexity. Moreover, irregular codes obtain a coding gain of about 0.7 dB over regular ones at a BER of 10^-4, so the irregular LDPC codes are more suitable for image transmission than the regular codes.

  2. Game-Theoretic Rate-Distortion-Complexity Optimization of High Efficiency Video Coding

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Milani, Simone; Forchhammer, Søren

    2013-01-01

    This paper presents an algorithm for rate-distortion-complexity optimization for the emerging High Efficiency Video Coding (HEVC) standard, whose high computational requirements urge the need for low-complexity optimization algorithms. Optimization approaches need to specify different complexity profiles in order to tailor the computational load to the different hardware and power-supply resources of devices. In this work, we focus on optimizing the quantization parameter and partition depth in HEVC via a game-theoretic approach. The proposed rate control strategy alone provides a 0.2 dB improvement.
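    The underlying rate-distortion trade-off can be sketched as the standard Lagrangian mode decision an HEVC encoder makes per block; the game-theoretic part of the paper concerns how the parameters entering this cost are negotiated against a complexity budget, which is not reproduced here:

```python
def best_mode(candidates, lam):
    """Standard Lagrangian rate-distortion decision: pick the coding option
    minimizing J = D + lam * R.  Each candidate is a dict with distortion
    "D" and rate "R" (illustrative field names)."""
    return min(candidates, key=lambda c: c["D"] + lam * c["R"])
```

    Raising the Lagrange multiplier lam shifts the choice toward cheaper (lower-rate) options, which is how the quantization parameter steers the rate control.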

  3. Lithographically encoded polymer microtaggant using high-capacity and error-correctable QR code for anti-counterfeiting of drugs.

    Science.gov (United States)

    Han, Sangkwon; Bae, Hyung Jong; Kim, Junhoi; Shin, Sunghwan; Choi, Sung-Eun; Lee, Sung Hoon; Kwon, Sunghoon; Park, Wook

    2012-11-20

    A QR-coded microtaggant for the anti-counterfeiting of drugs is proposed that can provide high capacity and error-correction capability. It is fabricated lithographically in a microfluidic channel with special consideration of the island patterns in the QR Code. The microtaggant is incorporated in the drug capsule ("on-dose authentication") and can be read by a simple smartphone QR Code reader application when removed from the capsule and washed free of drug.

  4. Wavefront coding for fast, high-resolution light-sheet microscopy (Conference Presentation)

    Science.gov (United States)

    Olarte, Omar E.; Licea-Rodriguez, Jacob; Loza-Alvarez, Pablo

    2017-02-01

    Some biological experiments demand the observation of dynamic processes in 3D with high spatiotemporal resolution. The use of wavefront coding to extend the depth of field (DOF) of the collection arm of a light-sheet microscope is an interesting alternative for fast 3D imaging. Under this scheme, the 3D features of the sample are captured at high volumetric rates while the light sheet is swept rapidly within the extended DOF. The DOF is extended by coding the pupil function of the imaging lens using a custom-designed phase mask. A posterior restoration step is required to decode the information in the captured images, based on the applied phase mask [1]. This hybrid optical-digital approach is known as wavefront coding (WFC). Previously, we demonstrated this method for performing fast 3D imaging of biological samples at medium resolution [2]. In this work, we present the extension of this approach to high-resolution microscopes. Under these conditions, the effective DOF of a standard high-NA objective is a few micrometers. Here we demonstrate that by the use of WFC we can extend the DOF by more than one order of magnitude while keeping high-resolution imaging. This is demonstrated for two designed phase masks using zebrafish and C. elegans samples. [1] Olarte, O.E., Andilla, J., Artigas, D., and Loza-Alvarez, P., "Decoupled Illumination-Detection Microscopy. Selected Optics in Year 2015," in Optics and Photonics News 26, p. 41 (2015). [2] Olarte, O.E., Andilla, J., Artigas, D., and Loza-Alvarez, P., "Decoupled illumination detection in light sheet microscopy for fast volumetric imaging," Optica 2(8), 702 (2015).
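    A minimal sketch of a wavefront-coding pupil is the classic cubic phase mask; the paper uses custom-designed masks, and the strength parameter alpha below is purely illustrative:

```python
import numpy as np

def cubic_phase_mask(n, alpha):
    """Pupil-plane cubic phase mask P(x, y) = exp(i * alpha * (x^3 + y^3)),
    the classic wavefront-coding element for extending depth of field.
    n is the grid size; alpha is an illustrative mask strength."""
    x = np.linspace(-1.0, 1.0, n)          # normalized pupil coordinates
    X, Y = np.meshgrid(x, x)
    return np.exp(1j * alpha * (X**3 + Y**3))
```

    Being a pure phase element, the mask has unit modulus everywhere, so it redistributes defocus blur without absorbing light; the resulting near-invariant PSF is what the digital restoration step deconvolves.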

  5. Implementing Scientific Simulation Codes Highly Tailored for Vector Architectures Using Custom Configurable Computing Machines

    Science.gov (United States)

    Rutishauser, David

    2006-01-01

    The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters.

  6. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2009-09-01

    A tritium permeation analysis code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI), coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility for system configurations and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behavior in a very high temperature reactor/high temperature steam electrolysis system has been analyzed with the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of the tritium released from the core is transferred to the product hydrogen.
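    A lumped sketch of the kind of species mass balance such a code integrates is shown below; the rate constants are illustrative placeholders, not TPAC parameters:

```python
def tritium_balance_step(N, dt, source, k_purif, k_leak, decay=1.78e-9):
    """One explicit Euler step of a lumped tritium inventory balance,
    dN/dt = S - (lambda + k_purif + k_leak) * N.
    k_purif and k_leak are illustrative removal rates (1/s); the default
    decay constant is tritium's, ~1.78e-9 1/s (12.3-year half-life)."""
    return N + dt * (source - (decay + k_purif + k_leak) * N)
```

    Setting the source equal to the total removal rate times the inventory gives a steady state, the balance point such an analysis looks for in each loop of the plant.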

  7. High-speed, high-precision thermal printing heads for bar code printers; Kosoku koseisai bar cord printer yo TPH

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Thermal printing heads (TPHs) with the world's first resolution of 24 dots/mm and a printing speed of 254 mm/s have been developed. The high-precision, high-durability TPH is realized by combining the company's high-precision techniques with techniques for a high power-resistant film structure and a highly wear-resistant protective film. At the same time, a structure of high thermal conductivity and thermal efficiency is adopted to control heat accumulation and realize high-quality images. With a standardized recording width of A6 size and resolutions of 8 to 24 dots/mm, the heads are expected to find wide use in various areas, centered on the distribution industry, e.g., for bar code label printers and for name plate, postal card and name card printing. (translated by NEDO)

  8. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01


  9. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    Science.gov (United States)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
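    The semi-Lagrangian idea can be sketched for constant-coefficient advection on a periodic grid: trace each node back along the characteristic and interpolate at the departure point. Linear interpolation is used here for brevity; the paper instead projects onto a local discontinuous Galerkin basis:

```python
import numpy as np

def semi_lagrangian_step(u, a, dt, dx):
    """One semi-Lagrangian step for u_t + a u_x = 0 on a periodic grid:
    u^{n+1}(x_i) = u^n(x_i - a dt), evaluated by linear interpolation
    between the two grid points bracketing the departure point."""
    n = u.size
    x_dep = (np.arange(n) - a * dt / dx) % n      # departure points, in index units
    i0 = np.floor(x_dep).astype(int)              # left neighbour of each departure point
    theta = x_dep - i0                            # fractional position in the cell
    return (1.0 - theta) * u[i0 % n] + theta * u[(i0 + 1) % n]
```

    Because the update only reads the solution at the foot of each characteristic, the scheme is not bound by a CFL stability limit, which is the property the parallelization study above exploits.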

  10. High Performance P3M N-body code: CUBEP3M

    CERN Document Server

    Harnois-Deraps, Joachim; Iliev, Ilian T; Merz, Hugh; Emberson, J D; Desjacques, Vincent

    2012-01-01

    This paper presents CUBEP3M, a high performance, publicly available, cosmological N-body code, and describes many utilities and extensions that have been added to the standard package, including a runtime halo finder, a non-Gaussian initial conditions generator, tuneable accuracy, and a system of unique particle identification. CUBEP3M is fast, has a memory footprint up to three times lower than other widely used N-body codes, and has been run on up to 20,000 cores, achieving close to ideal weak scaling even at this problem size. It is well suited to, and has already been used for, a broad number of science applications that require either large samples of non-linear realizations or very large dark matter N-body simulations, including cosmological reionization, baryonic acoustic oscillations, weak lensing and non-Gaussian statistics. We discuss the structure, the accuracy, known systematic effects, and the scaling performance of the code and its utilities, where applicable.
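    The particle-mesh half of a P3M code starts from a mass assignment such as cloud-in-cell, sketched here in 1D; a 3D code like CUBEP3M shares each particle's mass among eight neighbouring cells instead of two:

```python
import numpy as np

def cic_deposit(positions, masses, n_grid):
    """1D cloud-in-cell mass assignment on a periodic grid of spacing 1:
    each particle's mass is shared between its two nearest grid points
    with linear weights, conserving total mass exactly."""
    rho = np.zeros(n_grid)
    for x, m in zip(positions, masses):
        i = int(np.floor(x))               # left grid point
        f = x - i                          # fractional distance to it
        rho[i % n_grid] += m * (1.0 - f)   # weight to the left point
        rho[(i + 1) % n_grid] += m * f     # weight to the right point
    return rho
```

    The gridded density is then Fourier-transformed to solve for the long-range force, while the short-range particle-particle correction is applied directly between nearby pairs.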

  11. Enhanced UK teletext: Experimental equipment for high-quality picture coding and other enhancements

    Science.gov (United States)

    Riley, J. L.

    1983-07-01

    The construction of a pair of microcomputer-based units, which serve as a research tool for engineering teletext enhancements, is described; one unit acts as a transmitter and the other as a receiver in a closed-circuit teletext transmission. The microcomputer system and frame store design is basically similar to that currently used by Logica in the Flair electronic graphics equipment. Considerable effort was devoted to the development of software handling. The units are equipped with the CP/M operating system, which is already widely known; this greatly simplifies the management of files and provides compiling routines and a software debugging tool. Software is being prepared in PASCAL. Routines were developed to receive, generate, edit and transmit teletext in its present form. Early attention was given to demonstrating that picture teletext is feasible through a crudely-coded slow-scan television system. An acceptable sampling structure for these pictures is addressed, as is optimizing the coding of data to fit in with a hierarchy of coding embracing other aspects of enhanced teletext, for example geometric drawing, electronic painting and telesoftware. A high-quality character font was incorporated to improve the display of text.

  12. A Benchmarking Study of High Energy Carbon Ion Induced Neutron Using Several Monte Carlo Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Oh, J. H.; Jung, N. S.; Lee, H. S. [Pohang Accelerator Laboratory, Pohang (Korea, Republic of); Shin, Y. S.; Kwon, D. Y.; Kim, Y. M. [Catholic Univ., Gyeongsan (Korea, Republic of); Oranj, L. Mokhtari [POSTECH, Pohang (Korea, Republic of)

    2014-10-15

    In this study, a benchmarking study was performed for a representative particle interaction of heavy ion accelerators, the carbon-induced reaction. The secondary neutron is an important particle in shielding analysis for defining the source term and the penetration ability of radiation fields. The performance of the selected Monte Carlo codes was verified: MCNPX 2.7, PHITS 2.64 and FLUKA 2011.2b.6. For this benchmarking study, the experimental data of Kurosawa et al. in the SINBAD database of the NEA were applied. The calculated results for the differential neutron yield produced from several materials irradiated by a high energy carbon beam reproduced the experimental data well, with small uncertainty. However, the MCNPX results showed a large discrepancy with the experimental data, especially at forward angles. The calculated results were a little lower than the experimental ones, and this was clearest at lower incident carbon energies, for thinner targets and at forward angles. As expected, the influence of the different physics models was found clearly in the forward direction. In shielding analysis, these characteristics of each Monte Carlo code should be considered and utilized to determine the safety margin of a shield thickness.

  13. Verification of high-energy transport codes on the basis of activation data

    CERN Document Server

    Titarenko, Yu E; Butko, M A; Dikarev, D V; Florya, S N; Pavlov, K V; Titarenko, A Yu; Tikhonov, R S; Zhivun, V M; Ignatyuk, A V; Mashnik, S G; Boudard, A; Leray, S; David, J -C; Cugnon, J; Mancusi, D; Yariv, Y; Kumawat, H; Nishihara, K; Matsuda, N; Mank, G; Gudowski, W

    2011-01-01

    Nuclide production cross sections measured at ITEP for targets of nat-Cr, 56-Fe, nat-Ni, 93-Nb, 181-Ta, nat-W, nat-Pb and 209-Bi irradiated by protons with energies from 40 to 2600 MeV were used to estimate the predictive accuracy of several popular high-energy transport codes. A general agreement of the ITEP data with the data obtained by other groups, including the numerous GSI data measured by the inverse kinematics method, was found. Simulations of the measured data were performed with the MCNPX (Bertini and ISABEL options), CEM03.02, INCL4.2+ABLA, INCL4.5+ABLA07, PHITS, and CASCADE.07 codes. Deviation factors between the calculated and experimental cross sections were estimated for each target and for the whole energy range covered by our measurements. Two-dimensional diagrams of deviation factor values were produced for estimating the predictive power of every code for intermediate, unmeasured target masses and proton bombarding energies. Further improvements of all the codes tested here are suggested.
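    A common form of such a deviation factor is the mean squared logarithmic deviation, F = 10^sqrt(<(log10(calc/exp))^2>), which equals 1 for perfect agreement and grows symmetrically for over- and under-prediction; the exact statistic used in the paper may differ in detail:

```python
import math

def deviation_factor(calc, exp):
    """Mean squared deviation factor F = 10**sqrt(<(log10(calc/exp))**2>),
    a common figure of merit for comparing simulated and measured cross
    sections.  F = 1 means perfect agreement; F = 2 means the calculation
    is off by a factor of 2 on average (in either direction)."""
    s = sum(math.log10(c / e) ** 2 for c, e in zip(calc, exp)) / len(calc)
    return 10.0 ** math.sqrt(s)
```

    Working in log space treats a factor-of-2 over-prediction and a factor-of-2 under-prediction identically, which is the natural metric for cross sections spanning many orders of magnitude.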

  14. New high burnup fuel models for NRC's licensing audit code, FRAPCON

    Energy Technology Data Exchange (ETDEWEB)

    Lanning, D.D.; Beyer, C.E.; Painter, C.L. [Pacific Northwest Laboratory, Richland, WA (United States)

    1996-03-01

    Fuel behavior models have recently been updated within the U.S. Nuclear Regulatory Commission steady-state FRAPCON code used for auditing of fuel vendor/utility codes and analyses. These modeling updates have concentrated on providing a best-estimate prediction of steady-state fuel behavior up to the maximum burnup levels of current data (60 to 65 GWd/MTU rod-average). A decade has passed since these models were last updated. Currently, some U.S. utilities and fuel vendors are requesting approval for rod-average burnups greater than 60 GWd/MTU; however, until these recent updates the NRC did not have valid fuel performance models at these higher burnup levels. Pacific Northwest Laboratory (PNL) has reviewed 15 separate-effects models within the FRAPCON fuel performance code (References 1 and 2) and identified nine models that needed updating for improved prediction of fuel behavior at high burnup levels. The six separate-effects models not updated were the cladding thermal properties, cladding thermal expansion, cladding creepdown, fuel specific heat, fuel thermal expansion and open gap conductance. Comparison of these models to the currently available data indicates that they still adequately predict the data within data uncertainties. The nine models identified as needing improvement for predicting high-burnup behavior are fission gas release (FGR), fuel thermal conductivity (accounting for both high burnup effects and burnable poison additions), fuel swelling, fuel relocation, radial power distribution, fuel-cladding contact gap conductance, cladding corrosion, cladding mechanical properties and cladding axial growth. Each of the updated models will be described in the following sections and the model predictions will be compared to currently available high burnup data.

  15. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is

  16. Microcollinearity in an ethylene receptor coding gene region of the Coffea canephora genome is extensively conserved with Vitis vinifera and other distant dicotyledonous sequenced genomes

    Directory of Open Access Journals (Sweden)

    Campa Claudine

    2009-02-01

    Abstract Background Coffea canephora, also called Robusta, belongs to the Rubiaceae, the fourth largest angiosperm family. This diploid species (2x = 2n = 22) has a fairly small genome size of ≈ 690 Mb and, despite its extreme economic importance, particularly for developing countries, knowledge of its genome composition, structure and evolution remains very limited. Here, we report the 160 kb of the first C. canephora Bacterial Artificial Chromosome (BAC) clone ever sequenced and its fine analysis. Results This clone contains the CcEIN4 gene, encoding an ethylene receptor, and twenty other predicted genes, showing a high gene density of one gene per 7.8 kb. Most of them display perfect matches with C. canephora expressed sequence tags or show transcriptional activity through PCR amplifications on cDNA libraries. Twenty-three transposable elements, mainly Class II transposon derivatives, were identified at this locus. Most of these Class II elements are Miniature Inverted-repeat Transposable Elements (MITEs), known to be closely associated with plant genes. This BAC composition gives a pattern similar to those found in gene-rich regions of the Solanum lycopersicum and Medicago truncatula genomes, indicating that the CcEIN4 region may belong to a gene-rich region in the C. canephora genome. Comparative sequence analysis indicated an extensive conservation between C. canephora and most of the reference dicotyledonous genomes studied in this work, such as tomato (S. lycopersicum), grapevine (V. vinifera), barrel medic (M. truncatula), black cottonwood (Populus trichocarpa) and Arabidopsis thaliana. The highest degree of microcollinearity was found between C. canephora and V. vinifera, which belong respectively to the Asterids and Rosids, two clades that diverged more than 114 million years ago. Conclusion This study provides a first glimpse of C. canephora genome composition and evolution. Our data revealed a remarkable conservation of the microcollinearity

  17. Water Wisdom: 23 Stand-Alone Activities on Water Supply and Water Conservation for High School Students. 2nd Edition.

    Science.gov (United States)

    Massachusetts State Water Resources Authority, Boston.

    This water conservation education program for high schools consists of both stand-alone activities and teacher support materials. Lessons are divided into six broad categories: (1) The Water Cycle; (2) Water and Society; (3) Keeping Water Pure; (4) Visualizing Volumes; (5) The Economics of Water Use; and (6) Domestic Water Conservation. The…

  18. Species Richness and Community Structure on a High Latitude Reef: Implications for Conservation and Management

    Directory of Open Access Journals (Sweden)

    Wayne Houston

    2011-07-01

    In spite of the wealth of research on the Great Barrier Reef, few detailed biodiversity assessments of its inshore coral communities have been conducted. Effective conservation and management of marine ecosystems begins with fine-scale biophysical assessments focused on diversity and the architectural species that build the structural framework of the reef. In this study, we investigate key coral diversity and environmental attributes of an inshore reef system surrounding the Keppel Bay Islands near Rockhampton in Central Queensland, Australia, and assess their implications for conservation and management. The Keppels has much higher coral diversity than previously found. The average species richness for the 19 study sites was ~40, with representatives from 68% of the ~244 species previously described for the southern Great Barrier Reef. Using scleractinian coral species richness, taxonomic distinctiveness and coral cover as the main criteria, we found that five out of 19 sites had particularly high conservation value. A further site was also considered to be of relatively high value. Corals at this site were taxonomically distinct from the others (representatives of two families were found here but not at other sites) and a wide range of functionally diverse taxa were present. This site was associated with more stressful conditions such as high temperatures and turbidity. Highly diverse coral communities or biodiversity ‘hotspots’ and taxonomically distinct reefs may act as insurance policies for climatic disturbance, much like Noah’s Arks for reefs. While improving water quality and limiting anthropogenic impacts are clearly important management initiatives to improve the long-term outlook for inshore reefs, identifying, mapping and protecting these coastal ‘refugia’ may be the key for ensuring their regeneration against catastrophic climatic disturbance in the meantime.

  19. National-scale analysis for the identification of High Conservation Value Forests (HCVFs

    Directory of Open Access Journals (Sweden)

    Maesano M

    2011-02-01

    In Italy, forests cover about one third of the national territory. In recent years, sustainability has been applied to forest management through the introduction of the Sustainable Forest Management (SFM) concept. Since the Rio Conference, several initiatives at the international and governmental level have aimed to realize the SFM concept through the establishment of a set of principles with general validity. One of the most successful initiatives is the Forest Stewardship Council (FSC), which has developed a voluntary certification system specific to the forestry sector, as well as 10 principles and 56 criteria for good forest management. The concept of High Conservation Value Forests (HCVFs) was defined in 1999 by the FSC under Principle 9, and its application requires the identification of six categories of High Conservation Values (HCVs). The aim of this study was to define the parameters for HCVFs in Italian forests. A first national mapping for the first level of High Conservation Value was developed, focusing on protected areas, threatened and endangered species and the temporal use of ecosystems. Protected areas may constitute the basis of SFM. This work is the result of data processing and distribution analysis through the intersection of vector data of national forest areas in ArcMap, on the basis of available information. Protected forest areas represent 34% of the national forest area. The different categories of protected areas contribute differently to protection; in particular, the largest amount of preserved forest (22.96%) falls within Sites of Community Importance (SCI). The analysis of highly protected forest types revealed major differences, likely linked to site ecological conditions, which are extremely variable over the country. The HCVF concept is applied in the forest certification field and can be used in sustainable forest management, planning and land use, and policy commitments.

  20. Fundamental algorithm and computational codes for the light beam propagation in high power laser system

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The fundamental algorithm of light beam propagation in high power laser systems is investigated and the corresponding computational codes are given. It is shown that the number of modulation rings due to diffraction is related to the size of the pinhole in the spatial filter (expressed in times of the diffraction limit, i.e. TDL) and the Fresnel number of the laser system; for a complex laser system with multiple spatial filters and free space, the system can be investigated by the reciprocal rule of operators.
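The Fresnel number mentioned above is the standard dimensionless parameter of such propagation models. As an illustration, the textbook definition for a circular aperture (this is general optics background, not code from the paper, and the numbers below are hypothetical):

```python
def fresnel_number(aperture_radius_m, wavelength_m, distance_m):
    # N_F = a^2 / (lambda * L): large N_F means near-field propagation
    # with many diffraction modulation rings; small N_F means far field.
    return aperture_radius_m ** 2 / (wavelength_m * distance_m)

# hypothetical example: a 10 mm radius beam at 1053 nm over 10 m
print(fresnel_number(10e-3, 1053e-9, 10.0))  # ≈ 9.5
```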

  1. Fisheries conservation on the high seas: linking conservation physiology and fisheries ecology for the management of large pelagic fishes.

    Science.gov (United States)

    Horodysky, Andrij Z; Cooke, Steven J; Graves, John E; Brill, Richard W

    2016-01-01

    Populations of tunas, billfishes and pelagic sharks are fished at or over capacity in many regions of the world. They are captured by directed commercial and recreational fisheries (the latter of which often promote catch and release) or as incidental catch or bycatch in commercial fisheries. Population assessments of pelagic fishes typically incorporate catch-per-unit-effort time-series data from commercial and recreational fisheries; however, there have been notable changes in target species, areas fished and depth-specific gear deployments over the years that may have affected catchability. Some regional fisheries management organizations take into account the effects of time- and area-specific changes in the behaviours of fish and fishers, as well as fishing gear, to standardize catch-per-unit-effort indices and refine population estimates. However, estimates of changes in stock size over time may be very sensitive to underlying assumptions of the effects of oceanographic conditions and prey distribution on the horizontal and vertical movement patterns and distribution of pelagic fishes. Effective management and successful conservation of pelagic fishes requires a mechanistic understanding of their physiological and behavioural responses to environmental variability, potential for interaction with commercial and recreational fishing gear, and the capture process. The interdisciplinary field of conservation physiology can provide insights into pelagic fish demography and ecology (including environmental relationships and interspecific interactions) by uniting the complementary expertise and skills of fish physiologists and fisheries scientists. The iterative testing by one discipline of hypotheses generated by the other can span the fundamental-applied science continuum, leading to the development of robust insights supporting informed management. The resulting species-specific understanding of physiological abilities and tolerances can help to improve stock

  2. Holographic codes

    CERN Document Server

    Latorre, Jose I

    2015-01-01

    There exists a remarkable four-qutrit state that carries absolute maximal entanglement in all its partitions. Employing this state, we construct a tensor network that delivers a holographic many-body state, the H-code, where the physical properties of the boundary determine those of the bulk. This H-code is made of an even superposition of states whose relative Hamming distances are exponentially large with the size of the boundary. This property makes H-codes natural states for a quantum memory. H-codes exist on tori of definite sizes and are classified into three different sectors characterized by the sum of their qutrits on cycles wrapped through the boundaries of the system. We construct a parent Hamiltonian for the H-code, which is highly non-local, and finally we compute the topological entanglement entropy of the H-code.

  3. The Rice coding algorithm achieves high-performance lossless and progressive image compression based on the improving of integer lifting scheme Rice coding algorithm

    Science.gov (United States)

    Jun, Xie Cheng; Su, Yan; Wei, Zhang

    2006-08-01

    In this paper, a modified algorithm is introduced to improve the Rice coding algorithm, and research on image compression with the CDF (2,2) wavelet lifting scheme is presented. Our experiments show that its lossless image compression performance is much better than Huffman, Zip, lossless JPEG and RAR, and slightly better than (or equal to) the well-known SPIHT: the lossless compression rate is improved by about 60.4%, 45%, 26.2%, 16.7% and 0.4% on average, respectively. The encoder is about 11.8 times faster than SPIHT's and its time efficiency can be improved by 162%; the decoder is about 12.3 times faster than SPIHT's and its time efficiency can be raised by about 148%. Rather than requiring the largest possible number of wavelet transform levels, this algorithm has high coding efficiency whenever the number of wavelet transform levels is larger than 3. For source models with distributions similar to the Laplacian, it can improve coding efficiency and realize progressive transmission coding and decoding.
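For reference, the baseline Rice code that the paper's modified algorithm builds on splits a non-negative integer on a parameter k: the quotient is emitted in unary, the remainder as a fixed k-bit binary field. A textbook sketch for k ≥ 1 (this is the standard scheme, not the paper's improved variant):

```python
def rice_encode(value, k):
    # quotient in unary (q ones followed by a terminating zero),
    # then the remainder as a fixed k-bit binary field
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

print(rice_encode(9, 2))  # quotient 2, remainder 1 → "110" + "01" = "11001"
```

Rice codes are optimal for geometrically distributed residuals, which is why they pair well with the Laplacian-like coefficients produced by wavelet lifting.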

  4. A highly conserved repeated chromosomal sequence in the radioresistant bacterium Deinococcus radiodurans SARK.

    Science.gov (United States)

    Lennon, E; Gutman, P D; Yao, H L; Minton, K W

    1991-03-01

    A DNA fragment containing a portion of a DNA damage-inducible gene from Deinococcus radiodurans SARK hybridized to numerous fragments of SARK genomic DNA because of a highly conserved repetitive chromosomal element. The element is of variable length, ranging from 150 to 192 bp, depending on the absence or presence of one or two 21-bp sequences located internally. A putative translational start site of the damage-inducible gene is within the reiterated element. The element contains dyad symmetries that suggest modes of transcriptional and/or translational control.
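The dyad symmetries mentioned above are sequences equal to their own reverse complement, letting the DNA (or its transcript) fold into hairpin structures that can modulate transcription or translation. A minimal checker, purely illustrative:

```python
def is_dyad(seq):
    # a dyad symmetry reads the same as its reverse complement,
    # e.g. the EcoRI site GAATTC
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return seq == "".join(comp[b] for b in reversed(seq))

print(is_dyad("GAATTC"))  # → True
print(is_dyad("GATTC"))   # → False
```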

  5. DCHAIN-SP 2001: High energy particle induced radioactivity calculation code

    Energy Technology Data Exchange (ETDEWEB)

    Kai, Tetsuya; Maekawa, Fujio; Kasugai, Yoshimi; Takada, Hiroshi; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kosako, Kazuaki [Sumitomo Atomic Energy Industries, Ltd., Tokyo (Japan)

    2001-03-01

    For the purpose of contributing to safety design calculations for induced radioactivities in the JAERI/KEK high-intensity proton accelerator project facilities, DCHAIN-SP, which calculates high energy particle induced radioactivity, has been updated to DCHAIN-SP 2001. The following three items were improved: (1) Fission yield data are included so that the code can be applied to experimental facility design for nuclear transmutation of long-lived radioactive waste, where fissionable materials are treated. (2) Activation cross section data below 20 MeV are revised. In particular, attention is paid to cross section data of materials closely related to the facilities, i.e., mercury, lead and bismuth, and to tritium production cross sections, which are important for the safety of the facilities. (3) The user interface for input/output data has been refined to perform calculations more efficiently than in the previous version. Information needed for use of the code is attached in the Appendices: the DCHAIN-SP 2001 manual, the procedures for installation and execution of DCHAIN-SP, and sample problems. (author)

  6. High Hardware Utilization and Low Memory Block Requirement Decoding of QC-LDPC Codes

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ling; LIU Rongke; HOU Yi; ZHANG Xiaolin

    2012-01-01

    This paper presents a simple yet effective decoding scheme for general quasi-cyclic low-density parity-check (QC-LDPC) codes, which not only achieves high hardware utility efficiency (HUE), but also brings about a great reduction in memory blocks without any performance degradation. The main idea is to split the check matrix into several row blocks, then to perform the improved message passing computations sequentially, block by block. With the improved decoding algorithm, the sequential tie between the two-phase computations is broken, so that the two phases can be overlapped, which brings high HUE. Two overlapping schemes are also presented, each of which suits a different situation. In addition, an efficient memory arrangement scheme is proposed to reduce the large memory block requirement of the LDPC decoder. As an example, for the rate-0.4 LDPC code selected from Chinese Digital TV Terrestrial Broadcasting (DTTB), our decoding saves over 80% of memory blocks compared with the conventional decoding, and the decoder achieves 0.97 HUE. Finally, the rate-0.4 LDPC decoder is implemented on an FPGA device EP2S30 (speed grade -5). Using 8 row processing units, the decoder can achieve a maximum net throughput of 28.5 Mbps at 20 iterations.
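The per-check-node arithmetic inside the two-phase message passing is typically the min-sum rule: each outgoing message takes the minimum magnitude and the product of signs of the other incoming messages. A toy sketch of that rule (the block splitting and overlapped scheduling are the paper's contribution and are not shown here):

```python
def min_sum_check_update(msgs):
    # for edge i: sign = product of the other edges' signs,
    # magnitude = minimum of the other edges' magnitudes
    out = []
    for i in range(len(msgs)):
        others = msgs[:i] + msgs[i + 1:]
        sign = 1.0
        for m in others:
            sign = sign if m >= 0 else -sign
        out.append(sign * min(abs(m) for m in others))
    return out

print(min_sum_check_update([2.0, -1.5, 0.5]))  # → [-0.5, 0.5, -1.5]
```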

  7. Acoustic radiation force impulse (ARFI) imaging of zebrafish embryo by high-frequency coded excitation sequence.

    Science.gov (United States)

    Park, Jinhyoung; Lee, Jungwoo; Lau, Sien Ting; Lee, Changyang; Huang, Ying; Lien, Ching-Ling; Kirk Shung, K

    2012-04-01

    Acoustic radiation force impulse (ARFI) imaging has been developed as a non-invasive method for quantitative illustration of tissue stiffness or displacement. Conventional ARFI imaging (2-10 MHz) has been implemented in commercial scanners for illustrating elastic properties of several organs. The image resolution, however, is too coarse to study mechanical properties of micro-sized objects such as cells. This article thus presents a high-frequency coded excitation ARFI technique, with the ultimate goal of displaying elastic characteristics of cellular structures. Tissue mimicking phantoms and zebrafish embryos are imaged with a 100-MHz lithium niobate (LiNbO₃) transducer, by cross-correlating tracked RF echoes with the reference. The phantom results show that the contrast of ARFI image (14 dB) with coded excitation is better than that of the conventional ARFI image (9 dB). The depths of penetration are 2.6 and 2.2 mm, respectively. The stiffness data of the zebrafish demonstrate that the envelope is harder than the embryo region. The temporal displacement change at the embryo and the chorion is as large as 36 and 3.6 μm. Consequently, this high-frequency ARFI approach may serve as a remote palpation imaging tool that reveals viscoelastic properties of small biological samples.
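The displacement estimation step described above (cross-correlating tracked RF echoes with a reference) can be sketched with numpy. Real ARFI systems add sub-sample interpolation and convert sample lags to micrometres via the sound speed and sampling rate; the signals below are synthetic:

```python
import numpy as np

def displacement_samples(ref, tracked):
    # the lag of the cross-correlation peak is the axial shift
    # of the tracked echo relative to the reference
    xc = np.correlate(tracked, ref, mode="full")
    return int(np.argmax(xc)) - (len(ref) - 1)

ref = np.sin(np.linspace(0, 20 * np.pi, 400))  # synthetic RF line
shifted = np.roll(ref, 5)                      # simulate a 5-sample push
print(displacement_samples(ref, shifted))      # → 5
```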

  8. Development of GAMMA Code and Evaluation for a Very High Temperature gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Chang H; Lim, H.S.; Kim, E.S.; NO, H.C.

    2007-06-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR’s higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. This paper will also describe the improvements to be made in the GAMMA code for the VHTR.

  9. Application of Gamma code coupled with turbomachinery models for high temperature gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Chang Oh

    2008-02-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR’s higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of a toxic gas, CO, and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. The GAMMA code is being developed to implement turbomachinery models in the power conversion unit (PCU) and ultimately models associated with the hydrogen plant. Some preliminary results will be described in this paper.

  10. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Abstract Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they code for a protein, they generally escape detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like the one proposed here are likely to become increasingly powerful at detecting such elements.
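The entropy score mentioned above can be illustrated at its simplest as the Shannon entropy of the codons observed in one alignment column (the paper computes its posterior under a codon substitution model; this toy version ignores phylogeny entirely):

```python
import math
from collections import Counter

def codon_column_entropy(codons):
    # Shannon entropy (bits) of one alignment column of codons;
    # 0 bits = perfectly conserved, higher = more variable
    counts = Counter(codons)
    n = len(codons)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(codon_column_entropy(["ATG", "ATG", "ATG", "ATG"]))  # → 0.0
print(codon_column_entropy(["ATG", "ATA", "ATG", "ATA"]))  # → 1.0
```

Columns whose entropy stays unexpectedly low even after accounting for protein-level constraint are the candidates for an extra, non-coding selective pressure.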

  11. Skinks (Reptilia: Scincidae) have highly conserved karyotypes as revealed by chromosome painting.

    Science.gov (United States)

    Giovannotti, M; Caputo, V; O'Brien, P C M; Lovell, F L; Trifonov, V; Cerioni, P Nisi; Olmo, E; Ferguson-Smith, M A; Rens, W

    2009-01-01

    Skinks represent the most diversified squamate reptiles, with great variation in body size and form, and are found worldwide in a variety of habitats. Their remarkable diversification has been accompanied by only a few chromosome rearrangements, resulting in highly conserved chromosomal complements in these lizards. In this study, cross-species chromosome painting using Scincus scincus (2n = 32) as the source genome was used to detect the chromosomal rearrangements and homologies between the following skinks: Chalcides chalcides (2n = 28), C. ocellatus (2n = 28), Eumeces schneideri (2n = 32), Lepidothyris fernandi (2n = 30), and Mabuya quinquetaeniata (2n = 32). The results of this study confirmed a high degree of chromosome conservation between these species. The main rearrangements in the studied skinks involve chromosomes 3, 5, 6 and 7 of S. scincus. These subtelocentric chromosomes are homologous to the p and q arms of metacentric pairs 3 and 4 in C. chalcides, C. ocellatus, L. fernandi, and M. quinquetaeniata, while they are entirely conserved in E. schneideri. Other rearrangements involve S. scincus 11 in L. fernandi and M. quinquetaeniata, supporting the monophyly of Lygosominae, and one of the chromosomes S. scincus 12-16 in M. quinquetaeniata. In conclusion, our data support the monophyly of Scincidae and confirm that Scincus-Eumeces plus Chalcides do not form a monophyletic clade, suggesting that the Scincus-Eumeces clade is basal to the other members of this family. This study represents the first time the whole genome of any reptile species has been used for cross-species chromosome painting to assess chromosomal evolution in this group of vertebrates.

  12. Entropy stable high order discontinuous Galerkin methods with suitable quadrature rules for hyperbolic conservation laws

    Science.gov (United States)

    Chen, Tianheng; Shu, Chi-Wang

    2017-09-01

    It is well known that semi-discrete high order discontinuous Galerkin (DG) methods satisfy cell entropy inequalities for the square entropy for both scalar conservation laws (Jiang and Shu (1994) [39]) and symmetric hyperbolic systems (Hou and Liu (2007) [36]), in any space dimension and for any triangulations. However, this property holds only for the square entropy and the integrations in the DG methods must be exact. It is significantly more difficult to design DG methods to satisfy entropy inequalities for a non-square convex entropy, and/or when the integration is approximated by a numerical quadrature. In this paper, we develop a unified framework for designing high order DG methods which will satisfy entropy inequalities for any given single convex entropy, through suitable numerical quadrature which is specific to this given entropy. Our framework applies from one-dimensional scalar cases all the way to multi-dimensional systems of conservation laws. For the one-dimensional case, our numerical quadrature is based on the methodology established in Carpenter et al. (2014) [5] and Gassner (2013) [19]. The main ingredients are summation-by-parts (SBP) operators derived from Legendre Gauss-Lobatto quadrature, the entropy conservative flux within elements, and the entropy stable flux at element interfaces. We then generalize the scheme to two-dimensional triangular meshes by constructing SBP operators on triangles based on a special quadrature rule. A local discontinuous Galerkin (LDG) type treatment is also incorporated to achieve the generalization to convection-diffusion equations. Extensive numerical experiments are performed to validate the accuracy and shock capturing efficacy of these entropy stable DG methods.
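The "entropy conservative flux" is the key building block of the framework above. For the simplest concrete case, Burgers' equation with the square entropy, Tadmor's two-point flux and its defining condition can be checked directly (a standard textbook example, not the paper's general construction for arbitrary convex entropies):

```python
def ec_flux(ul, ur):
    # Tadmor's entropy conservative flux for Burgers' flux f(u) = u^2/2
    return (ul * ul + ul * ur + ur * ur) / 6.0

def tadmor_residual(ul, ur):
    # the flux is entropy conservative iff (v_L - v_R) * f* = psi_L - psi_R,
    # with entropy variable v = u and entropy potential psi = u^3 / 6
    lhs = (ul - ur) * ec_flux(ul, ur)
    rhs = ul ** 3 / 6.0 - ur ** 3 / 6.0
    return abs(lhs - rhs)

print(tadmor_residual(1.7, -0.4))  # ≈ 0 up to round-off
```

Interface dissipation is then added on top of such a flux to obtain the entropy *stable* variant used at element boundaries.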

  13. Low-Intensity Agricultural Landscapes in Transylvania Support High Butterfly Diversity: Implications for Conservation

    Science.gov (United States)

    Loos, Jacqueline; Dorresteijn, Ine; Hanspach, Jan; Fust, Pascal; Rakosy, László; Fischer, Joern

    2014-01-01

    European farmland biodiversity is declining due to land use changes towards agricultural intensification or abandonment. Some Eastern European farming systems have sustained traditional forms of use, resulting in high levels of biodiversity. However, global markets and international policies now imply rapid and major changes to these systems. To effectively protect farmland biodiversity, it is crucial to understand the landscape features that underpin species diversity. Focusing on butterflies, we addressed this question for a cultural-historic landscape in Southern Transylvania, Romania. Following a natural experiment, we randomly selected 120 survey sites in farmland, 60 each in grassland and arable land. We surveyed butterfly species richness and abundance by walking transects, with four repeats in summer 2012. We analysed species composition using Detrended Correspondence Analysis. We modelled species richness, richness of functional groups, and abundance of selected species in response to topography, woody vegetation cover and heterogeneity at three spatial scales, using generalised linear mixed effects models. Species composition widely overlapped in grassland and arable land. Composition changed along gradients of heterogeneity at local and context scales, and of woody vegetation cover at context and landscape scales. The effect of local heterogeneity on species richness was positive in arable land, but negative in grassland. Plant species richness, and structural and topographic conditions at multiple scales, explained species richness, richness of functional groups and species abundances. Our study revealed the high conservation value of both grassland and arable land in low-intensity Eastern European farmland. Besides grassland, heterogeneous arable land also provides important habitat for butterflies. While butterfly diversity in arable land benefits from the heterogeneity created by small-scale structures, grasslands should be protected from fragmentation to provide…

  14. Human cytomegalovirus UL145 gene is highly conserved among clinical strains

    Indian Academy of Sciences (India)

    Zhengrong Sun; Ying Lu; Qiang Ruan; Yaohua Ji; Rong He; Ying Qi; Yanping Ma; Yujing Huang

    2007-09-01

    Human cytomegalovirus (HCMV), a ubiquitous human pathogen, is the leading cause of birth defects in newborns. A region (referred to as UL/b′), present in the Toledo strain of HCMV and in low-passage clinical isolates, contains 22 additional genes that are absent in the highly passaged laboratory strain AD169. One of these genes, the UL145 open reading frame (ORF), is located between the highly variable genes UL144 and UL146. To assess the structure of the UL145 gene, the UL145 ORF was amplified by PCR and sequenced from 16 low-passage clinical isolates and 15 unpassaged strains from suspected congenitally infected infants. Nine UL145 sequences previously published in GenBank were used for sequence comparison. The identities of the gene and the similarities of its putative protein among all strains were 95.9–100% and 96.6–100%, respectively. The post-translational modification motifs of the UL145 putative protein in clinical strains were conserved, comprising the protein kinase C (PKC) phosphorylation motif and the casein kinase II (CK-II) phosphorylation site. We conclude that the structure of the UL145 gene and its putative protein are relatively conserved among clinical strains, irrespective of whether the strains came from patients with different manifestations or from different areas of the world, or whether they were passaged in human embryonic lung fibroblast (HELF) cells.

  15. Identification, expression, and characterization of the highly conserved D-xylose isomerase in animals

    Institute of Scientific and Technical Information of China (English)

    Ming Ding; Yigang Teng; Qiuyu Yin; Wei Chen; Fukun Zhao

    2009-01-01

    D-xylose is a necessary sugar for animals. The xylanase from a mollusk, Ampullaria crossean, was previously reported by our laboratory. This xylanase can degrade xylan into D-xylose, but a gap remains in our knowledge of its metabolic pathway: how does xylose enter the pentose phosphate pathway? With the help of genomic databases and bioinformatic tools, we found that some animals, like bacteria, possess a highly conserved D-xylose isomerase (EC 5.3.1.5). The xylose isomerase from a sea squirt, Ciona intestinalis, was heterologously expressed in Escherichia coli and purified to confirm its function. The recombinant enzyme had good thermal stability in the presence of Mg2+. At the optimum temperature and pH, its specific activity on D-xylose was 0.331 μmol/mg/min. This enzyme exists broadly in many animals, but it is absent from the genomes of amphibians such as Xenopus laevis. Its sequence is highly conserved. The xylose isomerases from animals are very interesting proteins for the study of evolution.

  16. Effects of the Conservation Reserve Program on Hydrologic Processes in the Southern High Plains

    Science.gov (United States)

    Haacker, E. M.; Smidt, S. J.; Kendall, A. D.; Basso, B.; Hyndman, D. W.

    2015-12-01

    The Southern High Plains Aquifer is a rapidly depleting resource that supports agriculture in parts of New Mexico and the Texas Panhandle. The development of the aquifer has changed the landscape and the water cycle of the region. This study illustrates the evolving patterns of land use and the effects of cultivation, from irrigated and dryland farming to the countermanding influence of the Conservation Reserve Program (CRP). Previous research indicates that greater recharge rates occur under cultivated land in the Southern High Plains than under unbroken soil: the transition to cultivation increases recharge under both dryland and irrigated management, though most recharge still occurs through playa lakes. The Conservation Reserve Program takes land out of crop production, replacing the land cover with vegetation closer to the natural ecosystem. This may decrease recharge below fields and reduce the runoff that feeds playa lakes; alternatively, CRP may help stabilize playa lakes, increasing recharge. Changes to the water cycle are investigated at the field scale using the System Approach to Land Use Sustainability (SALUS) crop model and at the regional scale with the Landscape Hydrology Model (LHM), and compared with historical data and water table elevations.

  17. On the Way to Future's High Energy Particle Physics Transport Code

    CERN Document Server

    Bíró, Gábor; Futó, Endre

    2015-01-01

    High Energy Physics (HEP) needs a huge amount of computing resources. In addition, data acquisition, transfer, and analysis require a well-developed infrastructure. Probing new physics requires increasing the luminosity of accelerator facilities, which produces more and more data in the experimental detectors. Both testing new theories and detector R&D are based on complex simulations. We have already reached the level where Monte Carlo detector simulation takes much more time than real data collection. This is why speeding up the calculations and simulations has become important in the HEP community. The Geant Vector Prototype (GeantV) project aims to optimize the most widely used particle transport code by applying parallel computing and exploiting the capabilities of modern CPU and GPU architectures. With maximized concurrency at multiple levels, GeantV is intended to be the successor of the Geant4 particle transport code, which has been used successfully for two decades…

  18. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    Science.gov (United States)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been exploited in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard-compliant Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also targeted maximal computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers), with parallelization based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: the documentation has been carefully taken into account and is built upon comprehensive comments placed directly in the source files (no external documentation files are needed); these comments are parsed by the free software doxygen, producing high-quality HTML and LaTeX documentation pages; the distributed versioning system referred to as git…

  19. Secure Communications in High Speed Fiber Optical Networks Using Code Division Multiple Access (CDMA) Transmission

    Energy Technology Data Exchange (ETDEWEB)

    Han, I; Bond, S; Welty, R; Du, Y; Yoo, S; Reinhardt, C; Behymer, E; Sperry, V; Kobayashi, N

    2004-02-12

    This project is focused on the development of advanced components and system technologies for secure data transmission on high-speed fiber optic data systems. This work capitalizes on (1) a strong relationship with outstanding faculty at the University of California-Davis who are experts in high speed fiber-optic networks, (2) the realization that code division multiple access (CDMA) is emerging as a bandwidth enhancing technique for fiber optic networks, (3) the realization that CDMA of sufficient complexity forms the basis for almost unbreakable one-time key transmissions, (4) our concepts for superior components for implementing CDMA, (5) our expertise in semiconductor device processing and (6) our Center for Nano and Microtechnology, which is where the majority of the experimental work was done. Here we present a novel device concept, which will push the limits of current technology and simultaneously solve system implementation issues by investigating new state-of-the-art fiber technologies. This will enable the development of secure communication systems for the transmission and reception of messages on deployed commercial fiber optic networks, through the CDMA phase encoding of broad bandwidth pulses. CDMA technology has been developed as a multiplexing technology, much like wavelength division multiplexing (WDM) or time division multiplexing (TDM), to increase the potential number of users on a given communication link. A novel application of the techniques created for CDMA is to generate secure communication through physical layer encoding. Physical layer encoding devices are developed which utilize semiconductor waveguides with fast carrier response times to phase encode spectral components of a secure signal. Current commercial technology, most commonly a spatial light modulator, allows phase codes to be changed at rates of only tens of hertz (~25 ms response). The use of fast (picosecond to nanosecond) carrier dynamics of semiconductors…
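
    The spectral phase-encoding principle behind this secure CDMA scheme can be illustrated numerically (an idealized sketch, not the semiconductor hardware described above; all signal parameters are invented for the demonstration): multiplying a pulse's spectrum by a pseudorandom binary phase code spreads it into a noise-like waveform, and only the matching code restores the original pulse.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
t = np.arange(n)

# Short transform-limited pulse (Gaussian envelope).
pulse = np.exp(-0.5 * ((t - n // 2) / 3.0) ** 2).astype(complex)

# Binary phase codes (0 or pi per spectral bin): the shared key and a wrong key.
key = rng.choice([1.0, -1.0], size=n)
wrong_key = rng.choice([1.0, -1.0], size=n)

# Apply a phase code to the spectrum of a signal.
encode = lambda x, c: np.fft.ifft(np.fft.fft(x) * c)

encoded = encode(pulse, key)          # transmitted: spread, low-peak waveform
decoded = encode(encoded, key)        # receiver with the key: code * code = 1
garbled = encode(encoded, wrong_key)  # eavesdropper: stays noise-like

peak = lambda x: float(np.max(np.abs(x)))
```

Because a ±1 phase code is its own inverse, decoding is just a second application of the same code; a mismatched key leaves the energy spread over the whole time window, which is the basis of the physical-layer security claim.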

  20. Developing High Quality Decision-Making Discussions about Biological Conservation in a Normal Classroom Setting

    Science.gov (United States)

    Grace, Marcus

    2009-01-01

    The conservation of biodiversity is an important socio-scientific issue that is often regarded as a precondition to sustainable development. The foundation for citizens' understanding of conservation issues can be laid down in formal school education. This research focuses on decision-making discussions about biological conservation issues among…

  1. Pedestal-type understructure and an electric heat recovery system make Boston office building high achiever in land use and energy conservation

    Energy Technology Data Exchange (ETDEWEB)

    1976-01-01

    Land use and energy conservation goals are responsible for the unusual pedestal base of Boston's new Fiduciary Trust Building, located in one of the city's oldest sections. The area surrounding South Station and designated for renewal is plagued with a complex traffic situation on the surface and an underground maze of utility lines, conduits and subway tunnels, all of which combined to make innovative construction necessary. Caissons that by-pass underground obstructions support vertical steelwork through a flaring three-story pedestal. Cantilevered trusses provide a platform for the tower floors and allow versatile use of space with esthetically pleasing lines. An original glass-face plan was replaced with 35 percent tinted glass, 20 percent insulation-backed glass, and 45 percent precast concrete in order to conserve energy. Individualized lighting control was accomplished by installing 1000 wall switches to control the 3000 fixtures. Standardized 35-watt four-foot fluorescent tubes and a heating, ventilation, and air conditioning (HVAC) system allow further energy savings. A stringent new building code is credited with inspiring many of the conservation features. Details of the construction and heat recovery systems include a design summary of the building's construction specifications. (DCK)

  2. Zero-tolerance biosecurity protects high-conservation-value island nature reserve.

    Science.gov (United States)

    Scott, John K; McKirdy, Simon J; van der Merwe, Johann; Green, Roy; Burbidge, Andrew A; Pickles, Greg; Hardie, Darryl C; Morris, Keith; Kendrick, Peter G; Thomas, Melissa L; Horton, Kristin L; O'Connor, Simon M; Downs, Justin; Stoklosa, Richard; Lagdon, Russell; Marks, Barbara; Nairn, Malcolm; Mengersen, Kerrie

    2017-04-10

    Barrow Island, off the north-west coast of Australia, is one of the world's significant conservation areas, harboring marsupials that have become extinct or threatened on mainland Australia as well as a rich diversity of plants and animals, some endemic. Access to construct a Liquefied Natural Gas (LNG) plant, Australia's largest infrastructure development, on the island was conditional on no non-indigenous species (NIS) becoming established. We developed a comprehensive biosecurity system to protect the island's biodiversity. From 2009 to 2015, more than 0.5 million passengers and 12.2 million tonnes of freight were transported to the island under the biosecurity system, requiring 1.5 million hours of inspections. No establishments of NIS were detected. We made four observations that will assist the development of biosecurity systems. Firstly, the frequency of detections of organisms was best described by a log-normal mixture distribution accounting for the high number of zero-detection inspections and the extreme values arising from rare incursions. Secondly, comprehensive knowledge of the island's biota allowed estimation of false positive detections (62% native species). Thirdly, detections at the border did not predict incursions on the island. Fourthly, the workforce detected more than half (59%) of post-border incursions. Similar approaches can and should be implemented for all areas of significant conservation value.
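
    The first observation, detection frequencies following a log-normal mixture with many zeros and rare extremes, can be sketched with a simple zero-inflated log-normal fit (illustrative synthetic data only; the function name and all parameter values are invented, not taken from the study):

```python
import numpy as np

def fit_zero_inflated_lognormal(counts):
    """Maximum-likelihood fit of a zero-inflated log-normal: a point mass
    at zero (inspections detecting nothing) mixed with a log-normal for
    positive detection counts. Returns (p_zero, mu, sigma)."""
    counts = np.asarray(counts, dtype=float)
    nonzero = counts[counts > 0]
    p_zero = 1.0 - nonzero.size / counts.size
    logs = np.log(nonzero)
    return p_zero, logs.mean(), logs.std()

# Synthetic inspection outcomes: ~70% find nothing; positive counts are
# log-normal, giving a heavy right tail of rare, large incursions.
rng = np.random.default_rng(42)
n = 10_000
zero_mask = rng.random(n) < 0.7
counts = np.where(zero_mask, 0.0, rng.lognormal(mean=1.0, sigma=0.5, size=n))
p_zero, mu, sigma = fit_zero_inflated_lognormal(counts)
```

Because the zero component and the log-normal component separate cleanly, the MLE is just the observed zero fraction plus the mean and standard deviation of the log of the positive counts.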

  3. A highly conserved program of neuronal microexons is misregulated in autistic brains.

    Science.gov (United States)

    Irimia, Manuel; Weatheritt, Robert J; Ellis, Jonathan D; Parikshak, Neelroop N; Gonatopoulos-Pournatzis, Thomas; Babor, Mariana; Quesnel-Vallières, Mathieu; Tapial, Javier; Raj, Bushra; O'Hanlon, Dave; Barrios-Rodiles, Miriam; Sternberg, Michael J E; Cordes, Sabine P; Roth, Frederick P; Wrana, Jeffrey L; Geschwind, Daniel H; Blencowe, Benjamin J

    2014-12-18

    Alternative splicing (AS) generates vast transcriptomic and proteomic complexity. However, which of the myriad of detected AS events provide important biological functions is not well understood. Here, we define the largest program of functionally coordinated, neural-regulated AS described to date in mammals. Relative to all other types of AS within this program, 3-15 nucleotide "microexons" display the most striking evolutionary conservation and switch-like regulation. These microexons modulate the function of interaction domains of proteins involved in neurogenesis. Most neural microexons are regulated by the neuronal-specific splicing factor nSR100/SRRM4, through its binding to adjacent intronic enhancer motifs. Neural microexons are frequently misregulated in the brains of individuals with autism spectrum disorder, and this misregulation is associated with reduced levels of nSR100. The results thus reveal a highly conserved program of dynamic microexon regulation associated with the remodeling of protein-interaction networks during neurogenesis, the misregulation of which is linked to autism.

  4. Molecular mediators for raft-dependent endocytosis of syndecan-1, a highly conserved, multifunctional receptor.

    Science.gov (United States)

    Chen, Keyang; Williams, Kevin Jon

    2013-05-17

    Endocytosis via rafts has attracted considerable recent interest, but the molecular mediators remain incompletely characterized. Here, we focused on the syndecan-1 heparan sulfate proteoglycan, a highly conserved, multifunctional receptor that we previously showed to undergo raft-dependent endocytosis upon clustering. Alanine scanning mutagenesis of three to five consecutive cytoplasmic residues at a time revealed that a conserved juxtamembrane motif, MKKK, was the only region required for efficient endocytosis after clustering. Endocytosis of clustered syndecan-1 occurs in two phases, each requiring a kinase and a corresponding cytoskeletal partner. In the initial phase, ligands trigger rapid MKKK-dependent activation of ERK and the localization of syndecan-1 into rafts. Activation of ERK drives the dissociation of syndecan-1 from α-tubulin, a molecule that may act as an anchor for syndecan-1 at the plasma membrane in the basal state. In the second phase, Src family kinases phosphorylate tyrosyl residues within the transmembrane and cytoplasmic regions of syndecan-1, a process that also requires MKKK. Tyrosine phosphorylation of syndecan-1 triggers the robust recruitment of cortactin, which we found to be an essential mediator of efficient actin-dependent endocytosis. These findings represent the first detailed characterization of the molecular events that drive endocytosis of a raft-dependent receptor and identify a novel endocytic motif, MKKK. Moreover, the results provide new tools to study syndecan function and regulation during uptake of its biologically and medically important ligands, such as HIV-1, atherogenic postprandial remnant lipoproteins, and molecules implicated in Alzheimer disease.

  5. Human Cytomegalovirus UL138 Open Reading Frame Is Highly Conserved in Clinical Strains

    Institute of Scientific and Technical Information of China (English)

    Ying Qi; Rong He; Yan-ping Ma; Zheng-rong Sun; Yao-hua Ji; Qiang Ruan

    2009-01-01

    To investigate the variability of the human cytomegalovirus (HCMV) UL138 open reading frame (ORF) in clinical strains. Methods: The HCMV UL138 ORF was amplified by polymerase chain reaction (PCR), the PCR products were sequenced directly, and the data were analyzed in 19 clinical strains. Results: The UL138 ORF was amplified successfully in all 30 clinical strains. Compared with the Toledo strain, the nucleotide and amino acid sequence identities of the UL138 ORF in all strains were 97.41% to 99.41% and 98.24% to 99.42%, respectively. All of the nucleotide mutations were substitutions. The spatial structure and post-translational modification sites of the UL138-encoded protein were conserved. A phylogenetic tree showed that HCMV UL138 sequence variations were not definitely related to different clinical symptoms. Conclusion: The HCMV UL138 ORF is highly conserved in clinical strains, which might help the UL138-encoded protein play a role in latent HCMV infection.

  6. High regional genetic differentiation of an endangered relict plant Craigia yunnanensis and implications for its conservation

    Directory of Open Access Journals (Sweden)

    Jing Yang

    2016-10-01

    Of the genus Craigia, widespread in the Tertiary, only two relict species survive to modern times. One species is now possibly extinct and the other, Craigia yunnanensis, is severely endangered. Extensive surveys have located six C. yunnanensis populations in Yunnan province, southwest China. Using fluorescent amplified fragment length polymorphism (AFLP), the genetic diversity and population structure of these populations were examined. Genetic diversity of C. yunnanensis was moderate at the species level, but low at the regional and population levels. Analysis of population structure showed significant genetic differentiation between the Wenshan and Dehong regions, apparently representing two refuges that were geographically isolated for a long time. There are also clear indications of isolation between populations, which, together with the anthropogenically caused decline in population size, will lead to a general loss of the species' genetic variation, with subsequent loss of adaptive potential. To conserve the genetic integrity of C. yunnanensis, we recommend that ex-situ conservation include representative samples from every population of the two differentiated regions (Wenshan and Dehong). Crosses between individuals originating from different regions should be avoided because of a high risk of outbreeding depression. As all the extant populations of C. yunnanensis are in unprotected areas with strong anthropogenic impact, there is no alternative to reintroduction of C. yunnanensis into suitable protected locations.

  7. Nifty Native Implemented Functions: low-level meets high-level code

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Erlang Native Implemented Functions (NIFs) allow developers to implement functions in C (or C++) rather than Erlang. NIFs are useful for integrating high performance or legacy code in Erlang applications. The talk will cover how to implement NIFs, use cases, and common pitfalls when employing them. Further, we will discuss how and why Erlang applications, such as Riak, use NIFs. About the speaker Ian Plosker is the Technical Lead, International Operations at Basho Technologies, the makers of the open source database Riak. He has been developing software professionally for 10 years and programming since childhood. Prior to working at Basho, he developed everything from CMS to bioinformatics platforms to corporate competitive intelligence management systems. At Basho, he's been helping customers be incredibly successful using Riak.

  8. A Particle In Cell code development for high current ion beam transport and plasma simulations

    CERN Document Server

    Joshi, N

    2016-01-01

    A simulation package employing the Particle in Cell (PIC) method is developed to study high-current beam transport and the dynamics of plasmas. This package includes subroutines suited to various projects planned at the University of Frankfurt. In the framework of the storage ring project (F8SR), the code was written to describe beam optics in toroidal magnetic fields. It is used to design an injection system for a ring with closed magnetic field lines. The generalized numerical model, in Cartesian coordinates, is used to describe intense ion beam transport through the chopper system in the low-energy beam section of the FRANZ project. Especially for the chopper system, the Poisson equation is implemented with irregular geometries. The Particle In Cell model is further upgraded with a Monte Carlo collision subroutine for the simulation of the plasma in a volume-type ion source.
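
    The basic cycle of any electrostatic PIC code — deposit charge on the grid, solve the Poisson equation, gather the field back to the particles, and push — can be sketched in 1D (a generic periodic, normalized-units sketch, not code from the package described here):

```python
import numpy as np

def pic_step(x, v, q, m, ng, L, dt):
    """One leapfrog step of a 1D electrostatic PIC cycle with periodic
    boundaries: CIC deposit -> FFT Poisson solve -> CIC gather -> push."""
    dx = L / ng
    # Deposit: cloud-in-cell (linear) weighting of charge onto the grid.
    g = x / dx
    j = np.floor(g).astype(int) % ng
    w = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, j, q * (1.0 - w) / dx)
    np.add.at(rho, (j + 1) % ng, q * w / dx)
    rho -= rho.mean()                     # uniform neutralizing background
    # Field solve (eps0 = 1): phi_k = rho_k / k^2, so E_k = -i k phi_k = -i rho_k / k.
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]
    E_k[ng // 2] = 0.0                    # drop the unpaired Nyquist mode (ng even)
    E = np.fft.ifft(E_k).real
    # Gather with the same CIC weights, then leapfrog push.
    Ep = E[j] * (1.0 - w) + E[(j + 1) % ng] * w
    v = v + (q / m) * Ep * dt
    x = (x + v * dt) % L
    return x, v

rng = np.random.default_rng(3)
L, ng, dt = 2.0 * np.pi, 64, 0.05
x = rng.uniform(0.0, L, 2000)
v = rng.normal(0.0, 1.0, 2000)
p_before = v.sum()
for _ in range(50):
    x, v = pic_step(x, v, q=-1.0, m=1.0, ng=ng, L=L, dt=dt)
p_after = v.sum()
```

Because the same linear weighting is used for deposit and gather and the spectral field is antisymmetric in k, the self-force cancels and total momentum is conserved to round-off, a standard sanity check for a momentum-conserving PIC scheme.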

  9. Coded Mask Imaging of High Energy X-rays with CZT Detectors

    Science.gov (United States)

    Matteson, J. L.; Dowkontt, P. F.; Duttweiler, F.; Heindl, W. A.; Hink, P. L.; Huszar, G. L.; Kalemci, E.; Leblanc, P. C.; Rothschild, R. E.; Skelton, R. T.; Slavis, K. R.; Stephan, E. A.

    1998-12-01

    Coded mask imagers are appropriate for important objectives of high energy X-ray astronomy, e.g., gamma-ray burst localization, all-sky monitors and surveys, and deep surveys of limited regions. We report results from a coded mask imager developed to establish the proof-of-concept for this technique with CZT detectors. The detector is 2 mm thick with orthogonal crossed strip readout and an advanced electrode design to improve the energy resolution. Each detector face has 22 strip electrodes, and the strip pitch and pixel size are 500 microns. ASIC readout is used and the energy resolution varies from 3 to 6 keV FWHM over the 14 to 184 keV range. A coded mask with 2 x 2 cycles of a 23 x 23 MURA pattern (860 micron unit cell) was built from 600 micron thick tantalum to provide good X-ray modulation up to 200 keV. The detector, mask, and a tiny Gd-153 source of 41 keV X-rays were positioned with a spacing that caused the mask cells in the shadowgram to have a projected size of 1300 microns at the detector. Multiple detector positions were used to measure the shadowgram of a full mask cycle, and this was recorded with 100 percent modulation transfer by the detector, due to its factor of 2.6 oversampling of the mask unit cell and very high strip-to-strip selectivity and spatial accuracy. Deconvolution of the shadowgram produced a correlation image in which the source was detected as a 76-sigma peak with the correct FWHM and base diameter. Off-source image pixels had Gaussian fluctuations that agree closely with the measurement statistics. Off-source image defects, such as might be produced by systematic effects, were too small to be seen and were limited to <0.5 percent of the source peak. These results were obtained with the "raw" shadowgram and image; no "flat fielding" corrections were used.
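
    The decoding step described above — correlating the recorded shadowgram with the mask pattern to form an image — can be demonstrated with the textbook quadratic-residue construction of a MURA (a self-contained sketch, not the instrument's software; the function names are invented). For an ideal on-axis point source the shadowgram is just a copy of the mask, so correlating the mask with its matched decoding array exhibits the delta-like response:

```python
import numpy as np

def mura(p):
    """p x p MURA mask for prime p: entry 1 = open cell, 0 = opaque."""
    qr = {(i * i) % p for i in range(1, p)}           # quadratic residues mod p
    C = np.array([1 if i in qr else -1 for i in range(p)])
    A = np.zeros((p, p), dtype=int)
    A[1:, 0] = 1                                      # first column open, corner closed
    A[1:, 1:] = np.outer(C[1:], C[1:]) == 1
    return A

def decoder(A):
    """Matched decoding array: +1 on open cells, -1 on opaque ones."""
    G = 2 * A - 1
    G[0, 0] = 1
    return G

def correlate(A, G):
    """Periodic cross-correlation = reconstructed image of an on-axis point source."""
    p = A.shape[0]
    return np.array([[np.sum(A * np.roll(G, (i, j), axis=(0, 1)))
                      for j in range(p)] for i in range(p)])

A = mura(23)                      # same rank as the 23 x 23 pattern in the text
img = correlate(A, decoder(A))    # sharp peak at the origin, flat sidelobes
```

The open fraction is close to one half, which is what gives coded masks their photon-collection advantage over a single pinhole while retaining a delta-like point-spread function after decoding.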

  10. An assessment of high carbon stock and high conservation value approaches to sustainable oil palm cultivation in Gabon

    Science.gov (United States)

    Austin, Kemen G.; Lee, Michelle E.; Clark, Connie; Forester, Brenna R.; Urban, Dean L.; White, Lee; Kasibhatla, Prasad S.; Poulsen, John R.

    2017-01-01

    Industrial-scale oil palm cultivation is rapidly expanding in Gabon, where it has the potential to drive economic growth, but also threatens forest, biodiversity and carbon resources. The Gabonese government is promoting an ambitious agricultural expansion strategy, while simultaneously committing to minimize negative environmental impacts of oil palm agriculture. This study estimates the extent and location of suitable land for oil palm cultivation in Gabon, based on an analysis of recent trends in plantation permitting. We use the resulting suitability map to evaluate two proposed approaches to minimizing negative environmental impacts: a High Carbon Stock (HCS) approach, which emphasizes forest protection and climate change mitigation, and a High Conservation Value (HCV) approach, which focuses on safeguarding biodiversity and ecosystems. We quantify the forest area, carbon stock, and biodiversity resources protected under each approach, using newly developed maps of priority species distributions and forest biomass for Gabon. We find 2.7–3.9 million hectares (Mha) of suitable or moderately suitable land that avoid HCS areas, 4.4 Mha that avoid HCV areas, and 1.2–1.7 Mha that avoid both. This suggests that Gabon's oil palm production target could likely be met without compromising important ecosystem services, if appropriate safeguards are put in place. Our analysis improves understanding of suitability for oil palm in Gabon, determines how conservation strategies align with national targets for oil palm production, and informs national land use planning.

  11. Perception and coding of high-frequency spectral notches: potential implications for sound localization.

    Science.gov (United States)

    Alves-Pinto, Ana; Palmer, Alan R; Lopez-Poveda, Enrique A

    2014-01-01

    The interaction of sound waves with the human pinna introduces high-frequency notches (5-10 kHz) in the stimulus spectrum that are thought to be useful for vertical sound localization. A common view is that these notches are encoded as rate profiles in the auditory nerve (AN). Here, we review previously published psychoacoustical evidence in humans and computer-model simulations of inner hair cell responses to noises with and without high-frequency spectral notches that dispute this view. We also present new recordings from guinea pig AN and "ideal observer" analyses of these recordings that suggest that discrimination between noises with and without high-frequency spectral notches is probably based on the information carried in the temporal pattern of AN discharges. The exact nature of the neural code involved remains nevertheless uncertain: computer model simulations suggest that high-frequency spectral notches are encoded in spike timing patterns that may be operant in the 4-7 kHz frequency regime, while "ideal observer" analysis of experimental neural responses suggests that an effective cue for high-frequency spectral discrimination may be based on sampling rates of spike arrivals of AN fibers using non-overlapping time binwidths of between 4 and 9 ms. Neural responses show that sensitivity to high-frequency notches is greater for fibers with low and medium spontaneous rates than for fibers with high spontaneous rates. Based on this evidence, we conjecture that inter-subject variability in high-frequency spectral notch detection and, consequently, in vertical sound localization may partly reflect individual differences in the available number of functional medium- and low-spontaneous-rate fibers.

  13. Use of ancient sedimentary DNA as a novel conservation tool for high-altitude tropical biodiversity.

    Science.gov (United States)

    Boessenkool, Sanne; McGlynn, Gayle; Epp, Laura S; Taylor, David; Pimentel, Manuel; Gizaw, Abel; Nemomissa, Sileshi; Brochmann, Christian; Popp, Magnus

    2014-04-01

    Conservation of biodiversity may in the future increasingly depend upon the availability of scientific information to set suitable restoration targets. In traditional paleoecology, sediment-based pollen provides a means to define preanthropogenic impact conditions, but problems in establishing the exact provenance and ecologically meaningful levels of taxonomic resolution of the evidence are limiting. We explored the extent to which the use of sedimentary ancient DNA (sedaDNA) may complement pollen data in reconstructing past alpine environments in the tropics. We constructed a record of afro-alpine plants retrieved from DNA preserved in sediment cores from 2 volcanic crater sites in the Albertine Rift, eastern Africa. The record extended well beyond the onset of substantial anthropogenic effects on tropical mountains. To ensure high-quality taxonomic inference from the sedaDNA sequences, we built an extensive DNA reference library covering the majority of the afro-alpine flora, by sequencing DNA from taxonomically verified specimens. Comparisons with pollen records from the same sediment cores showed that plant diversity recovered with sedaDNA improved vegetation reconstructions based on pollen records by revealing additional taxa and providing increased taxonomic resolution. Furthermore, combining the 2 measures assisted in distinguishing vegetation change at different geographic scales; sedaDNA almost exclusively reflects local vegetation, whereas pollen can potentially originate from a wide area that in highlands in particular can span several ecozones. Our results suggest that sedaDNA may provide information on restoration targets and on the nature and magnitude of human-induced environmental changes, including in high-conservation-priority biodiversity hotspots, where understanding of preanthropogenic impact (or reference) conditions is highly limited.

  14. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta;

    2011-01-01

    We demonstrate that, by jointly optimizing video coding and radio-over-fibre transmission, we extend the reach of 60-GHz wireless distribution of high-quality high-definition video satisfying low complexity and low delay constraints, while preserving superb video quality.

  15. Efficient Symbol Sorting for High Intermediate Recovery Rate of LT Codes

    CERN Document Server

    Talari, Ali; Rahnavard, Nazanin

    2010-01-01

    LT codes are modern and efficient rateless forward error correction (FEC) codes with performance close to channel capacity. Nevertheless, in the intermediate range, where the number of received encoded symbols is less than the number of source symbols, LT codes have very low recovery rates. In this paper, we propose a novel algorithm which significantly increases the intermediate recovery rate of LT codes while preserving their near-capacity performance. To increase the intermediate recovery rate, our proposed algorithm rearranges the transmission order of the encoded symbols by exploiting their structure, their transmission history, and an estimate of the channel's erasure rate. We implement our algorithm for conventional LT codes and numerically evaluate its performance.
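As background to the reordering idea, the LT encoding step itself is compact. The sketch below is a generic LT encoder with the standard robust-soliton degree distribution, not the authors' reordering algorithm; the parameter names `c` and `delta` follow the usual convention for that distribution.

```python
import math
import random

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution over degrees 0..k (standard construction)."""
    R = c * math.log(k / delta) * math.sqrt(k)
    rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    tau = [0.0] * (k + 1)
    for d in range(1, k + 1):
        if d < int(k / R):
            tau[d] = R / (d * k)
        elif d == int(k / R):
            tau[d] = R * math.log(R / delta) / k
    Z = sum(rho) + sum(tau)
    return [(rho[d] + tau[d]) / Z for d in range(k + 1)]

def lt_encode(source, rng):
    """One LT-encoded symbol: draw a degree, XOR that many distinct source symbols.
    Returns the chosen neighbor indices and the encoded symbol."""
    k = len(source)
    degree = rng.choices(range(k + 1), weights=robust_soliton(k))[0]
    idx = rng.sample(range(k), degree)
    sym = 0
    for i in idx:
        sym ^= source[i]
    return idx, sym
```

In an actual transmitter the distribution would be computed once, and the reordering strategy of the paper would then decide *which* of these encoded symbols to send first.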

  16. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    Science.gov (United States)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel, which is then used to modify the probability distribution function (PDF) employed in the initialization of the belief propagation (BP) decoding algorithm. The algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system; the results showed that the watermarking LDPC code had better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Moreover, at the cost of about 2.4% extra redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
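The abstract does not give the estimator itself, but the underlying idea, using known transmitted bits to gauge the channel noise and then seeding BP with channel LLRs, can be sketched as follows. This assumes BPSK over AWGN (bit 0 → +1, bit 1 → −1); the function names are hypothetical, not from the paper.

```python
import math

def estimate_sigma(received_watermark, known_bits):
    """Estimate the AWGN noise standard deviation from received samples at
    positions whose transmitted (watermark) bits are known a priori."""
    errs = [r - (1.0 if b == 0 else -1.0)
            for r, b in zip(received_watermark, known_bits)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

def initial_llrs(received, sigma):
    """Channel LLRs used to initialize belief-propagation decoding."""
    return [2.0 * r / (sigma ** 2) for r in received]
```

A better sigma estimate sharpens the initial PDF, which is the mechanism the paper exploits to speed up BP convergence.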

  17. A new computer code to evaluate detonation performance of high explosives and their thermochemical properties, part I.

    Science.gov (United States)

    Keshavarz, Mohammad Hossein; Motamedoshariati, Hadi; Moghayadnia, Reza; Nazari, Hamid Reza; Azarniamehraban, Jamshid

    2009-12-30

    In this paper a new, simple, user-friendly computer code, written in Visual Basic, is introduced to evaluate the detonation performance of high explosives and their thermochemical properties. The code is based on recently developed methods for obtaining thermochemical and performance parameters of energetic materials, and can complement the outputs of other thermodynamic chemical equilibrium codes. It can predict various important properties of high explosives, including velocity of detonation, detonation pressure, heat of detonation, detonation temperature, Gurney velocity, adiabatic exponent and specific impulse. It can also predict the detonation performance of aluminized explosives, which can exhibit non-ideal behavior. The code has been validated against well-known standard explosives, and its predictions, where possible, were compared with the outputs of other computer codes. A large amount of detonation performance data for different classes of explosives with C-NO(2), O-NO(2) and N-NO(2) energetic groups has also been generated and compared with the well-known complex code BKW.
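The paper's own correlations are not reproduced in the abstract. As an illustration of the kind of closed-form estimate such codes implement, the classic Kamlet-Jacobs relations for CHNO explosives (a common baseline that newer empirical methods are compared against, not the method of this paper) can be coded directly:

```python
import math

def kamlet_jacobs(rho0, N, Mbar, Q):
    """Kamlet-Jacobs estimates for CHNO explosives.
    rho0: loading density (g/cm^3); N: moles of gaseous products per gram;
    Mbar: mean molar mass of the gaseous products (g/mol);
    Q: heat of detonation (cal/g).
    Returns (detonation velocity D in km/s, C-J pressure P in kbar)."""
    phi = N * math.sqrt(Mbar) * math.sqrt(Q)
    D = 1.01 * math.sqrt(phi) * (1.0 + 1.30 * rho0)
    P = 15.58 * rho0 ** 2 * phi
    return D, P
```

With approximate literature values for RDX (rho0 = 1.80 g/cm³, N = 0.0338, Mbar = 27.2, Q ≈ 1500 cal/g) this yields a detonation velocity near the measured ~8.8 km/s, which is the level of agreement such simple codes aim for.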

  18. A new computer code to evaluate detonation performance of high explosives and their thermochemical properties, part I

    Energy Technology Data Exchange (ETDEWEB)

    Keshavarz, Mohammad Hossein, E-mail: mhkeshavarz@mut-es.ac.ir [Department of Chemistry, Malek-ashtar University of Technology, Shahin-shahr P.O. Box 83145/115 (Iran, Islamic Republic of); Motamedoshariati, Hadi; Moghayadnia, Reza; Nazari, Hamid Reza; Azarniamehraban, Jamshid [Department of Chemistry, Malek-ashtar University of Technology, Shahin-shahr P.O. Box 83145/115 (Iran, Islamic Republic of)

    2009-12-30

    In this paper a new, simple, user-friendly computer code, written in Visual Basic, is introduced to evaluate the detonation performance of high explosives and their thermochemical properties. The code is based on recently developed methods for obtaining thermochemical and performance parameters of energetic materials, and can complement the outputs of other thermodynamic chemical equilibrium codes. It can predict various important properties of high explosives, including velocity of detonation, detonation pressure, heat of detonation, detonation temperature, Gurney velocity, adiabatic exponent and specific impulse. It can also predict the detonation performance of aluminized explosives, which can exhibit non-ideal behavior. The code has been validated against well-known standard explosives, and its predictions, where possible, were compared with the outputs of other computer codes. A large amount of detonation performance data for different classes of explosives with C-NO{sub 2}, O-NO{sub 2} and N-NO{sub 2} energetic groups has also been generated and compared with the well-known complex code BKW.

  19. High Resolution Euler Solvers Based on the Space-Time Conservation Element and Solution Element Method

    Science.gov (United States)

    Wang, Xiao-Yen; Chow, Chuen-Yen; Chang, Sin-Chung

    1996-01-01

    The 1-D, quasi-1-D and 2-D Euler solvers based on the method of space-time conservation element and solution element are used to simulate various flow phenomena including shock waves, Mach stem, contact surface, expansion waves, and their intersections and reflections. Seven test problems are solved to demonstrate the capability of this method for handling unsteady compressible flows in various configurations. Numerical results so obtained are compared with exact solutions and/or numerical solutions obtained by schemes based on other established computational techniques. Comparisons show that the present Euler solvers can generate highly accurate numerical solutions to complex flow problems in a straightforward manner without using any ad hoc techniques in the scheme.
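For contrast with the CE/SE method, the simplest conservative finite-volume update for a 1-D scalar conservation law u_t + f(u)_x = 0 can be written in a few lines. This is a baseline Lax-Friedrichs sketch, not the CE/SE scheme itself, and it assumes periodic boundaries; it illustrates the telescoping-flux property that makes such schemes exactly conservative.

```python
def lax_friedrichs_step(u, dt, dx, flux):
    """One conservative finite-volume step with the Lax-Friedrichs numerical
    flux, periodic boundary conditions. `u` is the list of cell averages."""
    n = len(u)
    a = dx / dt  # classical LxF dissipation speed
    # numerical flux at interface i+1/2
    F = [0.5 * (flux(u[i]) + flux(u[(i + 1) % n]))
         - 0.5 * a * (u[(i + 1) % n] - u[i])
         for i in range(n)]
    # conservative update: each interface flux enters two cells with opposite sign
    return [u[i] - dt / dx * (F[i] - F[i - 1]) for i in range(n)]
```

Because every interface flux F[i] is added to one cell and subtracted from its neighbor, the total of the cell averages is preserved to round-off, which is the defining property shared by the CE/SE solvers above.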

  20. Mapping the transcription repressive domain in the highly conserved human gene hnulp1

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    HNULP1, a new member of the basic helix-loop-helix transcription factors, contains a DUF654 domain in its C-terminus and is highly conserved from Drosophila, yeast and zebrafish to mouse. The function of this motif, however, is currently unknown. In this research, we fused five deletion fragments of the DUF654 domain to the GAL4 DNA-binding domain and then co-transfected them with the plasmids L8G5-Luc and VP-16. Analysis of the GAL4 luciferase reporter gene indicated that the fragment spanning amino acids 228 to 407 of the DUF654 domain had strong transcription repression activity. This study therefore lays a solid foundation for research on the mechanism of hnulp1 transcriptional regulation and the function of the DUF654 domain.

  1. High-order conservative reconstruction schemes for finite volume methods in cylindrical and spherical coordinates

    CERN Document Server

    Mignone, A

    2014-01-01

    High-order reconstruction schemes for the solution of hyperbolic conservation laws in orthogonal curvilinear coordinates are revised in the finite volume approach. The formulation employs a piecewise polynomial approximation to the zone-average values to reconstruct left and right interface states from within a computational zone to arbitrary order of accuracy by inverting a Vandermonde-like linear system of equations with spatially varying coefficients. The approach is general and can be used on uniform and non-uniform meshes although explicit expressions are derived for polynomials from second to fifth degree in cylindrical and spherical geometries with uniform grid spacing. It is shown that, in regions of large curvature, the resulting expressions differ considerably from their Cartesian counterparts and that the lack of such corrections can severely degrade the accuracy of the solution close to the coordinate origin. Limiting techniques and monotonicity constraints are revised for conventional reconstruct...
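In the uniform Cartesian case, the Vandermonde-like system mentioned above reduces to a small linear solve for the reconstruction weights. The sketch below handles plain Cartesian cell averages; the cylindrical and spherical cases treated in the paper use volume-weighted averages instead, which changes the matrix entries but not the idea.

```python
import numpy as np

def reconstruction_weights(centers, h, x_eval):
    """Weights w such that p(x_eval) = sum_j w[j] * avg[j], where p is the unique
    polynomial whose average over each uniform cell (center c_j, width h)
    matches avg[j]. Solves the Vandermonde-like system of cell-average moments."""
    n = len(centers)
    A = np.empty((n, n))
    for j, c in enumerate(centers):
        a, b = c - h / 2.0, c + h / 2.0
        for k in range(n):
            # average of x^k over cell j
            A[j, k] = (b ** (k + 1) - a ** (k + 1)) / ((k + 1) * h)
    # p(x_eval) = e @ coeffs with coeffs = A^{-1} @ avgs, so weights = e @ A^{-1}
    e = np.array([x_eval ** k for k in range(n)])
    return e @ np.linalg.inv(A)
```

For a three-cell stencil on a uniform grid, evaluating at the right interface recovers the familiar third-order weights (-1/6, 5/6, 1/3).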

  2. Highly Conserved Elements and Chromosome Structure Evolution in Mitochondrial Genomes in Ciliates

    Directory of Open Access Journals (Sweden)

    Roman A. Gershgorin

    2017-02-01

    Full Text Available Recent phylogenetic analyses are incorporating ultraconserved elements (UCEs) and highly conserved elements (HCEs). Models of evolution of the genome structure and HCEs initially faced considerable algorithmic challenges, which gave rise to (often unnatural) constraints on these models, even for conceptually simple tasks such as the calculation of distance between two structures or the identification of UCEs. In our recent works, these constraints have been addressed with fast and efficient solutions with no constraints on the underlying models. These approaches have led us to an unexpected result: for some organelles and taxa, the genome structure and HCE set, despite themselves containing relatively little information, still adequately resolve the evolution of species. We also used the HCE identification to search for promoters and regulatory elements that characterize the functional evolution of the genome.

  3. Particle-number conserving analysis of the high-spin structure of $^{159}$Ho

    CERN Document Server

    Zhang, Zhen-Hua

    2016-01-01

    The high-spin rotational bands in the odd-$Z$ nucleus $^{159}$Ho ($Z=67$) are investigated using the cranked shell model with the pairing correlations treated by a particle-number conserving method, in which the blocking effects are taken into account exactly. The experimental moments of inertia and alignments and their variations with the rotational frequency $\hbar\omega$ are reproduced very well by the calculations. The splitting between the signature partners of the yrast band $7/2^-[523]$ is discussed, and the splitting of the excited band $7/2^+[404]$ above $\hbar\omega \sim 0.30$ MeV is predicted due to the level crossing with $1/2^+[411]$. The calculated $B(E2)$ transition probabilities are also suggested for future experiments.

  4. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
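The three parameters mentioned, annual energy saving, cost saving, and payback period, follow from elementary relations between motor output power, efficiency, running hours and tariff. A sketch with illustrative inputs (the numbers below are hypothetical, not taken from the data article):

```python
def motor_savings(p_out_kw, eff_std, eff_he, hours_per_year, tariff, price_premium):
    """Simple economics of replacing a standard-efficiency motor with a
    high-efficiency one of the same rating.
    p_out_kw: mechanical output power (kW); eff_* : efficiencies as fractions;
    tariff: electricity price per kWh; price_premium: extra purchase cost.
    Returns (annual kWh saved, annual cost saved, simple payback in years)."""
    input_std = p_out_kw / eff_std           # electrical input, standard motor
    input_he = p_out_kw / eff_he             # electrical input, high-efficiency motor
    energy_saved = (input_std - input_he) * hours_per_year   # kWh/year
    cost_saved = energy_saved * tariff
    payback_years = price_premium / cost_saved
    return energy_saved, cost_saved, payback_years
```

For example, an 11 kW motor running 6000 h/year at efficiencies of 88% vs. 92% saves roughly 3260 kWh/year, so a modest price premium pays back in well under a year, the kind of result the confidence-bound analysis in [1] puts error bars on.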

  5. RRTMGP: A High-Performance Broadband Radiation Code for the Next Decade

    Science.gov (United States)

    2015-09-30

    the PI of this project, and his team at AER includes programmers with experience coding for modern computer architectures, including the recent GPU ...Supercomputer Center (CSCS) in Lugano will be developing a GPU version (OpenACC) of this code for use in the ICON LES model. This version will provide a...significant foundation for the GPU version of our code that is a deliverable for this project. Andre Wehe of AER will spend the first week in November

  6. Highly conserved type 1 pili promote enterotoxigenic E. coli pathogen-host interactions.

    Directory of Open Access Journals (Sweden)

    Alaullah Sheikh

    2017-05-01

    Full Text Available Enterotoxigenic Escherichia coli (ETEC), defined by their elaboration of heat-labile (LT) and/or heat-stable (ST) enterotoxins, are a common cause of diarrheal illness in developing countries. Efficient delivery of these toxins requires ETEC to engage target host enterocytes. This engagement is accomplished using a variety of pathovar-specific and conserved E. coli adhesin molecules as well as plasmid encoded colonization factors. Some of these adhesins undergo significant transcriptional modulation as ETEC encounter intestinal epithelia, perhaps suggesting that they cooperatively facilitate interaction with the host. Among genes significantly upregulated on cell contact are those encoding type 1 pili. We therefore investigated the role played by these pili in facilitating ETEC adhesion, and toxin delivery to model intestinal epithelia. We demonstrate that type 1 pili, encoded in the E. coli core genome, play an essential role in ETEC virulence, acting in concert with plasmid-encoded pathovar specific colonization factor (CF) fimbriae to promote optimal bacterial adhesion to cultured intestinal epithelium (CIE) and to epithelial monolayers differentiated from human small intestinal stem cells. Type 1 pili are tipped with the FimH adhesin which recognizes mannose with stereochemical specificity. Thus, enhanced production of highly mannosylated proteins on intestinal epithelia promoted FimH-mediated ETEC adhesion, while conversely, interruption of FimH lectin-epithelial interactions with soluble mannose, anti-FimH antibodies or mutagenesis of fimH effectively blocked ETEC adhesion. Moreover, fimH mutants were significantly impaired in delivery of both heat-stable and heat-labile toxins to the target epithelial cells in vitro, and these mutants were substantially less virulent in rabbit ileal loop assays, a classical model of ETEC pathogenesis. 
Collectively, our data suggest that these highly conserved pili play an essential role in virulence of these

  7. The Student Opinions Concerning Freedom of Dress Code Including High Schools Among Others

    Directory of Open Access Journals (Sweden)

    Ahmet Akbaba

    2014-04-01

    Full Text Available This study aimed at examining the opinions of high school students concerning the much-debated dress code applied to primary school, middle school, and high school students and thought to affect quality in education, and at revealing the importance of the issue as well as its financial, social, and pedagogical dimensions. The research is a descriptive study in survey model. A Likert-type questionnaire developed by the researcher was used as the data collection tool. The questionnaire was administered to 350 students attending 15 high schools located in the central district of Van province. Based on the research results, it can be concluded that free dress is a right and a requirement of the age even though it causes extra cost, takes up students' time, or accentuates rich-poor discrimination. As a result of the study, it can be said that freedom of dress is in essence a right.

  8. High-Fidelity Buckling Analysis of Composite Cylinders Using the STAGS Finite Element Code

    Science.gov (United States)

    Hilburger, Mark W.

    2014-01-01

    Results from previous shell buckling studies are presented that illustrate some of the unique and powerful capabilities in the STAGS finite element analysis code that have made it an indispensable tool in structures research at NASA over the past few decades. In particular, prototypical results from the development and validation of high-fidelity buckling simulations are presented for several unstiffened thin-walled compression-loaded graphite-epoxy cylindrical shells along with a discussion on the specific methods and user-defined subroutines in STAGS that are used to carry out the high-fidelity simulations. These simulations accurately account for the effects of geometric shell-wall imperfections, shell-wall thickness variations, local shell-wall ply-gaps associated with the fabrication process, shell-end geometric imperfections, nonuniform applied end loads, and elastic boundary conditions. The analysis procedure uses a combination of nonlinear quasi-static and transient dynamic solution algorithms to predict the prebuckling and unstable collapse response characteristics of the cylinders. Finally, the use of high-fidelity models in the development of analysis-based shell-buckling knockdown (design) factors is demonstrated.

  9. Evaluation of in-network adaptation of scalable high efficiency video coding (SHVC) in mobile environments

    Science.gov (United States)

    Nightingale, James; Wang, Qi; Grecos, Christos; Goma, Sergio

    2014-02-01

    High Efficiency Video Coding (HEVC), the latest video compression standard (also known as H.265), can deliver video streams of comparable quality to the current H.264 Advanced Video Coding (H.264/AVC) standard with a 50% reduction in bandwidth. Research into SHVC, the scalable extension to the HEVC standard, is still in its infancy. One important area for investigation is whether, given the greater compression ratio of HEVC (and SHVC), the loss of packets containing video content will have a greater impact on the quality of delivered video than is the case with H.264/AVC or its scalable extension H.264/SVC. In this work we empirically evaluate the layer-based, in-network adaptation of video streams encoded using SHVC in situations where dynamically changing bandwidths and datagram loss ratios require the real-time adaptation of video streams. Through the use of extensive experimentation, we establish a comprehensive set of benchmarks for SHVC-based high-definition video streaming in loss prone network environments such as those commonly found in mobile networks. Among other results, we highlight that packet losses of only 1% can lead to a substantial reduction in PSNR of over 3 dB and error propagation in over 130 pictures following the one in which the loss occurred. This work would be one of the earliest studies in this cutting-edge area that reports benchmark evaluation results for the effects of datagram loss on SHVC picture quality and offers empirical and analytical insights into SHVC adaptation to lossy, mobile networking conditions.
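PSNR, the quality metric reported above, is computed from the mean squared error between the reference and decoded frames; a minimal sketch for 8-bit video (frames given here as flat pixel lists for simplicity):

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equally-sized frames.
    Returns infinity for identical frames (zero mean squared error)."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)
```

A drop of "over 3 dB", as reported for 1% packet loss, corresponds to the MSE roughly doubling, which is why such losses are visually significant.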

  10. Fast Binary Coding for the Scene Classification of High-Resolution Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    Fan Hu

    2016-06-01

    Full Text Available Scene classification of high-resolution remote sensing (HRRS) imagery is an important task in the intelligent processing of remote sensing images and has attracted much attention in recent years. Although the existing scene classification methods, e.g., the bag-of-words (BOW) model and its variants, can achieve acceptable performance, these approaches strongly rely on the extraction of local features and the complicated coding strategy, which are usually time consuming and demand much expert effort. In this paper, we propose a fast binary coding (FBC) method, to effectively generate efficient discriminative scene representations of HRRS images. The main idea is inspired by the unsupervised feature learning technique and the binary feature descriptions. More precisely, equipped with the unsupervised feature learning technique, we first learn a set of optimal “filters” from large quantities of randomly-sampled image patches and then obtain feature maps by convolving the image scene with the learned filters. After binarizing the feature maps, we perform a simple hashing step to convert the binary-valued feature map to the integer-valued feature map. Finally, statistical histograms computed on the integer-valued feature map are used as global feature representations of the scenes of HRRS images, similar to the conventional BOW model. The analysis of the algorithm complexity and experiments on HRRS image datasets demonstrate that, in contrast with existing scene classification approaches, the proposed FBC has much faster computational speed and achieves comparable classification performance. In addition, we also propose two extensions to FBC, i.e., the spatial co-occurrence matrix and different visual saliency maps, for further improving its final classification accuracy.
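The convolve-binarize-hash-histogram pipeline described above can be sketched compactly. This is an illustrative reading of the abstract, not the authors' implementation; in particular, the filters here could come from any unsupervised learner, and the loops are written for clarity rather than speed.

```python
import numpy as np

def fbc_features(image, filters, n_bins=None):
    """Sketch of a fast-binary-coding descriptor: correlate the image with a
    stack of (n, k, k) filters, binarize each response map at zero, pack the
    per-pixel bit-vector into an integer code (the hashing step), and
    histogram the codes as the global scene representation."""
    n, k, _ = filters.shape
    H, W = image.shape
    out_h, out_w = H - k + 1, W - k + 1
    codes = np.zeros((out_h, out_w), dtype=np.int64)
    for f in range(n):
        # valid-mode 2-D correlation with filter f
        resp = np.zeros((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                resp[i, j] = np.sum(image[i:i + k, j:j + k] * filters[f])
        codes |= (resp > 0).astype(np.int64) << f  # set bit f where response > 0
    n_bins = n_bins or 2 ** n
    hist, _ = np.histogram(codes, bins=np.arange(n_bins + 1))
    return hist
```

With n filters each pixel gets an n-bit code, so the descriptor length is 2**n, which is why the method keeps n small.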

  11. Chirp-coded excitation imaging with a high-frequency ultrasound annular array.

    Science.gov (United States)

    Mamou, Jonathan; Ketterling, Jeffrey A; Silverman, Ronald H

    2008-02-01

    High-frequency ultrasound (HFU, > 15 MHz) is an effective means of obtaining fine-resolution images of biological tissues for applications such as ophthalmologic, dermatologic, and small animal imaging. HFU has two inherent drawbacks. First, HFU images have a limited depth of field (DOF) because of the short wavelength and the low fixed F-number of conventional HFU transducers. Second, HFU can be used to image only a few millimeters deep into a tissue because attenuation increases with frequency. In this study, a five-element annular array was used in conjunction with a synthetic-focusing algorithm to extend the DOF. The annular array had an aperture of 10 mm, a focal length of 31 mm, and a center frequency of 17 MHz. To increase penetration depth, 8-μs, chirp-coded signals were designed, input into an arbitrary waveform generator, and used to excite each array element. After data acquisition, the received signals were linearly filtered to restore axial resolution and increase the SNR. To compare the chirp-coded imaging method with conventional impulse imaging in terms of resolution, a 25-μm diameter wire was scanned and the -6-dB axial and lateral resolutions were computed at depths ranging from 20.5 to 40.5 mm. The results demonstrated that chirp-coded excitation did not degrade axial or lateral resolution. A tissue-mimicking phantom containing 10-μm glass beads was scanned, and backscattered signals were analyzed to evaluate SNR and penetration depth. Finally, ex vivo ophthalmic images were formed and chirp-coded images showed features that were not visible in conventional impulse images.
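The core of chirp-coded excitation is transmitting a long linear-FM pulse and compressing the echo with a matched (correlation) filter, which restores axial resolution while the long pulse carries more energy. A minimal sketch, with parameters loosely in the range mentioned above (the exact waveform design in the study is more sophisticated):

```python
import numpy as np

def chirp(duration, f0, f1, fs):
    """Linear FM chirp sweeping f0 -> f1 Hz over `duration` seconds at sample rate fs."""
    n = int(round(duration * fs))
    t = np.arange(n) / fs
    k = (f1 - f0) / duration  # sweep rate, Hz/s
    return np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

def pulse_compress(rx, tx):
    """Matched filtering: correlate the received trace with the transmitted chirp.
    The compressed peak marks the echo arrival with near impulse-like resolution."""
    return np.correlate(rx, tx, mode="same")
```

Compressing the chirp against itself produces a sharp peak at its center whose height equals the pulse energy, which is the SNR gain the paper exploits.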

  12. Butterflies of the high altitude Atacama Desert: habitat use and conservation

    Directory of Open Access Journals (Sweden)

    Emma eDespland

    2014-09-01

    Full Text Available The butterfly fauna of the high-altitude desert of Northern Chile, though depauperate, shows high endemism, is poorly known and is of considerable conservation concern. This study surveys butterflies along the Andean slope between 2400 and 5000 m asl (prepuna, puna and Andean steppe habitats) as well as in high- and low-altitude wetlands and in the neoriparian vegetation of agricultural sites. We also include historical sightings from museum records. We compare abundances between altitudes, between natural and impacted sites, as well as between two sampling years with different precipitation regimes. The results confirm high altitudinal turnover and show greatest similarity between wetland and slope faunas at similar altitudes. Results also underscore vulnerability to weather fluctuations, particularly in the more arid low-altitude sites, where abundances were much lower in the low precipitation sampling season and several species were not observed at all. Finally, we show that some species have shifted to the neoriparian vegetation of the agricultural landscape, whereas others were only observed in less impacted habitats dominated by native plants. These results suggest that acclimation to novel habitats depends on larval host plant use. The traditional agricultural environment can provide habitat for many, but not all, native butterfly species, but an estimation of the value of these habitats requires better understanding of butterfly life-history strategies and relationships with host plants.

  13. Butterflies of the high-altitude Atacama Desert: habitat use and conservation

    Science.gov (United States)

    Despland, Emma

    2014-01-01

    The butterfly fauna of the high-altitude desert of Northern Chile, though depauperate, shows high endemism, is poorly known and is of considerable conservation concern. This study surveys butterflies along the Andean slope between 2400 and 5000 m asl (prepuna, puna and Andean steppe habitats) as well as in high and low-altitude wetlands and in the neoriparian vegetation of agricultural sites. We also include historical sightings from museum records. We compare abundances between altitudes, between natural and impacted sites, as well as between two sampling years with different precipitation regimes. The results confirm high altitudinal turnover and show greatest similarity between wetland and slope faunas at similar altitudes. Results also underscore vulnerability to weather fluctuations, particularly in the more arid low-altitude sites, where abundances were much lower in the low precipitation sampling season and several species were not observed at all. Finally, we show that some species have shifted to the neoriparian vegetation of the agricultural landscape, whereas others were only observed in less impacted habitats dominated by native plants. These results suggest that acclimation to novel habitats depends on larval host plant use. The traditional agricultural environment can provide habitat for many, but not all, native butterfly species, but an estimation of the value of these habitats requires better understanding of butterfly life history strategies and relationships with host plants. PMID:25309583

  14. The importance of incorporating functional habitats into conservation planning for highly mobile species in dynamic systems.

    Science.gov (United States)

    Webb, Matthew H; Terauds, Aleks; Tulloch, Ayesha; Bell, Phil; Stojanovic, Dejan; Heinsohn, Robert

    2017-10-01

    The distribution of mobile species in dynamic systems can vary greatly over time and space. Estimating their population size and geographic range can be problematic and affect the accuracy of conservation assessments. Scarce data on mobile species and the resources they need can also limit the type of analytical approaches available to derive such estimates. We quantified change in availability and use of key ecological resources required for breeding for a critically endangered nomadic habitat specialist, the Swift Parrot (Lathamus discolor). We compared estimates of occupied habitat derived from dynamic presence-background (i.e., presence-only data) climatic models with estimates derived from dynamic occupancy models that included a direct measure of food availability. We then compared estimates that incorporate fine-resolution spatial data on the availability of key ecological resources (i.e., functional habitats) with more common approaches that focus on broader climatic suitability or vegetation cover (due to the absence of fine-resolution data). The occupancy models produced significantly (P < 0.001) smaller (up to an order of magnitude) and more spatially discrete estimates of the total occupied area than climate-based models. The spatial location and extent of the total area occupied with the occupancy models was highly variable between years (131 and 1498 km²). Estimates accounting for the area of functional habitats were significantly smaller (2-58% [SD 16]) than estimates based only on the total area occupied. An increase or decrease in the area of one functional habitat (foraging or nesting) did not necessarily correspond to an increase or decrease in the other. Thus, an increase in the extent of occupied area may not equate to improved habitat quality or function. 
We argue these patterns are typical for mobile resource specialists but often go unnoticed because of limited data over relevant spatial and temporal scales and lack of spatial data on the

  15. FPGA-Based Channel Coding Architectures for 5G Wireless Using High-Level Synthesis

    Directory of Open Access Journals (Sweden)

    Swapnil Mhaske

    2017-01-01

    Full Text Available We propose strategies to achieve a high-throughput FPGA architecture for quasi-cyclic low-density parity-check codes based on circulant-1 identity matrix construction. By splitting the node processing operation in the min-sum approximation algorithm, we achieve pipelining in the layered decoding schedule without utilizing additional hardware resources. High-level synthesis compilation is used to design and develop the architecture on the FPGA hardware platform. To validate this architecture, an IEEE 802.11n compliant 608 Mb/s decoder is implemented on the Xilinx Kintex-7 FPGA using the LabVIEW FPGA Compiler in the LabVIEW Communication System Design Suite. Architecture scalability was leveraged to accomplish a 2.48 Gb/s decoder on a single Xilinx Kintex-7 FPGA. Further, we present rapidly prototyped experimentation of an IEEE 802.16 compliant hybrid automatic repeat request system based on the efficient decoder architecture developed. In spite of the mixed nature of data processing—digital signal processing and finite-state machines—LabVIEW FPGA Compiler significantly reduced time to explore the system parameter space and to optimize in terms of error performance and resource utilization. A 4x improvement in the system throughput, relative to a CPU-based implementation, was achieved to measure the error-rate performance of the system over large, realistic data sets using accelerated, in-hardware simulation.
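The node-processing operation that the architecture splits is the check-node update of the min-sum approximation. In scalar form it is simple: each outgoing message takes the sign product and the minimum magnitude of all the *other* incoming messages. A reference sketch (the FPGA design pipelines exactly this computation across layers):

```python
def check_node_update(llrs_in):
    """Min-sum check-node update for one parity check in LDPC decoding.
    llrs_in: incoming variable-to-check messages (log-likelihood ratios).
    Returns the outgoing check-to-variable messages."""
    n = len(llrs_in)
    out = []
    for i in range(n):
        others = [llrs_in[j] for j in range(n) if j != i]
        sign = 1.0
        for v in others:
            sign *= 1.0 if v >= 0 else -1.0
        # min-sum: magnitude is the minimum of the other magnitudes
        out.append(sign * min(abs(v) for v in others))
    return out
```

Hardware implementations avoid the per-output loop by tracking only the two smallest magnitudes and the overall sign product, which is what makes the layered schedule cheap to pipeline.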

  16. Computer code to predict the heat of explosion of high energy materials

    Energy Technology Data Exchange (ETDEWEB)

    Muthurajan, H. [Armament Research and Development Establishment, Pashan, Pune 411021 (India)], E-mail: muthurajan_h@rediffmail.com; Sivabalan, R.; Pon Saravanan, N.; Talawar, M.B. [High Energy Materials Research Laboratory, Sutarwadi, Pune 411 021 (India)

    2009-01-30

    The computational approach to the thermochemical changes involved in the process of explosion of high energy materials (HEMs) vis-a-vis their molecular structure aids HEMs chemists/engineers in predicting important thermodynamic parameters such as the heat of explosion of the HEMs. Such computer-aided design is useful in predicting the performance of a given HEM as well as in conceiving futuristic high energy molecules that have significant potential in the field of explosives and propellants. The software code LOTUSES, developed by the authors, predicts various characteristics of HEMs such as explosion products (including balanced explosion reactions), density of HEMs, velocity of detonation, CJ pressure, etc. The new computational approach described in this paper allows the prediction of the heat of explosion (ΔH(e)) without any experimental data for different HEMs, and the predictions are comparable with experimental results reported in the literature. The new algorithm, which does not require any complex input parameters, is incorporated in LOTUSES (version 1.5) and the results are presented in this paper. Linear regression analysis of all data points yields the correlation coefficient R² = 0.9721 with the linear equation y = 0.9262x + 101.45. The correlation coefficient value of 0.9721 reveals that the computed values are in good agreement with experimental values and useful for rapid hazard assessment of energetic materials.
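
The quoted correlation between computed and experimental heats of explosion is an ordinary least-squares fit; the sketch below shows how such a slope, intercept, and R² are obtained. The data pairs are hypothetical placeholders, not values from the paper.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b, returning (a, b, R^2)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot         # coefficient of determination
    return a, b, r2

# Hypothetical (computed, experimental) heat-of-explosion pairs, kJ/kg
xs = [3000.0, 4000.0, 5000.0, 6000.0]
ys = [2890.0, 3790.0, 4760.0, 5650.0]
a, b, r2 = linear_fit(xs, ys)
```

An R² close to 1, as reported in the abstract, means the computed values track the experimental ones up to a nearly fixed linear rescaling.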

  17. Computer code to predict the heat of explosion of high energy materials.

    Science.gov (United States)

    Muthurajan, H; Sivabalan, R; Pon Saravanan, N; Talawar, M B

    2009-01-30

    The computational approach to the thermochemical changes involved in the process of explosion of high energy materials (HEMs) vis-à-vis their molecular structure aids HEMs chemists/engineers in predicting important thermodynamic parameters such as the heat of explosion of the HEMs. Such computer-aided design is useful in predicting the performance of a given HEM as well as in conceiving futuristic high energy molecules that have significant potential in the field of explosives and propellants. The software code LOTUSES, developed by the authors, predicts various characteristics of HEMs such as explosion products (including balanced explosion reactions), density of HEMs, velocity of detonation, CJ pressure, etc. The new computational approach described in this paper allows the prediction of the heat of explosion (ΔH(e)) without any experimental data for different HEMs, and the predictions are comparable with experimental results reported in the literature. The new algorithm, which does not require any complex input parameters, is incorporated in LOTUSES (version 1.5) and the results are presented in this paper. Linear regression analysis of all data points yields the correlation coefficient R² = 0.9721 with the linear equation y = 0.9262x + 101.45. The correlation coefficient value of 0.9721 reveals that the computed values are in good agreement with experimental values and useful for rapid hazard assessment of energetic materials.

  18. Propagation of extragalactic photons at ultra-high energy with the EleCa code

    CERN Document Server

    Settimo, Mariangela

    2013-01-01

    Ultra-high energy (UHE) photons play an important role as an independent probe of the photo-pion production mechanism by UHE cosmic rays. Their observation, or non-observation, may constrain astrophysical scenarios for the origin of UHECRs and help to understand the nature of the flux suppression observed by several experiments at energies above $10^{19.5}$ eV. While the interaction length of UHE photons above $10^{17}$ eV ranges from a few hundred kpc up to tens of Mpc, photons can interact with the extragalactic background radiation, initiating the development of electromagnetic cascades which affect the fluxes of photons observed at Earth. The interpretation of the current experimental results relies on simulations of UHE photon propagation. In this paper, we present the novel Monte Carlo code EleCa to simulate the $Ele$ctromagnetic $Ca$scading initiated by high-energy photons and electrons. We provide an estimation of the surviving probability for photons inducing electromagnetic cascades as a fu...

  19. High order symplectic conservative perturbation method for time-varying Hamiltonian system

    Institute of Scientific and Technical Information of China (English)

    Ming-Hui Fu; Ke-Lang Lu; Lin-Hua Lan

    2012-01-01

    This paper presents a high order symplectic conservative perturbation method for linear time-varying Hamiltonian systems. Firstly, the dynamic equation of the Hamiltonian system is gradually transformed into a high order perturbation equation, which is solved approximately by resolving the Hamiltonian coefficient matrix into a "major component" and a "high order small quantity" and applying a perturbation transformation technique; the solution to the original equation of the Hamiltonian system is then recovered through a series of inverse transforms. Because the transfer matrix determined by this method is the product of a series of exponential matrices, the transfer matrix is a symplectic matrix; furthermore, the exponential matrices can be calculated accurately by the precise time integration method, so the proposed method offers good accuracy, efficiency and stability. The examples show that the proposed method gives good results even when a large time step is selected, and with increasing perturbation order the perturbation solutions tend rapidly to the exact solutions.
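
The structural property exploited above, that the exponential of a Hamiltonian matrix J·S (J the standard symplectic form, S symmetric) is a symplectic matrix, can be checked numerically. The sketch below is a generic illustration for a one-degree-of-freedom system with an arbitrary symmetric S, using a truncated Taylor series for the matrix exponential; it is not the paper's precise time integration method.

```python
def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm(A, terms=30):
    """2x2 matrix exponential via a truncated Taylor series."""
    M = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at identity
    T = [[1.0, 0.0], [0.0, 1.0]]   # current term A^k / k!
    for k in range(1, terms):
        T = matmul(T, A)
        T = [[t / k for t in row] for row in T]
        M = [[M[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return M

J = [[0.0, 1.0], [-1.0, 0.0]]   # standard symplectic form
S = [[2.0, 0.5], [0.5, 1.0]]    # arbitrary symmetric coefficient matrix
A = matmul(J, S)                # Hamiltonian matrix J*S
M = expm(A)                     # transfer matrix over unit time

# Symplecticity check: M^T J M should equal J
MT = [[M[j][i] for j in range(2)] for i in range(2)]
check = matmul(matmul(MT, J), M)
```

Because each factor in the product of exponentials passes this M^T J M = J test, the full transfer matrix does too, which is why the method conserves the symplectic structure.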

  20. The complex multidomain organization of SCO-spondin protein is highly conserved in mammals.

    Science.gov (United States)

    Meiniel, Olivier; Meiniel, Annie

    2007-02-01

    The multidomain organization of SCO-spondin protein is a special feature of the chordate phylum. This protein is expressed in the central nervous system (CNS) from the time a dorsal neural tube appears in the course of phylogenetic evolution. With the advance of systematic whole genome sequencing, we were able to determine the SCO-spondin amino acid sequence in four mammalian species using the Wise2 software. From the ClustalW alignment of bovine (Bos taurus), human (Homo sapiens), murine (Mus musculus) and rat (Rattus norvegicus) proteins, a consensus sequence for mammalian SCO-spondin was determined and further validated with the dog (Canis familiaris) SCO-spondin sequence. The analysis of this consensus sequence is consistent with a very high degree of conservation in the amino acid composition and multidomain organization of SCO-spondin in mammals. In addition, the identification of conserved domains, namely, Emilin (EMI), von Willebrand factor D (vWD), low-density lipoprotein receptor type A (LDLrA) domains, SCO repeats (SCOR), thrombospondin type 1 repeats (TSR), a coagulation factor 5/8 type C (FA5-8C) or discoidin motif and a C-terminal cystine knot (CTCK) domain, provides greater insight into the putative function of this multidomain protein. SCO-spondin belongs to the TSR superfamily given the presence of a large number of TSRs (26). A finer classification of the TSR motifs into groups 1, 2 and 3 is proposed on the basis of different cysteine patterns. Interestingly, group 2 TSRs are present in a number of CNS developmental proteins including R-spondins, F-spondins and Mindins.

  1. Space Time Codes from Permutation Codes

    CERN Document Server

    Henkel, Oliver

    2006-01-01

    A new class of space time codes with high performance is presented. The code design utilizes tailor-made permutation codes, which are known to have large minimal distances as spherical codes. A geometric connection between spherical and space time codes has been used to translate them into the final space time codes. Simulations demonstrate that the performance increases with the block length, a result that had already been conjectured in previous work. Further, the connection to permutation codes allows for moderately complex encoding/decoding algorithms.

  2. Overcoming Challenges in Engineering the Genetic Code.

    Science.gov (United States)

    Lajoie, M J; Söll, D; Church, G M

    2016-02-27

    Withstanding 3.5 billion years of genetic drift, the canonical genetic code remains such a fundamental foundation for the complexity of life that it is highly conserved across all three phylogenetic domains. Genome engineering technologies are now making it possible to rationally change the genetic code, offering resistance to viruses, genetic isolation from horizontal gene transfer, and prevention of environmental escape by genetically modified organisms. We discuss the biochemical, genetic, and technological challenges that must be overcome in order to engineer the genetic code.

  3. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    Directory of Open Access Journals (Sweden)

    Wei Feng

    2016-03-01

    Full Text Available High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera.

  4. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera.

    Science.gov (United States)

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-03-04

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera.
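
The three-element median at the core of the sorting step named above is a small, branch-light operation. This is a generic illustration of a median-of-three applied per pixel across three frames, with hypothetical pixel values; it is not the authors' hardware implementation.

```python
def median_of_three(a, b, c):
    """Return the median of three samples without sorting the triple."""
    if (a - b) * (a - c) <= 0:   # a lies between b and c
        return a
    if (b - a) * (b - c) <= 0:   # b lies between a and c
        return b
    return c

# Per-pixel temporal filtering over three coded-exposure frames
# (each inner list is one frame of hypothetical pixel values)
frames = [[10, 200], [12, 50], [11, 120]]
median_frame = [median_of_three(p, q, r) for p, q, r in zip(*frames)]
print(median_frame)  # → [11, 120]
```

Choosing the median of three temporally adjacent samples suppresses single-frame outliers cheaply, which matters when many reconstructed sub-frames are produced per captured frame.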

  5. The highly conserved MraZ protein is a transcriptional regulator in Escherichia coli

    Energy Technology Data Exchange (ETDEWEB)

    Eraso, Jesus M.; Markillie, Lye Meng; Mitchell, Hugh D.; Taylor, Ronald C.; Orr, Galya; Margolin, William

    2014-05-05

    The mraZ and mraW genes are highly conserved in bacteria, both in sequence and location at the head of the division and cell wall (dcw) gene cluster. Although MraZ has structural similarity to the AbrB transition state regulator and the MazE antitoxin, and MraW is known to methylate ribosomal RNA, mraZ and mraW null mutants have no detectable growth phenotype in any species tested to date, hampering progress in understanding their physiological role. Here we show that overproduction of Escherichia coli MraZ perturbs cell division and the cell envelope, is more lethal at high levels or in minimal growth medium, and that MraW antagonizes these effects. MraZ-GFP localizes to the nucleoid, suggesting that it binds DNA. Indeed, purified MraZ directly binds a region upstream from its own promoter containing three direct repeats to regulate its own expression and that of downstream cell division and cell wall genes. MraZ-LacZ fusions are repressed by excess MraZ but not when DNA binding by MraZ is inhibited. RNA-seq analysis indicates that MraZ is a global transcriptional regulator with numerous targets in addition to dcw genes. One of these targets, mioC, is directly bound by MraZ in a region with three direct repeats.

  6. HyCFS, a high-resolution shock capturing code for numerical simulation on hybrid computational clusters

    Science.gov (United States)

    Shershnev, Anton A.; Kudryavtsev, Alexey N.; Kashkovsky, Alexander V.; Khotyanovsky, Dmitry V.

    2016-10-01

    The present paper describes HyCFS code, developed for numerical simulation of compressible high-speed flows on hybrid CPU/GPU (Central Processing Unit / Graphical Processing Unit) computational clusters on the basis of full unsteady Navier-Stokes equations, using modern shock capturing high-order TVD (Total Variation Diminishing) and WENO (Weighted Essentially Non-Oscillatory) schemes on general curvilinear structured grids. We discuss the specific features of hybrid architecture and details of program implementation and present the results of code verification.

  7. Swertia chirayta, a Threatened High-Value Medicinal Herb: Microhabitats and Conservation Challenges in Sikkim Himalaya, India

    Directory of Open Access Journals (Sweden)

    Bharat Kumar Pradhan

    2015-11-01

    Full Text Available Assessing the impact of threats, identifying favorable growing conditions, and predicting future population scenarios are vital for the conservation and management of threatened species. This study investigated the availability, microhabitat characteristics, threat status, and community associations of Swertia chirayta, a highly threatened Himalayan medicinal herb, in 22 populations in Sikkim, India, using the vertical belt transect method. Of the 14 microhabitats identified, open grassy slope emerged as the most favorable and wet grassy slope as the least favorable for S. chirayta. The species was dominant in 8 of the 10 major plant communities identified. Among 9 major types of disturbance identified, human movement and collection of non-timber forest products appeared as the biggest threats to S. chirayta. Disturbances significantly affected the availability of the species. S. chirayta, though under high anthropogenic threat, maintains high microhabitat pliability, which is vital for its conservation and management, provided immediate conservation measures are taken.

  8. 7 CFR 12.23 - Conservation plans and conservation systems.

    Science.gov (United States)

    2010-01-01

    7 CFR 12.23 (2010), Agriculture, Office of the Secretary of Agriculture, Highly Erodible Land and Wetland Conservation, Highly Erodible Land Conservation, § 12.23 Conservation plans and conservation systems. (a) Use...

  9. Development of a shock noise prediction code for high-speed helicopters - The subsonically moving shock

    Science.gov (United States)

    Tadghighi, H.; Holz, R.; Farassat, F.; Lee, Yung-Jang

    1991-01-01

    A previously defined airfoil subsonic shock-noise prediction formula whose result depends on a mapping of the time-dependent shock surface to a time-independent computational domain is presently coded and incorporated in the NASA-Langley rotor-noise prediction code, WOPWOP. The structure and algorithms used in the shock-noise prediction code are presented; special care has been taken to reduce computation time while maintaining accuracy. Numerical examples of shock-noise prediction are presented for hover and forward flight. It is confirmed that shock noise is an important component of the quadrupole source.

  10. Investigations of high-speed optical transmission systems employing Absolute Added Correlative Coding (AACC)

    Science.gov (United States)

    Dong-Nhat, Nguyen; Elsherif, Mohamed A.; Malekmohammadi, Amin

    2016-07-01

    A novel multilevel modulation format based on partial-response signaling, called Absolute Added Correlative Coding (AACC), is proposed and numerically demonstrated for high-speed fiber-optic communication systems. A bit error rate (BER) estimation model for the proposed multilevel format has also been developed. The performance of AACC is examined and compared against other prevailing on-off-keying and multilevel modulation formats, e.g., non-return-to-zero (NRZ), 50% return-to-zero (RZ), 67% carrier-suppressed return-to-zero (CS-RZ), duobinary and four-level pulse-amplitude modulation (4-PAM), in terms of receiver sensitivity, spectral efficiency and dispersion tolerance. The calculated receiver sensitivity at a BER of 10^-9 and the chromatic dispersion tolerance of the proposed system are ~-28.3 dBm and ~336 ps/nm, respectively. AACC shows a 7.8 dB improvement in receiver sensitivity compared to 4-PAM in the back-to-back scenario. The comparison results also show a clear advantage of AACC in achieving longer fiber transmission distances due to its higher dispersion tolerance in optical access networks.

  11. CATARACT: Computer code for improving power calculations at NREL's high-flux solar furnace

    Science.gov (United States)

    Scholl, K.; Bingham, C.; Lewandowski, A.

    1994-01-01

    The High-Flux Solar Furnace (HFSF), operated by the National Renewable Energy Laboratory, uses a camera-based, flux-mapping system to analyze the distribution and to determine total power at the focal point. The flux-mapping system consists of a diffusively reflecting plate with seven circular foil calorimeters, a charge-coupled device (CCD) camera, an IBM-compatible personal computer with a frame-grabber board, and commercial image analysis software. The calorimeters provide flux readings that are used to scale the image captured from the plate by the camera. The image analysis software can estimate total power incident on the plate by integrating under the 3-dimensional image. Because of the physical layout of the HFSF, the camera is positioned at a 20° angle to the flux-mapping plate normal. The foreshortening of the captured images that results represents a systematic error in the power calculations because the software incorrectly assumes the image is parallel to the camera's array. We have written a FORTRAN computer program called CATARACT (camera/target angle correction) that we use to transform the original flux-mapper image to a plane that is normal to the camera's optical axis. A description of the code and the results of experiments performed to verify it are presented. Also presented are comparisons of the total power available from the HFSF as determined from the flux mapping system and theoretical considerations.
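
The systematic error from viewing the target plate off-normal can be illustrated with a simple cosine area correction: foreshortening compresses one image axis by cos(angle), so integrating under the foreshortened image underestimates the power by the same factor. This is a simplified geometric model for illustration only, not the CATARACT transform itself, which remaps the full image to a plane normal to the camera axis; the power value is hypothetical.

```python
import math

def corrected_power(apparent_power, view_angle_deg):
    """Scale power integrated over a foreshortened image back toward
    the true plate plane, assuming pure cosine foreshortening of one
    image axis at the given off-normal viewing angle."""
    return apparent_power / math.cos(math.radians(view_angle_deg))

# Hypothetical integrated power from the flux-mapper image, in watts
apparent = 9400.0
true_power = corrected_power(apparent, 20.0)   # ~6.4% larger at 20°
```

At a 20° viewing angle the correction is modest but systematic, which is exactly why an uncorrected integration biases every power measurement in the same direction.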

  12. Additions and improvements to the high energy density physics capabilities in the FLASH code

    Science.gov (United States)

    Lamb, D. Q.; Flocke, N.; Graziani, C.; Tzeferacos, P.; Weide, K.

    2016-10-01

    FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities have been added to FLASH to make it an open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. In particular, we showcase the ability of FLASH to simulate the Faraday Rotation Measure produced by the presence of magnetic fields; and proton radiography, proton self-emission, and Thomson scattering diagnostics with and without the presence of magnetic fields. We also describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under Grant PHY-0903997.

  13. HIGHLY SECURE KEY PREDISTRIBUTION USING AFFINE PLANES AND REED MULLER CODES IN WIRELESS SENSOR NETWORKS

    Directory of Open Access Journals (Sweden)

    Pinaki Sarkar

    2011-10-01

    Full Text Available Wireless Sensor Networks (WSNs) consist of low-powered and resource-constrained sensor nodes which are left unattended for long durations of time. Hence it is very challenging to design and implement cost-effective security protocols for such networks, and symmetric key cryptographic techniques are preferred over public key techniques for communication in such scenarios. Prior to deployment, keys are usually predistributed into the nodes, and this problem has been well studied. Highlighting that connectivity and communication are two separate aspects of a WSN, we propose a secure connectivity model using Reed Muller codes. The model is then utilized to securely establish communication keys and exchange messages in a WSN designed on the basis of a scheme that uses affine planes for key predistribution. Through the introduction of the connectivity model, the node identifiers (ids) are converted from public to private information for each node. These private node ids can be used to generate new communication keys from old ones by applying cryptographic hash functions. The novel combination of these ideas yields a highly resilient communication model with full connectivity between nodes.
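
The key-refresh step described above, deriving a new communication key from an old key and a private node id with a cryptographic hash, can be sketched generically. The node id and key bytes below are hypothetical, and this sketch shows only the hash-based derivation, not the paper's Reed Muller connectivity model or affine-plane predistribution.

```python
import hashlib

def derive_key(old_key: bytes, private_node_id: bytes) -> bytes:
    """Derive a fresh communication key by hashing the old key together
    with the node's private identifier. The hash is one-way, so a
    captured new key does not reveal the old key or the private id."""
    return hashlib.sha256(old_key + private_node_id).digest()

# Hypothetical predistributed key and private node identifier
old_key = bytes.fromhex("00112233445566778899aabbccddeeff")
node_id = b"node-42"
new_key = derive_key(old_key, node_id)
```

Because the private id enters the hash, only nodes that know it can compute the same refreshed key, which is the resilience property the abstract claims.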

  14. HERMES: a Monte Carlo Code for the Propagation of Ultra-High Energy Nuclei

    CERN Document Server

    De Domenico, Manlio; Settimo, Mariangela

    2013-01-01

    Despite recent experimental efforts to improve the observation of Ultra-High Energy Cosmic Rays (UHECRs) above $10^{18}$ eV, the origin and the composition of such particles are still unknown. In this work, we present the novel Monte Carlo code HERMES, which simulates the propagation of UHE nuclei in the energy range between $10^{16}$ and $10^{22}$ eV, accounting for propagation in the intervening extragalactic and Galactic magnetic fields and nuclear interactions with relic photons of the extragalactic background radiation. In order to show the potential applications of HERMES for astroparticle studies, we estimate the expected flux of UHE nuclei in different astrophysical scenarios and the GZK horizons, and we show the expected arrival direction distributions in the presence of turbulent extragalactic magnetic fields. A stable version of HERMES will be released in the near future for public use, together with libraries of already propagated nuclei, to allow the community to perform mass composition and energy sp...

  15. Development of Safety Analysis Codes and Experimental Validation for a Very High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang, H. Oh, PhD; Cliff Davis; Richard Moore

    2004-11-01

    The very high temperature gas-cooled reactors (VHTGRs) are those concepts that have average coolant temperatures above 900 degrees C or operational fuel temperatures above 1250 degrees C. These concepts provide the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation and nuclear hydrogen generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as desalination and cogeneration, the VHTGR's higher temperatures are suitable for particular applications such as thermochemical hydrogen production. However, the high temperature operation can be detrimental to safety following a loss-of-coolant accident (LOCA) initiated by pipe breaks caused by seismic or other events. Following the loss of coolant through the break and coolant depressurization, air from the containment will enter the core by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structures and fuel. The oxidation will release heat and accelerate the heatup of the reactor core. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. The Idaho National Engineering and Environmental Laboratory (INEEL) has investigated this event for the past three years for the HTGR. However, neither the computer codes used nor, in fact, any of the world's computer codes have been sufficiently developed and validated to reliably predict this event. New code development, improvement of the existing codes, and experimental validation are imperative to narrow the uncertainty in the predictions of this type of accident. The objectives of this Korean/United States collaboration are to develop advanced computational methods for VHTGR safety analysis codes and to validate these computer codes.

  16. Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code

    Energy Technology Data Exchange (ETDEWEB)

    Panettieri, Vanessa [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Duch, Maria Amor [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Jornet, Nuria [Servei de Radiofisica i Radioproteccio, Hospital de la Santa Creu i San Pau Sant Antoni Maria Claret 167, 08025 Barcelona (Spain); Ginjaume, Merce [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Carrasco, Pablo [Servei de Radiofisica i Radioproteccio, Hospital de la Santa Creu i San Pau Sant Antoni Maria Claret 167, 08025 Barcelona (Spain); Badal, Andreu [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Ortega, Xavier [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Ribas, Montserrat [Servei de Radiofisica i Radioproteccio, Hospital de la Santa Creu i San Pau Sant Antoni Maria Claret 167, 08025 Barcelona (Spain)

    2007-01-07

    The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson and Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm² and a thickness of 0.5 μm which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water(TM) build-up caps, together with the orientation of the detector have been investigated for the specific application of MOSFET detectors for entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water(TM) cap and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. 
All the results have shown that the PENELOPE code system

  17. Streamlining and core genome conservation among highly divergent members of the SAR11 clade.

    Science.gov (United States)

    Grote, Jana; Thrash, J Cameron; Huggett, Megan J; Landry, Zachary C; Carini, Paul; Giovannoni, Stephen J; Rappé, Michael S

    2012-01-01

    SAR11 is an ancient and diverse clade of heterotrophic bacteria that are abundant throughout the world's oceans, where they play a major role in the ocean carbon cycle. Correlations between the phylogenetic branching order and spatiotemporal patterns in cell distributions from planktonic ocean environments indicate that SAR11 has evolved into perhaps a dozen or more specialized ecotypes that span evolutionary distances equivalent to a bacterial order. We isolated and sequenced genomes from diverse SAR11 cultures that represent three major lineages and encompass the full breadth of the clade. The new data expand observations about genome evolution and gene content that previously had been restricted to the SAR11 Ia subclade, providing a much broader perspective on the clade's origins, evolution, and ecology. We found small genomes throughout the clade and a very high proportion of core genome genes (48 to 56%), indicating that small genome size is probably an ancestral characteristic. In their level of core genome conservation, the members of SAR11 are outliers, the most conserved free-living bacteria known. Shared features of the clade include low GC content, high gene synteny, a large hypervariable region bounded by rRNA genes, and low numbers of paralogs. Variation among the genomes included genes for phosphorus metabolism, glycolysis, and C1 metabolism, suggesting that adaptive specialization in nutrient resource utilization is important to niche partitioning and ecotype divergence within the clade. These data provide support for the conclusion that streamlining selection for efficient cell replication in the planktonic habitat has occurred throughout the evolution and diversification of this clade. IMPORTANCE The SAR11 clade is the most abundant group of marine microorganisms worldwide, making them key players in the global carbon cycle. Growing knowledge about their biochemistry and metabolism is leading to a more mechanistic understanding of organic carbon

  18. A2 gene of Old World cutaneous Leishmania is a single highly conserved functional gene

    Directory of Open Access Journals (Sweden)

    Derouin Francis

    2005-03-01

    Full Text Available Abstract Background Leishmaniases are among the most proteiform parasitic infections in humans, ranging from unapparent to cutaneous, mucocutaneous or visceral diseases. The various clinical issues depend on complex and still poorly understood mechanisms in which both host and parasite factors interact. Among the candidate factors of parasite virulence are the A2 genes, a family of multiple genes that are developmentally expressed in species of the Leishmania donovani group responsible for visceral diseases (VL). By contrast, in L. major, which determines cutaneous infections (CL), we showed that A2 genes are present in a truncated form only. Furthermore, the A2 genomic sequences of L. major were subsequently considered to represent non-expressed pseudogenes [1]. Consequently, it was suggested that the structural and functional properties of A2 genes could play a role in the differential tropism of CL and VL leishmanias. On this basis, it was of importance to determine whether the observed structural/functional particularities of the L. major A2 genes were shared by other CL Leishmania, therefore representing a proper characteristic of CL A2 genes as opposed to those of VL isolates. Methods In the present study we amplified by PCR and sequenced the A2 genes from genomic DNA and from clonal libraries of the four Old World CL species, compared to a clonal population of L. infantum VL parasites. Using RT-PCR we also amplified and sequenced A2 mRNA transcripts from L. major. Results A unique A2 sequence was identified in Old World cutaneous Leishmania by sequencing. The shared sequence was highly conserved among the various CL strains and species analysed, showing a single polymorphism C/G at position 58. The CL A2 gene was found to be functionally transcribed at both parasite stages. Conclusion The present study shows that cutaneous strains of leishmania share a conserved functional A2 gene. 
As opposed to the multiple A2 genes described in VL isolates

  19. Monitoring adherence to the international code of conduct: highly hazardous pesticides in central Andean agriculture and farmers' rights to health.

    Science.gov (United States)

    Orozco, Fadya A; Cole, Donald C; Forbes, Greg; Kroschel, Jürgen; Wanigaratne, Susitha; Arica, Denis

    2009-01-01

    The WHO has advocated monitoring adherence to the Food and Agriculture Organization's Code of Conduct to reduce use of highly hazardous pesticides in lower and middle income countries. We re-framed Code articles in terms of farmers' rights and drew on survey data, farmer focus group results, and direct observations of agrochemical stores in Ecuador and Peru to construct indicators reflecting respect for such rights. Use of highly (Ia and Ib) and moderately (II) hazardous pesticides was common. Worse indicators were observed in places with lower education, greater poverty, and more use of indigenous languages. Limited government enforcement capacity, social irresponsibility of the pesticide industry, and lack of farmers' knowledge of the Code were all factors impeding respect for farmers' rights. Addressing the power imbalance among social actors requires informed farmer and farmworker participation in monitoring adherence and active involvement of non-governmental organizations and municipal governments.

  20. High-dynamic range compressive spectral imaging by grayscale coded aperture adaptive filtering

    Directory of Open Access Journals (Sweden)

    Nelson Eduardo Diaz

    2015-12-01

    Full Text Available The coded aperture snapshot spectral imaging system (CASSI) is an imaging architecture which senses the three-dimensional information of a scene with two-dimensional (2D) focal plane array (FPA) coded projection measurements. A reconstruction algorithm takes advantage of the sparsity of the compressive measurements to recover the underlying 3D data cube. Traditionally, CASSI uses block-unblock coded apertures (BCA) to spatially modulate the light. In CASSI the quality of the reconstructed images depends on the design of these coded apertures and the FPA dynamic range. This work presents a new CASSI architecture based on grayscale coded apertures (GCA) which reduce FPA saturation and increase the dynamic range of the reconstructed images. The set of GCA is calculated in a real-time adaptive manner, exploiting the information from the FPA compressive measurements. Extensive simulations show the improvement attained in the quality of the reconstructed images when GCA are employed. In addition, traditional coded apertures and GCA are compared with respect to noise tolerance.
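
    The forward model behind this architecture can be sketched in a few lines: each spectral band is modulated by the coded aperture, sheared by the dispersive element, and integrated on the FPA. The scene size, code values, and one-pixel-per-band dispersion below are illustrative assumptions of this sketch, not the paper's experimental parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 3D scene: M x N spatial pixels, L spectral bands, values in [0, 1).
    M, N, L = 32, 32, 8
    cube = rng.random((M, N, L))

    def cassi_measure(cube, code):
        """Single-shot CASSI forward model: every band is multiplied by the
        coded aperture, shifted one pixel per band (dispersion), and summed
        on the detector."""
        M, N, L = cube.shape
        fpa = np.zeros((M, N + L - 1))
        for k in range(L):
            fpa[:, k:k + N] += cube[:, :, k] * code
        return fpa

    binary_code = rng.integers(0, 2, size=(M, N)).astype(float)  # block-unblock (BCA)
    gray_code = 0.5 * rng.random((M, N))                         # grayscale (GCA), lower transmittance

    y_bca = cassi_measure(cube, binary_code)
    y_gca = cassi_measure(cube, gray_code)
    ```

    Because the grayscale entries attenuate rather than fully pass each pixel, the peak FPA value of `y_gca` stays below that of `y_bca`, which is the saturation-reduction effect the record describes.
    
    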

  1. Optical noise-free image encryption based on quick response code and high dimension chaotic system in gyrator transform domain

    Science.gov (United States)

    Sui, Liansheng; Xu, Minjie; Tian, Ailing

    2017-04-01

    A novel optical image encryption scheme is proposed based on a quick response code and a high-dimension chaotic system, where only the intensity distribution of the encoded information is recorded as the ciphertext. Initially, the quick response code is generated from the plain image and placed in the input plane of the double random phase encoding architecture. Then, the code is encrypted to a ciphertext with a noise-like distribution by using two cascaded gyrator transforms. In the process of encryption, parameters such as the rotation angles and random phase masks are generated as interim variables and functions based on the Chen system. A new phase retrieval algorithm is designed to reconstruct the initial quick response code in the process of decryption, in which a priori information such as the three position detection patterns is used as the support constraint. The original image can be obtained without any energy loss by scanning the decrypted code with mobile devices. The ciphertext image is a real-valued function, which is more convenient for storage and transmission. Meanwhile, the security of the proposed scheme is greatly enhanced due to the high sensitivity to the initial values of the Chen system. Extensive cryptanalysis and simulation have been performed to demonstrate the feasibility and effectiveness of the proposed scheme.
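
    As an illustration of how a chaotic system can drive mask generation, here is a minimal sketch that integrates the Chen system (standard chaotic parameters a = 35, b = 3, c = 28) with a plain Euler step and maps the trajectory onto a phase mask. The step size, burn-in, and fractional-digit mapping are assumptions of this sketch, not the paper's method.

    ```python
    import numpy as np

    def chen_sequence(n, x0=1.0, y0=1.0, z0=1.0, dt=0.001, burn_in=5000):
        """Euler integration of the Chen system; returns n samples of x
        after discarding a transient."""
        a, b, c = 35.0, 3.0, 28.0
        x, y, z = x0, y0, z0
        out = np.empty(n)
        for i in range(burn_in + n):
            dx = a * (y - x)
            dy = (c - a) * x - x * z + c * y
            dz = x * y - b * z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            if i >= burn_in:
                out[i - burn_in] = x
        return out

    def phase_mask(shape, **kwargs):
        """Map the chaotic sequence onto a random phase mask in [0, 2*pi)."""
        seq = chen_sequence(shape[0] * shape[1], **kwargs)
        frac = np.abs(seq * 1e4) % 1.0   # keep low-order digits for uniformity
        return (2 * np.pi * frac).reshape(shape)

    mask = phase_mask((64, 64))
    ```

    The key-sensitivity property the record mentions follows directly: perturbing the initial value `x0` by even a tiny amount yields a completely different mask after the transient.
    
    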

  2. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band.

    Science.gov (United States)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta; Yu, Xianbin; Ukhanova, Anna; Llorente, Roberto; Monroy, Idelfonso Tafur; Forchhammer, Søren

    2011-12-12

    The paper addresses the problem of distributing high-definition video over fiber-wireless networks. A physical layer architecture with a low-complexity envelope detection solution is investigated. We present both experimental studies and simulation of high-quality, high-definition compressed video transmission over a 60 GHz fiber-wireless link. Using advanced video coding, we satisfy the low-complexity and low-delay constraints while preserving superb video quality over a significantly extended wireless distance.

  3. Spatially conserved regulatory elements identified within human and mouse Cd247 gene using high-throughput sequencing data from the ENCODE project

    DEFF Research Database (Denmark)

    Pundhir, Sachin; Hannibal, Tine Dahlbæk; Bang-Berthelsen, Claus Heiner

    2014-01-01

    , supported by histone marks and ChIP-seq data, that specifically have features of an enhancer and a promoter, respectively. We also identified a putative long non-coding RNA from the characteristically long first intron of the Cd247 gene. The long non-coding RNA annotation is supported by manual annotations...... from the GENCODE project in human and our expression quantification analysis performed in NOD and B6 mice using qRT-PCR. Furthermore, 17 of the 23 SNPs already known to be implicated with T1D were observed within the long non-coding RNA region in mouse. The spatially conserved regulatory elements...

  4. Highly conserved D-loop-like nuclear mitochondrial sequences (Numts) in tiger (Panthera tigris)

    Indian Academy of Sciences (India)

    Wenping Zhang; Zhihe Zhang; Fujun Shen; Rong Hou; Xiaoping Lv; Bisong Yue

    2006-08-01

    Using oligonucleotide primers designed to match hypervariable segment I (HVS-1) of Panthera tigris mitochondrial DNA (mtDNA), we amplified two different PCR products (500 bp and 287 bp) in the tiger (Panthera tigris), but only one PCR product (287 bp) in the leopard (Panthera pardus). Sequence analyses indicated that the 287-bp sequence was a D-loop-like nuclear mitochondrial sequence (Numt), reflecting a nuclear transfer that occurred approximately 4.8–17 million years ago in the tiger and 4.6–16 million years ago in the leopard. Although the mtDNA D-loop sequence has a rapid rate of evolution, the 287-bp Numts are highly conserved; they are nearly identical among tiger subspecies and differ by only 1.742% between tiger and leopard. Thus, such sequences represent molecular ‘fossils’ that can shed light on the evolution of the mitochondrial genome and may be the most appropriate outgroup for phylogenetic analysis. This is further supported by comparing the phylogenetic trees reconstructed using the D-loop sequence of the snow leopard and the 287-bp Numts as outgroups.
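
    Divergence-based dating of the kind used here rests on a simple molecular-clock relation, t = K/(2r), where K is the pairwise divergence and r the per-lineage substitution rate. The sketch below applies it to the 1.742% tiger-leopard Numt divergence reported in the record, with an assumed illustrative nuclear rate; since the rate is not taken from the paper, the resulting number is not the authors' calibrated estimate.

    ```python
    def divergence_time_myr(pairwise_divergence, rate_per_site_per_myr):
        """Naive molecular-clock estimate: t = K / (2r). The factor of 2
        reflects that both lineages accumulate substitutions independently
        since their split."""
        return pairwise_divergence / (2.0 * rate_per_site_per_myr)

    # 1.742% tiger-leopard Numt divergence (from the record) combined with an
    # assumed nuclear rate of 2.2e-3 substitutions/site/Myr for illustration.
    t = divergence_time_myr(0.01742, 2.2e-3)   # about 3.96 Myr under these assumptions
    ```
    
    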

  5. High Throughput Sequencing of T Cell Antigen Receptors Reveals a Conserved TCR Repertoire

    Science.gov (United States)

    Hou, Xianliang; Lu, Chong; Chen, Sisi; Xie, Qian; Cui, Guangying; Chen, Jianing; Chen, Zhi; Wu, Zhongwen; Ding, Yulong; Ye, Ping; Dai, Yong; Diao, Hongyan

    2016-01-01

    Abstract The T-cell receptor (TCR) repertoire is a mirror of the human immune system that reflects processes caused by infections, cancer, autoimmunity, and aging. Next-generation sequencing has become a powerful tool for deep TCR profiling. Herein, we used this technology to study the repertoire features of TCR beta chain in the blood of healthy individuals. Peripheral blood samples were collected from 10 healthy donors. T cells were isolated with anti-human CD3 magnetic beads according to the manufacturer's protocol. We then combined multiplex-PCR, Illumina sequencing, and IMGT/High V-QUEST to analyze the characteristics and polymorphisms of the TCR. Most of the individual T cell clones were present at very low frequencies, suggesting that they had not undergone clonal expansion. The usage frequencies of the TCR beta variable, beta joining, and beta diversity gene segments were similar among T cells from different individuals. Notably, the usage frequency of individual nucleotides and amino acids within complementarity-determining region (CDR3) intervals was remarkably consistent between individuals. Moreover, our data show that terminal deoxynucleotidyl transferase activity was biased toward the insertion of G (31.92%) and C (27.14%) over A (21.82%) and T (19.12%) nucleotides. Some conserved features could be observed in the composition of CDR3, which may inform future studies of human TCR gene recombination. PMID:26962778

  6. Homoeologous chromosomes of Xenopus laevis are highly conserved after whole-genome duplication.

    Science.gov (United States)

    Uno, Y; Nishida, C; Takagi, C; Ueno, N; Matsuda, Y

    2013-11-01

    It has been suggested that whole-genome duplication (WGD) occurred twice during the evolutionary process of vertebrates, around 450 and 500 million years ago, contributing to an increase in the genomic and phenotypic complexity of vertebrates. However, little is known about the evolutionary process of homoeologous chromosomes after WGD because many duplicated genes have been lost. Xenopus laevis (2n=36) and Xenopus (Silurana) tropicalis (2n=20) are therefore good animal models for studying the process of genomic and chromosomal reorganization after WGD, because X. laevis is an allotetraploid species that resulted from WGD after the interspecific hybridization of diploid species closely related to X. tropicalis. We constructed a comparative cytogenetic map of X. laevis using 60 complementary DNA clones that covered the entire chromosomal regions of the 10 pairs of X. tropicalis chromosomes. We consequently identified all nine homoeologous chromosome groups of X. laevis. Hybridization signals on two pairs of X. laevis homoeologous chromosomes were detected for 50 of 60 (83%) genes, and genetic linkage is highly conserved between X. tropicalis and X. laevis chromosomes, except for one fusion and one inversion, and between X. laevis homoeologous chromosomes, except for two inversions. These results indicate that the loss of duplicated genes and inter- and/or intrachromosomal rearrangements occurred much less frequently in this lineage, suggesting that these events were not essential for diploidization of the allotetraploid genome of X. laevis after WGD.

  7. Specific binding of eukaryotic ORC to DNA replication origins depends on highly conserved basic residues.

    Science.gov (United States)

    Kawakami, Hironori; Ohashi, Eiji; Kanamoto, Shota; Tsurimoto, Toshiki; Katayama, Tsutomu

    2015-10-12

    In eukaryotes, the origin recognition complex (ORC) heterohexamer preferentially binds replication origins to trigger initiation of DNA replication. Crystallographic studies using eubacterial and archaeal ORC orthologs suggested that eukaryotic ORC may bind to origin DNA via putative winged-helix DNA-binding domains and AAA+ ATPase domains. However, the mechanisms by which eukaryotic ORC recognizes origin DNA remain elusive. Here, we show in budding yeast that Lys-362 and Arg-367 of the largest subunit (Orc1), both outside the aforementioned domains, are crucial for specific binding of ORC to origin DNA. These basic residues, which reside in a putative disordered domain, were dispensable for interaction with ATP and non-specific DNA sequences, suggesting a specific role in recognition. Consistent with this, both residues were required for origin binding of Orc1 in vivo. A truncated Orc1 polypeptide containing these residues alone recognizes the ARS sequence with low affinity, and the Arg-367 residue stimulates the sequence-specific binding mode of the polypeptide. Lys-362 and Arg-367 of Orc1 are highly conserved among eukaryotic ORCs, but not in eubacterial and archaeal orthologs, suggesting a eukaryote-specific mechanism underlying recognition of replication origins by ORC.

  8. A Highly Conserved Bacterial D-Serine Uptake System Links Host Metabolism and Virulence.

    Science.gov (United States)

    Connolly, James P R; Gabrielsen, Mads; Goldstone, Robert J; Grinter, Rhys; Wang, Dai; Cogdell, Richard J; Walker, Daniel; Smith, David G E; Roe, Andrew J

    2016-01-01

    The ability of any organism to sense and respond to challenges presented in the environment is critically important for promoting or restricting colonization of specific sites. Recent work has demonstrated that the host metabolite D-serine has the ability to markedly influence the outcome of infection by repressing the type III secretion system of enterohaemorrhagic Escherichia coli (EHEC) in a concentration-dependent manner. However, exactly how EHEC monitors environmental D-serine is not understood. In this work, we have identified two highly conserved members of the E. coli core genome, encoding an inner membrane transporter and a transcriptional regulator, which collectively help to "sense" levels of D-serine by regulating its uptake from the environment and in turn influencing global gene expression. Both proteins are required for full expression of the type III secretion system and diversely regulated prophage-encoded effector proteins demonstrating an important infection-relevant adaptation of the core genome. We propose that this system acts as a key safety net, sampling the environment for this metabolite, thereby promoting colonization of EHEC to favorable sites within the host.

  9. A Highly Conserved Bacterial D-Serine Uptake System Links Host Metabolism and Virulence.

    Directory of Open Access Journals (Sweden)

    James P R Connolly

    2016-01-01

    Full Text Available The ability of any organism to sense and respond to challenges presented in the environment is critically important for promoting or restricting colonization of specific sites. Recent work has demonstrated that the host metabolite D-serine has the ability to markedly influence the outcome of infection by repressing the type III secretion system of enterohaemorrhagic Escherichia coli (EHEC in a concentration-dependent manner. However, exactly how EHEC monitors environmental D-serine is not understood. In this work, we have identified two highly conserved members of the E. coli core genome, encoding an inner membrane transporter and a transcriptional regulator, which collectively help to "sense" levels of D-serine by regulating its uptake from the environment and in turn influencing global gene expression. Both proteins are required for full expression of the type III secretion system and diversely regulated prophage-encoded effector proteins demonstrating an important infection-relevant adaptation of the core genome. We propose that this system acts as a key safety net, sampling the environment for this metabolite, thereby promoting colonization of EHEC to favorable sites within the host.

  10. A highly conserved metalloprotease effector enhances virulence in the maize anthracnose fungus Colletotrichum graminicola.

    Science.gov (United States)

    Sanz-Martín, José M; Pacheco-Arjona, José Ramón; Bello-Rico, Víctor; Vargas, Walter A; Monod, Michel; Díaz-Mínguez, José M; Thon, Michael R; Sukno, Serenella A

    2016-09-01

    Colletotrichum graminicola causes maize anthracnose, an agronomically important disease with a worldwide distribution. We have identified a fungalysin metalloprotease (Cgfl) with a role in virulence. Transcriptional profiling experiments and live cell imaging show that Cgfl is specifically expressed during the biotrophic stage of infection. To determine whether Cgfl has a role in virulence, we obtained null mutants lacking Cgfl and performed pathogenicity and live microscopy assays. The appressorium morphology of the null mutants is normal, but they exhibit delayed development during the infection process on maize leaves and roots, showing that Cgfl has a role in virulence. In vitro chitinase activity assays of leaves infected with wild-type and null mutant strains show that, in the absence of Cgfl, maize leaves exhibit increased chitinase activity. Phylogenetic analyses show that Cgfl is highly conserved in fungi. Similarity searches, phylogenetic analysis and transcriptional profiling show that C. graminicola encodes two LysM domain-containing homologues of Ecp6, suggesting that this fungus employs both Cgfl-mediated and LysM protein-mediated strategies to control chitin signalling. © 2015 BSPP and John Wiley & Sons Ltd.

  11. A dominant EV71-specific CD4+ T cell epitope is highly conserved among human enteroviruses.

    Directory of Open Access Journals (Sweden)

    Ruicheng Wei

    Full Text Available CD4+ T cell-mediated immunity plays a central role in determining the immunopathogenesis of viral infections. However, the role of CD4+ T cells in EV71 infection, which causes hand, foot and mouth disease (HFMD), has yet to be elucidated. We applied a sophisticated method to identify promiscuous CD4+ T cell epitopes contained within the sequence of the EV71 polyprotein. Fifteen epitopes were identified, three of them dominant. The most dominant epitope is highly conserved among enterovirus species, including HFMD-related coxsackieviruses, HFMD-unrelated echoviruses and polioviruses. Furthermore, CD4+ T cells specific to this epitope indeed cross-reacted with the homolog of poliovirus 3 Sabin. Our findings imply that CD4+ T cell responses to poliovirus following vaccination, or to other enteroviruses to which individuals may be exposed in early childhood, may have a modulating effect on the subsequent CD4+ T cell response to EV71 infection or vaccination.

  12. High Scales On the Strong Vocational Interest Blank and the Kuder Occupational Interest Survey Using Holland's Occupational Codes

    Science.gov (United States)

    Westbrook, Franklin D.

    1975-01-01

    The study compared the arrays of high-interest occupations produced by the Strong and the Kuder. A frequency percentage count showed 85 percent of the pairs of summary codes had two identical characteristics, and some support was found for Holland's hexagon. The implications for further studies comparing the two instruments are discussed. (Author)

  13. Design of ecoregional monitoring in conservation areas of high-latitude ecosystems under contemporary climate change

    Science.gov (United States)

    Beever, Erik A.; Woodward, Andrea

    2011-01-01

    Land ownership in Alaska includes a mosaic of federally managed units. Within its agency’s context, each unit has its own management strategy, authority, and resources of conservation concern, many of which are migratory animals. Though some units are geographically isolated, many are nevertheless linked by paths of abiotic and biotic flows, such as rivers, air masses, flyways, and terrestrial and aquatic migration routes. Furthermore, individual land units exist within the context of a larger landscape pattern of shifting conditions, requiring managers to understand at larger spatial scales the status and trends in the synchrony and spatial concurrence of species and associated suitable habitats. Results of these changes will determine the ability of Alaska lands to continue to: provide habitat for local and migratory species; absorb species whose ranges are shifting northward; and experience mitigation or exacerbation of climate change through positive and negative atmospheric feedbacks. We discuss the geographic and statutory contexts that influence development of ecological monitoring; argue for the inclusion of significant amounts of broad-scale monitoring; discuss the importance of defining clear programmatic and monitoring objectives; and draw from lessons learned from existing long-term, broad-scale monitoring programs to apply to the specific contexts relevant to high-latitude protected areas such as those in Alaska. Such areas are distinguished by their: marked seasonality; relatively large magnitudes of contemporary change in climatic parameters; and relative inaccessibility due to broad spatial extent, very low (or zero) road density, and steep and glaciated areas. For ecological monitoring to effectively support management decisions in high-latitude areas such as Alaska, a monitoring program ideally would be structured to address the actual spatial and temporal scales of relevant processes, rather than the artificial boundaries of individual land

  14. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 4: Computer user's manual for UAAP turboprop aeroacoustic code

    Science.gov (United States)

    Menthe, R. W.; Mccolgan, C. J.; Ladden, R. M.

    1991-01-01

    The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single-rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three-dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The user's manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for the numerous calls made for Bessel functions and matrix inversion. For plotted output, users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.

  15. High-dimensional structured light coding/decoding for free-space optical communications free of obstructions.

    Science.gov (United States)

    Du, Jing; Wang, Jian

    2015-11-01

    Bessel beams carrying orbital angular momentum (OAM) with helical phase fronts exp(ilφ) (l = 0, ±1, ±2, …), where φ is the azimuthal angle and l corresponds to the topological number, are orthogonal to each other. This feature of Bessel beams provides a new dimension to code/decode data information on the OAM state of light, and the theoretical infinity of the topological number enables possible high-dimensional structured light coding/decoding for free-space optical communications. Moreover, Bessel beams are nondiffracting beams with the ability to recover by themselves in the face of obstructions, which is important for free-space optical communications relying on line-of-sight operation. By utilizing the OAM and nondiffracting characteristics of Bessel beams, we experimentally demonstrate 12 m distance obstruction-free optical m-ary coding/decoding using visible Bessel beams in a free-space optical communication system. We also study the bit error rate (BER) performance of hexadecimal and 32-ary coding/decoding based on Bessel beams with different topological numbers. After receiving 500 symbols at the receiver side, a zero BER of hexadecimal coding/decoding is observed when the obstruction is placed along the propagation path of light.
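
    The m-ary mapping between data symbols and OAM states can be sketched as a simple lookup: each hexadecimal symbol selects one of 16 orthogonal topological charges, so each received beam carries log2(16) = 4 bits. The particular charge set below (±1…±8, skipping l = 0) is an assumption for illustration, not the set used in the experiment.

    ```python
    # 16 distinct OAM states for hexadecimal (16-ary) coding; l = 0 is skipped
    # here as an illustrative convention.
    CHARGES = [l for l in range(-8, 9) if l != 0]

    def encode(symbols):
        """Map each 4-bit symbol (0..15) to a topological charge."""
        return [CHARGES[s] for s in symbols]

    def decode(charges):
        """Recover symbols from the detected topological charges."""
        return [CHARGES.index(l) for l in charges]

    data = [0x3, 0xA, 0xF, 0x0]
    assert decode(encode(data)) == data   # lossless round trip
    ```

    A 32-ary scheme works the same way with 32 charges, carrying 5 bits per symbol.
    
    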

  16. Two high-density recording methods with run-length limited turbo code for holographic data storage system

    Science.gov (United States)

    Nakamura, Yusuke; Hoshizawa, Taku

    2016-09-01

    Two methods for increasing the data capacity of a holographic data storage system (HDSS) were developed. The first method is called “run-length-limited (RLL) high-density recording”. An RLL modulation has the same effect as enlarging the pixel pitch; namely, it optically reduces the hologram size. Accordingly, the method doubles the raw-data recording density. The second method is called “RLL turbo signal processing”. The RLL turbo code consists of RLL(1,∞) trellis modulation and an optimized convolutional code. The remarkable point of the developed turbo code is that it employs the RLL modulator and demodulator as parts of the error-correction process. The turbo code improves the capability of error correction more than a conventional LDPC code, even though interpixel interference is generated. These two methods will increase the data density 1.78-fold. Moreover, by simulation and experiment, a data density of 2.4 Tbit/in² is confirmed.
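
    The RLL(1,∞) constraint mentioned here requires at least one 0 between consecutive 1s, with no upper bound on zero runs. The sketch below checks the constraint and verifies the standard combinatorial fact that the number of admissible length-n sequences is the Fibonacci number F(n+2), which caps the rate of any code obeying the constraint at log2 of the golden ratio, about 0.694 bits per channel bit.

    ```python
    from itertools import product

    def is_rll_1_inf(bits):
        """RLL(1, inf): no two adjacent 1s anywhere in the sequence."""
        return all(not (a == 1 and b == 1) for a, b in zip(bits, bits[1:]))

    def count_valid(n):
        """Brute-force count of admissible length-n sequences."""
        return sum(is_rll_1_inf(seq) for seq in product((0, 1), repeat=n))

    # Counts follow the Fibonacci sequence F(n+2): 2, 3, 5, 8, 13, 21, 34, ...
    counts = [count_valid(n) for n in range(1, 8)]
    ```
    
    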

  17. Development and application of a deflagration pressure analysis code for high level waste processing

    Energy Technology Data Exchange (ETDEWEB)

    Hensel, S.J.; Thomas, J.K.

    1994-06-01

    The Deflagration Pressure Analysis Code (DPAC) was developed primarily to evaluate peak pressures for deflagrations in radioactive waste storage and process facilities at the Savannah River Site (SRS). Deflagrations in these facilities are generally considered to be incredible events, but it was judged prudent to develop modeling capabilities in order to facilitate risk estimates. DPAC is essentially an engineering analysis tool, as opposed to a detailed thermal hydraulics code. It accounts for mass loss via venting, energy dissipation by radiative heat transfer, and gas PdV work. Volume increases due to vessel deformation can also be included using pressure-volume data from a structural analysis of the enclosure. This paper presents an overview of the code, benchmarking, and applications at SRS.
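
    As context for what such a code bounds, the classical closed-vessel limit is the adiabatic isochoric complete combustion (AICC) pressure, which the ideal-gas law gives as P_peak/P0 = (n_b/n_u)(T_b/T_u); a model like DPAC then reduces this bound through venting, radiative losses, and PdV work. The sketch below computes only the AICC bound, with illustrative temperatures that are assumptions, not values from the SRS analyses.

    ```python
    def aicc_peak_pressure(p0, t0, t_flame, n_ratio=1.0):
        """AICC bound for a closed rigid vessel: ideal-gas scaling of pressure
        with mole ratio (burned/unburned) and temperature ratio."""
        return p0 * n_ratio * (t_flame / t0)

    # Illustrative numbers: 1 atm and 298 K initially, 2100 K adiabatic flame
    # temperature (assumed, e.g. a lean hydrogen-air mixture).
    p_peak = aicc_peak_pressure(101.325e3, 298.0, 2100.0)   # roughly 7 atm
    ```
    
    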

  18. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    Energy Technology Data Exchange (ETDEWEB)

    Böhlen, T.T.; Cerutti, F.; Chin, M.P.W. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Fassò, A. [ELI Beamlines, Harfa Office Park Ceskomoravská 2420/15a, 190 93 Prague 9 (Czech Republic); Ferrari, A., E-mail: alfredo.ferrari@cern.ch [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Ortega, P.G. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Mairani, A. [Unità di Fisica Medica, Fondazione CNAO, I-27100 Pavia (Italy); Sala, P.R. [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy); Smirnov, G.; Vlachoudis, V. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland)

    2014-06-15

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  19. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    CERN Document Server

    Böhlen, T T; Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Mairani, A; Sala, P R; Smirnov, G; Vlachoudis, V

    2014-01-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  20. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    Science.gov (United States)

    Böhlen, T. T.; Cerutti, F.; Chin, M. P. W.; Fassò, A.; Ferrari, A.; Ortega, P. G.; Mairani, A.; Sala, P. R.; Smirnov, G.; Vlachoudis, V.

    2014-06-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  1. New developments of the CARTE thermochemical code: Calculation of detonation properties of high explosives

    Science.gov (United States)

    Dubois, Vincent; Desbiens, Nicolas; Auroux, Eric

    2010-07-01

    We present improvements to the CARTE thermochemical code, which provides thermodynamic properties and chemical compositions of CHON systems over a large range of temperature and pressure at a very small computational cost. The detonation products are split into one or two fluid phases, treated with the MCRSR equation of state (EOS), and one condensed phase of carbon, modeled with a multiphase EOS which evolves with the chemical composition of the explosive. We have developed a new optimization procedure to obtain an accurate multicomponent EOS. We show here that the results of the CARTE code are in good agreement with specific data for molecular systems and with measured detonation properties for several explosives.

  2. Spallation integral experiment analysis by high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi; Meigo, Shin-ichiro; Sasa, Toshinobu; Fukahori, Tokio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Yoshizawa, Nobuaki; Furihata, Shiori; Belyakov-Bodin, V.I.; Krupny, G.I.; Titarenko, Y.E.

    1997-03-01

    Reaction rate distributions were measured with various activation detectors on the cylindrical surface of a thick tungsten target, 20 cm in diameter and 60 cm in length, bombarded with 0.895 and 1.21 GeV protons. The experimental results were analyzed with the Monte Carlo simulation code systems NMTC/JAERI-MCNP-4A, LAHET and HERMES. It is confirmed that those code systems can reproduce the reaction rate distributions with C/E ratios of 0.6 to 1.4 at positions up to 30 cm from the beam incident surface. (author)

  3. Molecular cloning of a highly conserved mouse and human integral membrane protein (Itm1) and genetic mapping to mouse chromosome 9

    Energy Technology Data Exchange (ETDEWEB)

Hong, Guizhu; Tylzanowski, P. [Univ. of Antwerp (Belgium)]; Deleersnijder, W. [N.V. Innogenetics, Ghent (Belgium)] [and others]

    1996-02-01

We have isolated and characterized a novel cDNA coding for a highly hydrophobic protein (B5) from a fetal mouse mandibular condyle cDNA library. The full-length mouse B5 cDNA is 3095 nucleotides long and contains a potential open reading frame coding for a protein of 705 amino acids with a calculated molecular weight of 80.5 kDa. The B5 mRNA is differentially polyadenylated, with the most abundant transcript having a length of 2.7 kb. The human homolog of B5 was isolated from a testis cDNA library. The predicted amino acid sequence of the human B5 is 98.5% identical to that of mouse. The most striking feature of the B5 protein is the presence of numerous (10-14) potential transmembrane domains, characteristic of an integral membrane protein. Similarity searches in public databanks reveal that B5 is 58% similar to the T12A2.2 gene of Caenorhabditis elegans and 60% similar to the STT3 gene of Saccharomyces cerevisiae. Furthermore, the report of an EST sequence (Accession No. Z13858) related to the human B5, but identical to the STT3 gene, indicates that B5 belongs to a larger gene family coding for novel putative transmembrane proteins. This family exhibits a remarkable degree of conservation in different species. The gene for B5, designated Itm1 (Integral membrane protein 1), is located on mouse chromosome 9. 28 refs., 4 figs.
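Candidate transmembrane domains like the 10-14 predicted for B5 are commonly flagged with a Kyte-Doolittle hydropathy scan. A minimal sketch; the 19-residue window and 1.6 threshold are conventional choices for this method, not values stated in the record:

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def hydropathy_windows(seq, window=19, threshold=1.6):
    """Return (start, mean hydropathy) for every window whose mean
    exceeds the threshold, i.e. a candidate transmembrane segment."""
    hits = []
    for i in range(len(seq) - window + 1):
        mean = sum(KD[a] for a in seq[i:i + window]) / window
        if mean > threshold:
            hits.append((i, round(mean, 2)))
    return hits
```

A strongly hydrophobic stretch (e.g. a run of Ile or Val) scores well above the threshold, while charged stretches score far below it.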

  4. Structural Code Considerations for Solar Rooftop Installations.

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, Stephen F.; Dwyer, Brian P.; Sanchez, Alfred

    2014-12-01

Residential rooftop solar panel installations are limited in part by the high cost of meeting structure-related code requirements for field installation. Permitting solar installations is difficult because residential permitting authorities often believe that typical residential rooftops may be structurally inadequate to support the additional load associated with a photovoltaic (PV) solar installation. Typical engineering methods used to calculate stresses on a roof structure involve simplifying assumptions that reduce a complex, non-linear structure to a basic determinate beam. This method of analysis neglects the composite action of the entire roof structure, basing the result on a single rafter or truss top chord, and can therefore be overly conservative. A literature review was conducted to gain a better understanding of the conservative nature of the regulations and codes governing residential construction and the associated structural system calculations.
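The simplified determinate-beam check described above can be sketched directly. All numbers here (rafter size, span, load, tributary width, allowable stress) are hypothetical, illustrative values, not figures from the report:

```python
def rafter_bending_stress_psi(w_plf, span_ft, b_in, d_in):
    """Maximum bending stress (psi) of a simply supported rafter under a
    uniform load: M = w*L^2/8 (lb-ft), section modulus S = b*d^2/6 (in^3),
    stress = 12*M / S (12 converts lb-ft to lb-in)."""
    M = w_plf * span_ft ** 2 / 8.0   # max midspan moment, lb-ft
    S = b_in * d_in ** 2 / 6.0       # rectangular section modulus, in^3
    return M * 12.0 / S

# Hypothetical 2x6 rafter (actual 1.5 x 5.5 in), 12 ft span,
# 40 psf total load on a 2 ft tributary width -> 80 plf.
w = 40 * 2.0
stress = rafter_bending_stress_psi(w, 12.0, 1.5, 5.5)

# Compared against an assumed allowable bending stress (hypothetical value);
# treating the rafter in isolation is exactly the conservatism at issue.
ok = stress <= 1000.0
```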

  5. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta;

    2011-01-01

The paper addresses the problem of distribution of high-definition video over fiber-wireless networks. The physical layer architecture with a low-complexity envelope detection solution is investigated. We present both experimental studies and simulation of high-quality, high-definition compressed video transmission over a 60 GHz fiber-wireless link. Using advanced video coding, we satisfy low-complexity and low-delay constraints while preserving superb video quality over a significantly extended wireless distance. © 2011 Optical Society of America.

  6. EleCa: A Monte Carlo code for the propagation of extragalactic photons at ultra-high energy

    Energy Technology Data Exchange (ETDEWEB)

Settimo, Mariangela [University of Siegen (Germany)]; De Domenico, Manlio [Laboratory of Complex Systems, Scuola Superiore di Catania and INFN (Italy)]; Lyberis, Haris [Federal University of Rio de Janeiro (Brazil)]

    2013-06-15

Ultra-high-energy photons, above 10^17–10^18 eV, can interact with the extragalactic background radiation, leading to the development of electromagnetic cascades. A Monte Carlo code to simulate the electromagnetic cascades initiated by high-energy photons and electrons is presented. Results from simulations and their impact on the predicted flux at Earth are discussed in different astrophysical scenarios.
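At the core of any Monte Carlo propagation code of this kind is sampling of photon free paths from an exponential distribution via the inverse CDF. A minimal sketch of that one step, not the EleCa implementation:

```python
import math
import random

def sample_free_path(mfp, rng):
    """Draw one exponentially distributed free path with mean `mfp`
    (inverse-CDF sampling; 1 - u keeps the argument of log in (0, 1])."""
    return -mfp * math.log(1.0 - rng.random())

def estimate_mean_free_path(mfp, n=50_000, seed=7):
    """Sanity check: the sample mean should converge to the input mean."""
    rng = random.Random(seed)
    return sum(sample_free_path(mfp, rng) for _ in range(n)) / n
```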

  7. Reduced-order LPV model of flexible wind turbines from high fidelity aeroelastic codes

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher; Sønderby, Ivan Bergquist; Hansen, Morten Hartvig

    2013-01-01

...space. The obtained LPV model is of suitable size for designing modern gain-scheduling controllers based on recently developed LPV control design techniques. Results are thoroughly assessed on a set of industrial wind turbine models generated by the recently developed aeroelastic code HAWCStab2.

  8. A Three-Dimensional Eulerian Code for Simulation of High-Speed Multimaterial Interactions

    Science.gov (United States)

    2011-08-01


  9. Adaptive Network Coded Clouds: High Speed Downloads and Cost-Effective Version Control

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Heide, Janus; Roetter, Daniel Enrique Lucani

    2017-01-01

...developed a novel scheme using recoding with limited packets to trade off storage space, reliability, and data retrieval speed. Implementation and measurements with commercial cloud providers show that up to 9x less network use is needed compared to other network coding schemes, while maintaining similar...

  10. Reliability studies of incident coding systems in high hazard industries: A narrative review of study methodology.

    Science.gov (United States)

    Olsen, Nikki S

    2013-03-01

This paper reviews the current literature on incident coding system reliability and discusses the methods applied in the conduct and measurement of reliability. The search strategy targeted three electronic databases using a list of search terms, and the results were examined for relevance, with additional relevant articles drawn from the bibliographies. Twenty-five papers met the relevance criteria, and their methods are discussed. Disagreements between reliability researchers in the selection of methods are highlighted, as are the effects of method selection on the outcome of the trials. The review provides evidence that the meaningfulness of, and confidence in, results is directly affected by the methodologies employed during the preparation, conduct and analysis of a reliability study. Furthermore, the review highlights the heterogeneity of methodologies employed by researchers measuring the reliability of incident coding techniques, which reduces the ability to critically compare and appraise techniques being considered for the adoption of report coding and trend analysis by client organisations. It is recommended that future research focus on the standardisation of reliability research and measurement within the incident coding domain.
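A statistic that recurs in inter-coder reliability studies of this kind is Cohen's kappa, which corrects the raw agreement rate between two coders for agreement expected by chance. A minimal sketch of the standard definition:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two coders' category labels over the same incidents:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from each coder's marginal category frequencies.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, which is one way the heterogeneous studies reviewed here could be put on a common footing.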

  11. Relationship between Holland High-Point Code and Client Preferences for Selected Vocational Counseling Strategies.

    Science.gov (United States)

    Boyd, Cynthia J.; Cramer, Stanley H.

    1995-01-01

    Undergraduates (n=208) with at least an eight-point difference between the first and second letters of their Holland type code completed the Self-Directed Search and Vocational Counseling Preference Inventory. Significant preference differences appeared among personality types in terms of counseling framework, career aspirations, and decision…

  12. Further progress on defining highly conserved immunogenic epitopes for a global HIV vaccine

    DEFF Research Database (Denmark)

    De Groot, Anne S; Levitz, Lauren; Ardito, Matthew T;

    2012-01-01

...and that are conserved in sequence and across time may represent the "Achilles' heel" of HIV and would be excellent candidates for vaccine development. In this study, T-cell epitopes were selected using immunoinformatics tools, combining HLA-A3 binding predictions with relative sequence conservation in the context of global HIV evolution. Twenty-seven HLA-A3 epitopes were chosen from an analysis performed in 2003 on 10,803 HIV-1 sequences, and additional sequences were selected in 2009 based on an expanded set of 43,822 sequences. These epitopes were tested in vitro for HLA binding and for immunogenicity with PBMCs of HIV-infected donors from Providence, Rhode Island. Validation of these HLA-A3 epitopes conserved across time, clades, and geography supports the hypothesis that epitopes such as these would be candidates for inclusion in our globally relevant GAIA HIV vaccine constructs.

  13. Role of highly conserved pyrimidine-rich sequences in the 3' untranslated region of the GAP-43 mRNA in mRNA stability and RNA-protein interactions.

    Science.gov (United States)

    Kohn, D T; Tsai, K C; Cansino, V V; Neve, R L; Perrone-Bizzozero, N I

    1996-03-01

We have shown previously that the mRNA for the growth-associated protein GAP-43 is selectively stabilized during neuronal differentiation. In this study, we explored the role of its highly conserved 3' untranslated region (3'UTR) in mRNA stability and RNA-protein interactions. The 3'UTRs of the rat and chicken GAP-43 mRNAs show 78% sequence identity, which is equivalent to the conservation of their coding regions. In rat PC12 cells stably transfected with the full-length rat or chicken GAP-43 cDNAs, the transgene mRNAs decayed with the same half-life of about 3 h. The GAP-43 3'UTR also caused the rabbit beta-globin mRNA to decay with a half-life of 4 h, indicating that the major determinants of GAP-43 mRNA stability are localized in its highly conserved 3'UTR. Three brain cytosolic RNA-binding proteins (molecular masses 40, 65 and 95 kDa) were found to interact with both the rat and chicken GAP-43 mRNAs. These RNA-protein interactions were specific and involved pyrimidine-rich sequences in the 3'UTR. Like the GAP-43 mRNA, the activity of these proteins was enriched in brain and increased during development. We propose that highly conserved pyrimidine-rich sequences in the 3'UTR of this mRNA regulate GAP-43 gene expression via interactions with specific RNA-binding proteins.
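The half-life comparisons above rest on first-order decay kinetics. A minimal sketch of the two directions of that relation (generic kinetics, not code from the study):

```python
import math

def remaining_fraction(t_hours, half_life_hours):
    """Fraction of mRNA remaining after time t under first-order decay:
    N(t)/N0 = (1/2)^(t / t_half)."""
    return 0.5 ** (t_hours / half_life_hours)

def half_life(t_hours, fraction_remaining):
    """Infer the half-life from a single decay measurement:
    t_half = t * ln(1/2) / ln(fraction)."""
    return t_hours * math.log(0.5) / math.log(fraction_remaining)
```

For example, a transcript reduced to one quarter of its initial level after 6 h has a 3 h half-life, matching the ~3 h figure reported for the transgene mRNAs.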

  14. High-frequency ultrasound for intraoperative margin assessments in breast conservation surgery: a feasibility study

    Directory of Open Access Journals (Sweden)

    Hart Vern P

    2011-10-01

Full Text Available Abstract Background In addition to breast imaging, ultrasound offers the potential for characterizing and distinguishing between benign and malignant breast tissues due to their different microstructures and material properties. The aim of this study was to determine if high-frequency ultrasound (20-80 MHz) can provide pathology-sensitive measurements for the ex vivo detection of cancer in margins during breast conservation surgery. Methods Ultrasonic tests were performed on resected margins and other tissues obtained from 17 patients, resulting in 34 specimens that were classified into 15 pathology categories. Pulse-echo and through-transmission measurements were acquired from a total of 57 sites on the specimens using two single-element 50-MHz transducers. Ultrasonic attenuation and sound speed were obtained from time-domain waveforms. The waveforms were further processed with fast Fourier transforms to provide ultrasonic spectra and cepstra. The ultrasonic measurements and pathology types were analyzed for correlations. The specimens were additionally re-classified into five pathology types to determine specificity and sensitivity values. Results The density of peaks in the ultrasonic spectra, a measure of spectral structure, showed significantly higher values for carcinomas and precancerous pathologies such as atypical ductal hyperplasia than for normal tissue. The slopes of the cepstra for non-malignant pathologies displayed significantly greater values that differentiated them from the normal and malignant tissues. The attenuation coefficients were sensitive to fat necrosis, fibroadenoma, and invasive lobular carcinoma. Specificities and sensitivities for differentiating pathologies from normal tissue were 100% and 86% for lobular carcinomas, 100% and 74% for ductal carcinomas, 80% and 82% for benign pathologies, and 80% and 100% for fat necrosis and adenomas. Specificities and sensitivities were also determined for differentiating each
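The "density of peaks in the ultrasonic spectra" can be illustrated with a simple local-maximum count over the spectrum samples. This is one plausible definition for illustration; the exact measure used in the study may differ:

```python
def peak_density(spectrum):
    """Fraction of interior spectrum samples that are strict local maxima,
    a crude scalar measure of spectral structure."""
    peaks = sum(
        1
        for i in range(1, len(spectrum) - 1)
        if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]
    )
    return peaks / (len(spectrum) - 2)
```

A highly structured spectrum (many alternating ups and downs) scores near 1, while a smooth, monotone spectrum scores near 0, mirroring the carcinoma-versus-normal contrast reported above.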

  15. On a consistent high-order finite difference scheme with kinetic energy conservation for simulating turbulent reacting flows

    Science.gov (United States)

    Trisjono, Philipp; Kang, Seongwon; Pitsch, Heinz

    2016-12-01

    The main objective of this study is to present an accurate and consistent numerical framework for turbulent reacting flows based on a high-order finite difference (HOFD) scheme. It was shown previously by Desjardins et al. (2008) [4] that a centered finite difference scheme discretely conserving the kinetic energy and an upwind-biased scheme for the scalar transport can be combined into a useful scheme for turbulent reacting flows. With a high-order spatial accuracy, however, an inconsistency among discretization schemes for different conservation laws is identified, which can disturb a scalar field spuriously under non-uniform density distribution. Various theoretical and numerical analyses are performed on the sources of the unphysical error. From this, the derivative of the mass-conserving velocity and the local Péclet number are identified as the primary factors affecting the error. As a solution, an HOFD stencil for the mass conservation is reformulated into a flux-based form that can be used consistently with an upwind-biased scheme for the scalar transport. The effectiveness of the proposed formulation is verified using two-dimensional laminar flows such as a scalar transport problem and a laminar premixed flame, where unphysical oscillations in the scalar fields are removed. The applicability of the proposed scheme is demonstrated in an LES of a turbulent stratified premixed flame.
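The discrete kinetic-energy conservation that the centered scheme relies on follows from the skew-symmetry of the centered difference operator on a periodic grid: the discrete integral of u times its derivative vanishes identically. A minimal sketch of that property (a generic illustration, not the paper's full HOFD scheme):

```python
def centered_dx(u, dx):
    """Second-order centered first derivative on a periodic grid."""
    n = len(u)
    return [(u[(i + 1) % n] - u[(i - 1) % n]) / (2.0 * dx) for i in range(n)]

# Skew-symmetry: sum_i u_i * (Du)_i = 0 on a periodic grid, the discrete
# analogue of the kinetic-energy identity  integral of u*u_x dx = 0.
u = [0.3, -1.2, 2.0, 0.7, -0.5]
energy_production = sum(a * b for a, b in zip(u, centered_dx(u, 0.1)))
```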

  16. Linkage disequilibrium of evolutionarily conserved regions in the human genome

    Directory of Open Access Journals (Sweden)

    Johnson Todd A

    2006-12-01

Full Text Available Abstract Background The strong linkage disequilibrium (LD) recently found in genic or exonic regions of the human genome demonstrated that LD can be increased by evolutionary mechanisms that select for functionally important loci. This suggests that LD might be stronger in regions conserved among species than in non-conserved regions, since regions exposed to natural selection tend to be conserved. To assess this hypothesis, we used genome-wide polymorphism data from the HapMap project and investigated LD within DNA sequences conserved between the human and mouse genomes. Results Unexpectedly, we observed that LD was significantly weaker in conserved regions than in non-conserved regions. To investigate why, we examined sequence features that may distort the relationship between LD and conserved regions. We found that interspersed repeats, and not other sequence features, were associated with the weak LD tendency in conserved regions. To appropriately understand the relationship between LD and conserved regions, we removed the effect of repetitive elements and found that the high degree of sequence conservation was strongly associated with strong LD in coding regions but not with that in non-coding regions. Conclusion Our work demonstrates that the degree of sequence conservation does not simply increase LD as predicted by the hypothesis. Rather, it implies that purifying selection changes the polymorphic patterns of coding sequences but has little influence on the patterns of functional units such as regulatory elements present in non-coding regions, since the former are generally restricted by the constraint of maintaining a functional protein product across multiple exons while the latter may exist more as individually isolated units.
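LD between two loci is conventionally quantified by D and r^2, computed from the haplotype frequency and the two allele frequencies. A minimal sketch of the standard definitions (not code from the study):

```python
def ld_stats(p_ab, p_a, p_b):
    """Return (D, r^2) for two biallelic loci.

    p_ab : frequency of the haplotype carrying allele A and allele B
    p_a, p_b : marginal allele frequencies at the two loci
    D = p_ab - p_a * p_b;  r^2 = D^2 / (p_a (1-p_a) p_b (1-p_b))."""
    d = p_ab - p_a * p_b
    r2 = d * d / (p_a * (1.0 - p_a) * p_b * (1.0 - p_b))
    return d, r2
```

Complete LD (the two loci always co-segregate) gives r^2 = 1, while linkage equilibrium gives D = r^2 = 0.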

  17. Code domains in tandem repetitive DNA sequence structures.

    Science.gov (United States)

    Vogt, P

    1992-10-01

    Traditionally, many people doing research in molecular biology attribute coding properties to a given DNA sequence if this sequence contains an open reading frame for translation into a sequence of amino acids. This protein coding capability of DNA was detected about 30 years ago. The underlying genetic code is highly conserved and present in every biological species studied so far. Today, it is obvious that DNA has a much larger coding potential for other important tasks. Apart from coding for specific RNA molecules such as rRNA, snRNA and tRNA molecules, specific structural and sequence patterns of the DNA chain itself express distinct codes for the regulation and expression of its genetic activity. A chromatin code has been defined for phasing of the histone-octamer protein complex in the nucleosome. A translation frame code has been shown to exist that determines correct triplet counting at the ribosome during protein synthesis. A loop code seems to organize the single stranded interaction of the nascent RNA chain with proteins during the splicing process, and a splicing code phases successive 5' and 3' splicing sites. Most of these DNA codes are not exclusively based on the primary DNA sequence itself, but also seem to include specific features of the corresponding higher order structures. Based on the view that these various DNA codes are genetically instructive for specific molecular interactions or processes, important in the nucleus during interphase and during cell division, the coding capability of tandem repetitive DNA sequences has recently been reconsidered.

  18. Using the Eastern Hellbender Salamander in a High School Genetics & Ecological Conservation Activity

    Science.gov (United States)

    Chudyk, Sarah; McMillan, Amy; Lange, Catherine

    2014-01-01

    This article contains an original 5E lesson plan developed from conservation genetics research on the giant North American hellbender salamander, Cryptobranchus alleganiensis alleganiensis. The lesson plan provides background information on the hellbender, reviews basic genetics, and exposes students to the scientific process that is used during…

  19. Geometrid moth assemblages reflect high conservation value of naturally regenerated secondary forests in temperate China

    NARCIS (Netherlands)

    Zou, Yi; Sang, Weiguo; Warren-Thomas, Eleanor; Axmacher, Jan Christoph

    2016-01-01

    The widespread destruction of mature forests in China has led to massive ecological degradation, counteracted in recent decades by substantial efforts to promote forest plantations and protect secondary forest ecosystems. The value of the resulting forests for biodiversity conservation is widely

  20. The Clawpack Community of Codes

    Science.gov (United States)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years it has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely through the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since served both as a testing ground for new algorithmic advances in the Clawpack framework and as an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably through IPython notebooks.
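The finite-volume flux-difference update that Clawpack-style solvers build on can be sketched for 1-D advection with a first-order upwind flux. This is a deliberately minimal stand-in for Clawpack's wave-propagation algorithms, shown only because the flux-difference form makes discrete conservation explicit:

```python
def upwind_step(q, a, dx, dt):
    """One conservative step for q_t + a q_x = 0 with a > 0 on a periodic grid:
    q_i^{n+1} = q_i - (dt/dx) * (F_{i+1/2} - F_{i-1/2}),  F_{i+1/2} = a * q_i.
    Summing the update telescopes the fluxes, so total q is conserved."""
    n = len(q)
    F = [a * q[i] for i in range(n)]  # upwind interface flux F_{i+1/2}
    return [q[i] - dt / dx * (F[i] - F[(i - 1) % n]) for i in range(n)]
```

At a CFL number of 1 (dt = dx / a) the scheme shifts the profile exactly one cell per step; at smaller CFL numbers it advects with some numerical diffusion but still conserves the cell sum.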

  1. Applying informed coding and embedding to design a robust high-capacity watermark.

    Science.gov (United States)

    Miller, Matt L; Doërr, Gwenaël J; Cox, Ingemar J

    2004-06-01

    We describe a new watermarking system based on the principles of informed coding and informed embedding. This system is capable of embedding 1380 bits of information in images with dimensions 240 x 368 pixels. Experiments on 2000 images indicate the watermarks are robust to significant valumetric distortions, including additive noise, low-pass filtering, changes in contrast, and lossy compression. Our system encodes watermark messages with a modified trellis code in which a given message may be represented by a variety of different signals, with the embedded signal selected according to the cover image. The signal is embedded by an iterative method that seeks to ensure the message will not be confused with other messages, even after addition of noise. Fidelity is improved by the incorporation of perceptual shaping into the embedding process. We show that each of these three components improves performance substantially.
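As a baseline against which informed coding and embedding are an improvement, plain additive spread-spectrum watermarking with a linear-correlation detector looks like the following. This is a simplified sketch for contrast, not the authors' trellis-based method:

```python
def embed(cover, pattern, alpha=1.0):
    """Blind additive spread-spectrum embedding: add a scaled bipolar
    reference pattern to the cover signal (ignoring the cover content,
    unlike the informed embedding described in the record)."""
    return [c + alpha * p for c, p in zip(cover, pattern)]

def detect(signal, pattern):
    """Normalized linear correlation between a signal and the pattern;
    large values indicate the watermark is present."""
    return sum(s * p for s, p in zip(signal, pattern)) / len(pattern)
```

Informed embedding instead adjusts the added signal per cover image so the detector statistic stays safely away from decision boundaries even after noise is added.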

  2. Research on High-Frequency Combination Coding-Based SSVEP-BCIs and Its Signal Processing Algorithms

    Directory of Open Access Journals (Sweden)

    Feng Zhang

    2015-01-01

Full Text Available This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain-computer interface (BCI) systems: High-Frequency Combination Coding-Based SSVEP (HFCC-SSVEP). The goal of this study is to increase the number of targets using fewer stimulation frequencies, while diminishing subject fatigue and reducing the risk of photosensitive epileptic seizures. This paper investigated the HFCC-SSVEP high-frequency response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz). HFCC-SSVEP produces n^n coded targets from n high stimulation frequencies through time-series combination coding. Furthermore, the improved Hilbert-Huang transform (IHHT) is adopted to extract time-frequency features of the proposed SSVEP response. Lastly, the differentiation combination (DC) method is proposed to select the combination coding sequence in order to increase the recognition rate; as a result, the IHHT algorithm and the DC method increase recognition efficiency for the proposed SSVEP paradigm, improving the information transfer rate (ITR) and the stability of the BCI system. Furthermore, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally fatigue subjects and prevent safety hazards linked to photo-induced epileptic seizures. This study tests five subjects in order to verify the feasibility of the proposed method.
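The combination-coding idea, arranging n stimulation frequencies into length-n time sequences to obtain n^n distinguishable targets, can be sketched directly:

```python
from itertools import product

def combination_codes(freqs, length):
    """All stimulation sequences of the given length drawn from the base
    frequencies; n frequencies and length n give n**n coded targets."""
    return list(product(freqs, repeat=length))

# The three stimulation frequencies from the abstract, in sequences of
# length 3, yield 3**3 = 27 distinct coded targets.
codes = combination_codes([25.0, 33.33, 40.0], 3)
```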

  3. Collisions of electrons with hydrogen atoms I. Package outline and high energy code

    Science.gov (United States)

    Benda, Jakub; Houfek, Karel

    2014-11-01

Being motivated by the applied researchers' persisting need for accurate scattering data for the collisions of electrons with hydrogen atoms, we developed a computer package, Hex, that is designed to provide trustworthy results for all basic discrete and continuous processes within a non-relativistic framework. The package consists of several computational modules that implement different methods, valid for specific energy regimes. Results of the modules are kept in a common database in the unified form of low-level scattering data (partial-wave T-matrices) and accessed by an interface program which is able to produce various derived quantities such as differential and integral cross sections. This article is the first of a series of articles concerned with the implementation and testing of the modules. Here we give an overview of their structure and present (a) the command-line interface program hex-db, which can also be easily compiled into a derived code or used as a backend for a web-page form, and (b) a simple illustrative module specialized for high energies, hex-dwba, that implements the distorted-wave and plane-wave Born approximations.

Catalogue identifier: AETH_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AETH_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data etc.: 30367
No. of bytes in distributed program, including test data etc.: 232032
Distribution format: tar.gz
Programming language: C++11
Operating system: Any system with a C++11 compiler (e.g. GCC 4.8.1; tested on OpenSUSE 13.1 and Windows 8)
RAM: Test run 3 MiB
CPC Library Classification: 2.4 Electron scattering
External libraries: GSL [49], FFTW3 [52], SQLite3 [46]. All of the libraries are open-source and maintained.
Nature of problem: Extraction of derived (observable) quantities from partial

  4. High speed, low-complexity image coding for IP-transport with JPEG XS

    Science.gov (United States)

    Richter, Thomas; Fößel, Siegfried; Keinert, Joachim; Descampe, Antonin

    2016-09-01

    The JPEG committee (formally, ISO/IEC SC 29 WG 01) is currently investigating a new work item on near lossless low complexity coding for IP streaming of moving images. This article discusses the requirements and use cases of this work item, gives some insight into the anchors that are used for the purpose of standardization, and provides a short update on the current proposals that reached the committee.

  5. LIDAR pulse coding for high resolution range imaging at improved refresh rate.

    Science.gov (United States)

    Kim, Gunzung; Park, Yongwan

    2016-10-17

In this study, a light detection and ranging (LIDAR) system was designed that codes pixel location information into its laser pulses using the direct-sequence optical code division multiple access (DS-OCDMA) method in conjunction with a scanning-based microelectromechanical system (MEMS) mirror. This LIDAR can measure distance continuously, without idle listening time for the return of reflected waves, because its laser pulses include pixel location information encoded with DS-OCDMA; it therefore emits in each bearing direction without waiting for the reflected wave to return. The MEMS mirror is used to deflect and steer the coded laser pulses in the desired bearing direction. The receiver digitizes the received reflected pulses using a low-temperature-grown (LTG) indium gallium arsenide (InGaAs) based photoconductive antenna (PCA) and a time-to-digital converter (TDC) and demodulates them using DS-OCDMA. When all of the reflected waves corresponding to the pixels forming a range image are received, the proposed LIDAR generates a point cloud based on the time-of-flight (ToF) of each reflected wave. The results of simulations performed on the proposed LIDAR are compared with simulations of existing LIDARs.
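Two ingredients of such a scheme are easy to sketch: converting round-trip ToF to range, and identifying which pixel a return belongs to by correlating it against each pixel's spreading code. The tiny bipolar codebook below is hypothetical, for illustration only:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_m(tof_s):
    """One-way range from a round-trip time of flight."""
    return C * tof_s / 2.0

def matched_pixel(rx_chips, codebook):
    """Return the pixel whose bipolar (+1/-1) spreading code correlates
    best with the received chip sequence (hypothetical codebook)."""
    def score(pixel):
        return sum(c * r for c, r in zip(codebook[pixel], rx_chips))
    return max(codebook, key=score)
```

With orthogonal codes, the correct pixel's correlation peaks while the others stay near zero, which is what lets returns be attributed without idle listening time.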

  6. THR-TH: a high-temperature gas-cooled nuclear reactor core thermal hydraulics code

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1984-07-01

    The ORNL version of PEBBLE, the (RZ) pebble bed thermal hydraulics code, has been extended for application to a prismatic gas cooled reactor core. The supplemental treatment is of one-dimensional coolant flow in up to a three-dimensional core description. Power density data from a neutronics and exposure calculation are used as the basic information for the thermal hydraulics calculation of heat removal. Two-dimensional neutronics results may be expanded for a three-dimensional hydraulics calculation. The geometric description for the hydraulics problem is the same as used by the neutronics code. A two-dimensional thermal cell model is used to predict temperatures in the fuel channel. The capability is available in the local BOLD VENTURE computation system for reactor core analysis with capability to account for the effect of temperature feedback by nuclear cross section correlation. Some enhancements have also been added to the original code to add pebble bed modeling flexibility and to generate useful auxiliary results. For example, an estimate is made of the distribution of fuel temperatures based on average and extreme conditions regularly calculated at a number of locations.
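Per coolant channel, the one-dimensional flow treatment reduces at its simplest to a steady-state energy balance between deposited power and coolant enthalpy rise. A minimal sketch with illustrative numbers, not THR-TH's actual model:

```python
def coolant_outlet_temp(t_in_k, power_w, mdot_kg_s, cp_j_kg_k):
    """Steady-state energy balance for one coolant channel:
    T_out = T_in + Q / (mdot * cp)."""
    return t_in_k + power_w / (mdot_kg_s * cp_j_kg_k)

# Hypothetical channel: 600 K inlet, 5 kW deposited, 1 kg/s flow,
# and an assumed constant coolant heat capacity of 5000 J/(kg K).
t_out = coolant_outlet_temp(600.0, 5000.0, 1.0, 5000.0)
```

A code like THR-TH layers axial discretization, channel-to-channel flow distribution, and a thermal cell model for fuel temperatures on top of this basic balance.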

  7. Conservation hotspots for the turtles on the high seas of the Atlantic Ocean.

    Directory of Open Access Journals (Sweden)

    Hsiang-Wen Huang

    Full Text Available Understanding the distribution of bycaught sea turtles could inform conservation strategies and priorities. This research analyses the distribution of turtles caught as longline fisheries bycatch on the high seas of the Atlantic Ocean. This research collected 18,142 bycatch observations and 47.1 million hooks from large-scale Taiwanese longline vessels in the Atlantic Ocean from June 2002 to December 2013. The coverage rates were ranged from 0.48% to 17.54% by year. Seven hundred and sixty-seven turtles were caught, and the major species were leatherback (59.8%, olive ridley (27.1% and loggerhead turtles (8.7%. Most olive ridley (81.7% and loggerhead (82.1% turtles were hooked, while the leatherbacks were both hooked (44.0% and entangled (31.8%. Depending on the species, 21.4% to 57.7% were dead when brought onboard. Most of the turtles were caught in tropical areas, especially in the Gulf of Guinea (15°N-10°S, 30°W-10°E, but loggerheads were caught in the south Atlantic Ocean (25°S-35°S, 40°W-10°E and 30°S-40°S, 55°W-45°W. The bycatch rate was the highest at 0.030 per 1000 hooks for leatherbacks in the tropical area. The bycatch rates of olive ridley ranged from 0 to 0.010 per thousand hooks. The loggerhead bycatch rates were higher in the northern and southern Atlantic Ocean and ranged from 0.0128 to 0.0239 per thousand hooks. Due to the characteristics of the Taiwanese deep-set longline fleet, bycatch rates were lower than those of coastal longline fisheries, but mortality rates were higher because of the long hours of operation. Gear and bait modification should be considered to reduce sea turtle bycatch and increase survival rates while reducing the use of shallow hooks would also be helpful.

  8. Conservation Hotspots for the Turtles on the High Seas of the Atlantic Ocean

    Science.gov (United States)

    Huang, Hsiang-Wen

    2015-01-01

Understanding the distribution of bycaught sea turtles could inform conservation strategies and priorities. This research analyses the distribution of turtles caught as longline fisheries bycatch on the high seas of the Atlantic Ocean. This research collected 18,142 bycatch observations and 47.1 million hooks from large-scale Taiwanese longline vessels in the Atlantic Ocean from June 2002 to December 2013. The coverage rates ranged from 0.48% to 17.54% by year. Seven hundred and sixty-seven turtles were caught, and the major species were leatherback (59.8%), olive ridley (27.1%) and loggerhead turtles (8.7%). Most olive ridley (81.7%) and loggerhead (82.1%) turtles were hooked, while the leatherbacks were both hooked (44.0%) and entangled (31.8%). Depending on the species, 21.4% to 57.7% were dead when brought onboard. Most of the turtles were caught in tropical areas, especially in the Gulf of Guinea (15°N-10°S, 30°W-10°E), but loggerheads were caught in the south Atlantic Ocean (25°S-35°S, 40°W-10°E and 30°S-40°S, 55°W-45°W). The bycatch rate was the highest at 0.030 per 1000 hooks for leatherbacks in the tropical area. The bycatch rates of olive ridley ranged from 0 to 0.010 per thousand hooks. The loggerhead bycatch rates were higher in the northern and southern Atlantic Ocean and ranged from 0.0128 to 0.0239 per thousand hooks. Due to the characteristics of the Taiwanese deep-set longline fleet, bycatch rates were lower than those of coastal longline fisheries, but mortality rates were higher because of the long hours of operation. Gear and bait modification should be considered to reduce sea turtle bycatch and increase survival rates, while reducing the use of shallow hooks would also be helpful. PMID:26267796

  9. Conservation hotspots for the turtles on the high seas of the Atlantic Ocean.

    Science.gov (United States)

    Huang, Hsiang-Wen

    2015-01-01

    Understanding the distribution of bycaught sea turtles could inform conservation strategies and priorities. This research analyses the distribution of turtles caught as longline fisheries bycatch on the high seas of the Atlantic Ocean. It collected 18,142 bycatch observations and 47.1 million hooks from large-scale Taiwanese longline vessels in the Atlantic Ocean from June 2002 to December 2013. The coverage rates ranged from 0.48% to 17.54% by year. Seven hundred and sixty-seven turtles were caught, and the major species were leatherback (59.8%), olive ridley (27.1%) and loggerhead turtles (8.7%). Most olive ridley (81.7%) and loggerhead (82.1%) turtles were hooked, while the leatherbacks were both hooked (44.0%) and entangled (31.8%). Depending on the species, 21.4% to 57.7% were dead when brought onboard. Most of the turtles were caught in tropical areas, especially in the Gulf of Guinea (15°N-10°S, 30°W-10°E), but loggerheads were caught in the south Atlantic Ocean (25°S-35°S, 40°W-10°E and 30°S-40°S, 55°W-45°W). The bycatch rate was the highest at 0.030 per 1000 hooks for leatherbacks in the tropical area. The bycatch rates of olive ridley ranged from 0 to 0.010 per thousand hooks. The loggerhead bycatch rates were higher in the northern and southern Atlantic Ocean and ranged from 0.0128 to 0.0239 per thousand hooks. Due to the characteristics of the Taiwanese deep-set longline fleet, bycatch rates were lower than those of coastal longline fisheries, but mortality rates were higher because of the long hours of operation. Gear and bait modification should be considered to reduce sea turtle bycatch and increase survival rates, while reducing the use of shallow hooks would also be helpful.
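    The per-species rates quoted in the abstract can be cross-checked against the overall totals it reports (767 turtles over 47.1 million hooks). A minimal sketch using only those published figures:

```python
# Overall bycatch rate implied by the abstract's totals:
# 767 turtles caught on 47.1 million observed hooks.
turtles = 767
hooks = 47.1e6

rate_per_1000_hooks = turtles / hooks * 1000
print(round(rate_per_1000_hooks, 4))  # ~0.0163 turtles per 1000 hooks

# Species counts implied by the reported percentage composition.
leatherbacks = round(0.598 * turtles)   # ~459
olive_ridleys = round(0.271 * turtles)  # ~208
```

    The fleet-wide average (~0.016 per 1000 hooks) sits between the species-specific rates reported for olive ridleys (0-0.010) and loggerheads (0.0128-0.0239), as expected.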

  10. DEVELOPMENT OF ASME SECTION X CODE RULES FOR HIGH PRESSURE COMPOSITE HYDROGEN PRESSURE VESSELS WITH NON-LOAD SHARING LINERS

    Energy Technology Data Exchange (ETDEWEB)

    Rawls, G.; Newhouse, N.; Rana, M.; Shelley, B.; Gorman, M.

    2010-04-13

    The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of hydrogen storage vessels rated up to 15,000 psi. One of these needs was the development of Code rules for high-pressure composite vessels with non-load-sharing liners for stationary applications. In 2009, ASME approved the new Appendix 8 to the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressures ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at the time of manufacture. The Code rules include the design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.

  11. Ultrasonic Imaging in Highly Attenuating Materials With Hadamard Codes and the Decomposition of the Time Reversal Operator.

    Science.gov (United States)

    Lopez Villaverde, Eduardo; Robert, Sebastien; Prada, Claire

    2017-09-01

    In this paper, defects in a high density polyethylene pipe are imaged with the total focusing method. The viscoelastic attenuation of this material greatly reduces the signal level and leads to a poor signal-to-noise ratio (SNR) due to electronic noise. To improve the image quality, the decomposition of the time reversal operator method is combined with the spatial Hadamard coded transmissions before calculating images in the time domain. Because the Hadamard coding is not compatible with conventional imaging systems, this paper proposes two modified coding methods based on sparse Hadamard matrices with +1/0 coefficients. The SNRs expected with the different spatial codes are demonstrated, and then validated on both simulated and experimental data. Experiments are performed with a transducer array in contact with the base material of a polyethylene pipe. In order to improve the noise filtering procedure, the singular values associated with electronic noise are expressed on the basis of the random matrix theory. This model of noise singular values allows a better identification of the defect response in noisy experimental data. Finally, the imaging method is evaluated in a more industrial inspection configuration, where an immersion array probe is used to image defects in a butt fusion weld with a complex geometry.

  12. Inhibitory control and visuo-spatial reversibility in Piaget's seminal number conservation task: a high-density ERP study.

    Science.gov (United States)

    Borst, Grégoire; Simon, Grégory; Vidal, Julie; Houdé, Olivier

    2013-01-01

    The present high-density event-related potential (ERP) study of 13 adults aimed to determine whether number conservation relies on the ability to inhibit the overlearned length-equals-number strategy and then imagine the shortening of the row that was lengthened. Participants performed the number-conservation task and, after the EEG session, the mental imagery task. In the number-conservation task, two rows with the same number of tokens and the same length were first presented on a computer screen (COV condition); then the tokens in one of the two rows were spread apart (INT condition). Participants were instructed to determine whether the two rows had an identical number of tokens. In the mental imagery task, two rows with different lengths but the same number of tokens were presented, and participants were instructed to imagine the tokens in the longer row aligning with the tokens in the shorter row. In the number-conservation task, we found that the amplitudes of the centro-parietal N2 and fronto-central P3 were higher in the INT than in the COV condition. In addition, the differences in response times between the two conditions were correlated with the differences in the amplitudes of the fronto-central P3. In light of previous results reported on the number-conservation task in adults, the present results suggest that inhibition might be necessary to succeed at the number-conservation task in adults even when the transformation of the length of one of the rows is displayed. Finally, we also report correlations between the speed at which participants could imagine the shortening of one of the rows in the mental imagery task, the speed at which participants could determine that the two rows had the same number of tokens after the tokens in one of the rows were spread apart, and the latency of the late positive parietal component in the number-conservation task. Therefore, performing the number-conservation task might involve mental transformation processes in

  13. Applications of very high-resolution imagery in the study and conservation of large predators in the Southern Ocean.

    Science.gov (United States)

    Larue, Michelle A; Knight, Joseph

    2014-12-01

    The Southern Ocean is one of the most rapidly changing ecosystems on the planet due to the effects of climate change and commercial fishing for ecologically important krill and fish. Because sea ice loss is expected to be accompanied by declines in krill and fish predators, decoupling the effects of climate and anthropogenic changes on these predator populations is crucial for ecosystem-based management of the Southern Ocean. We reviewed research published from 2007 to 2014 that incorporated very high-resolution satellite imagery to assess distribution, abundance, and effects of climate and other anthropogenic changes on populations of predators in polar regions. Very high-resolution imagery has been used to study 7 species of polar animals in 13 papers, many of which provide methods through which further research can be conducted. Use of very high-resolution imagery in the Southern Ocean can provide a broader understanding of climate and anthropogenic forces on populations and inform management and conservation recommendations. We recommend that conservation biologists continue to integrate high-resolution remote sensing into broad-scale biodiversity and population studies in remote areas, where it can provide much needed detail. © 2014 Society for Conservation Biology.

  14. High-Quality 3d Models and Their Use in a Cultural Heritage Conservation Project

    Science.gov (United States)

    Tucci, G.; Bonora, V.; Conti, A.; Fiorini, L.

    2017-08-01

    Cultural heritage digitization and 3D modelling processes are mainly based on laser scanning and digital photogrammetry techniques to produce complete, detailed and photorealistic three-dimensional surveys: geometric as well as chromatic aspects, which in turn bear witness to materials, work techniques, state of preservation, etc., are documented using digitization processes. The paper explores the topic of 3D documentation for conservation purposes; it analyses how geomatics contributes to the different steps of a restoration process and presents an overview of different uses of 3D models for the conservation and enhancement of cultural heritage. The paper reports on the project to digitize the earthenware frieze of the Ospedale del Ceppo in Pistoia (Italy) for 3D documentation, restoration work support, and digital and physical reconstruction and integration purposes. The intent to design an exhibition area suggests new ways to take advantage of 3D data originally acquired for documentation and scientific purposes.

  15. High Resolution Mapping of Soils and Landforms for the Desert Renewable Energy Conservation Plan (DRECP)

    Science.gov (United States)

    Potter, Christopher S.; Li, Shuang

    2014-01-01

    The Desert Renewable Energy Conservation Plan (DRECP), a major component of California's renewable energy planning efforts, is intended to provide effective protection and conservation of desert ecosystems, while allowing for the sensible development of renewable energy projects. This NASA mapping report was developed to support the DRECP and the Bureau of Land Management (BLM). We outline in this document remote sensing image processing methods to deliver new maps of biological soils crusts, sand dune movements, desert pavements, and sub-surface water sources across the DRECP area. We focused data processing first on the largely unmapped areas most likely to be used for energy developments, such as those within Renewable Energy Study Areas (RESA) and Solar Energy Zones (SEZs). We used imagery (multispectral and radar) mainly from the years 2009-2011.

  16. Concentration of specific amino acids at the catalytic/active centers of highly-conserved "housekeeping" enzymes of central metabolism in archaea, bacteria and Eukaryota: is there a widely conserved chemical signal of prebiotic assembly?

    Science.gov (United States)

    Pollack, J Dennis; Pan, Xueliang; Pearl, Dennis K

    2010-06-01

    In alignments of 1969 protein sequences the amino acid glycine and others were found concentrated at most-conserved sites within approximately 15 Å of catalytic/active centers (C/AC) of highly conserved kinases, dehydrogenases or lyases of Archaea, Bacteria and Eukaryota. Lysine and glutamic acid were concentrated at least-conserved sites furthest from their C/ACs. Logistic-regression analyses corroborated the "movement" of glycine towards and lysine away from their C/ACs: the odds of a glycine occupying a site were decreased by 19%, while the odds for a lysine were increased by 53%, for every 10 Å moving away from the C/AC. Average conservation of MSA consensus sites was highest surrounding the C/AC and directly decreased in transition toward model's peripheries. Findings held with statistical confidence using sequences restricted to individual Domains or enzyme classes or to both. Our data describe variability in the rate of mutation and likelihoods for phylogenetic trees based on protein sequence data and endorse the extension of substitution models by incorporating data on conservation and distance to C/ACs rather than only using cumulative levels. The data support the view that in the most-conserved environment immediately surrounding the C/AC of taxonomically distant and highly conserved essential enzymes of central metabolism there are amino acids whose identity and degree of occupancy is similar to a proposed amino acid set and frequency associated with prebiotic evolution.
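    The reported odds changes map directly onto logistic-regression coefficients: an odds ratio per 10 Å of 0.81 (glycine, -19%) or 1.53 (lysine, +53%) corresponds to a coefficient of ln(OR)/10 per Å. A small sketch of that arithmetic (the distances below are illustrative, not from the study):

```python
# Convert the reported per-10-Angstrom odds ratios into per-Angstrom
# logistic-regression coefficients, then project over other distances.
from math import exp, log

def beta_per_angstrom(odds_ratio_per_10A):
    """beta = ln(OR)/10, the slope in the logit per Angstrom."""
    return log(odds_ratio_per_10A) / 10

b_gly = beta_per_angstrom(0.81)  # glycine: odds fall 19% per 10 A
b_lys = beta_per_angstrom(1.53)  # lysine: odds rise 53% per 10 A

# Odds multiplier over a hypothetical 20 A move away from the C/AC:
print(round(exp(b_gly * 20), 3))  # 0.81^2 ~ 0.656
print(round(exp(b_lys * 20), 3))  # 1.53^2 ~ 2.341
```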

  17. A High-Accuracy Linear Conservative Difference Scheme for Rosenau-RLW Equation

    Directory of Open Access Journals (Sweden)

    Jinsong Hu

    2013-01-01

    Full Text Available We study the initial-boundary value problem for the Rosenau-RLW equation. We propose a three-level linear finite difference scheme with theoretical accuracy O(τ² + h⁴). The scheme simulates two conservative properties of the original problem well. The existence and uniqueness of the difference solution, and a priori estimates in the infinity norm, are obtained. Furthermore, we analyze the convergence and stability of the scheme by the energy method. Finally, numerical experiments demonstrate the theoretical results.

  18. Recent applications of the transonic wing analysis computer code, TWING

    Science.gov (United States)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of the code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep, fighter configurations.

  19. Use of a Viscous Flow Simulation Code for Static Aeroelastic Analysis of a Wing at High-Lift Conditions

    Science.gov (United States)

    Akaydin, H. Dogus; Moini-Yekta, Shayan; Housman, Jeffrey A.; Nguyen, Nhan

    2015-01-01

    In this paper, we present a static aeroelastic analysis of a wind tunnel test model of a wing in high-lift configuration using a viscous flow simulation code. The model wing was tailored to deform during the tests by amounts similar to a composite airliner wing in high-lift conditions. This required use of a viscous flow analysis to predict the lift coefficient of the deformed wing accurately. We thus utilized an existing static aeroelastic analysis framework that involves an inviscid flow code (Cart3d) to predict the deformed shape of the wing, then utilized a viscous flow code (Overflow) to compute the aerodynamic loads on the deformed wing. This way, we reduced the cost of flow simulations needed for this analysis while still being able to predict the aerodynamic forces with reasonable accuracy. Our results suggest that the lift of the deformed wing may be higher or lower than that of the non-deformed wing, and the washout deformation of the wing is the key factor that changes the lift of the deformed wing in two distinct ways: while it decreases the lift at low to moderate angles of attack simply by lowering local angles of attack along the span, it increases the lift at high angles of attack by alleviating separation.

  20. Fundamental algorithm and computational codes for the light beam propagation in high power laser system

    Institute of Scientific and Technical Information of China (English)

    GUO; Hong

    2001-01-01

    [1]Sacks, R. A., The PROP 92 Fourier Beam Propagation Code, UCRL-LR-105821-96-4.[2]Williams, W. H., Modeling of Self-Focusing Experiments by Beam Propagation Codes, UCRL-LR-105821-96-1.[3]User guide for FRESNEL software.[4]Hunt, J. H., Renard, P. A., Simmons, W. W., Improved performance of fusion lasers using the imaging properties of multiple spatial filters, Appl. Opt., 1977, 16: 779.[5]Deng Ximing, Guo Hong, Cao Qing, Invariant integral and statistical equations for the paraxial beam propagation in free space, Science in China (in Chinese) Ser. A, 1997, 27(1): 64.[6]Goodman, J. W., Introduction to Fourier Optics, New York: McGraw-Hill, 1968.[7]Born, M., Wolf, E., Principles of Optics, New York: Pergamon Press, 1975.[8]Siegman, A. E., Lasers, New York: Mill Valley CA, 1986.[9]Fan Dianyuan, Fresnel number of complex system, Optica Sinica (in Chinese), 1983, 3(4): 319.[10]L

  1. Trellis-coded MPSK modulation for highly efficient military satellite applications

    Science.gov (United States)

    Viterbi, Andrew J.; Wolf, Jack K.; Zehavi, Ephraim

    Trellis-coded multiple-phase-shift-keyed (MPSK) modulation is an effective technique for increasing the bandwidth efficiency of an existing channel while maintaining at least a moderate degree of power efficiency through coding. The authors consider the application of this technique to markedly increase the capacity of a 25-kHz military satellite channel. It is shown that with only minor modifications to the QPSK modem to incorporate 8 PSK and 16 PSK modulation, and no modification to the rate-1/2 coder used to transmit 16 kb/s over this channel, transmission rates of 32 kb/s and 48 kb/s can be supported at E(b)/N(0) levels only moderately higher than for the lower data rate. In fact, it is demonstrated that 48 kb/s can be transmitted within exactly the same bandwidth and at the same E(b)/N(0) level as required to transmit uncoded QPSK at 32 kb/s at a bit error rate of 10^-5.
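    The quoted rates follow from simple bookkeeping if one assumes a pragmatic-TCM arrangement (not spelled out in the abstract): one input bit per symbol passes through the rate-1/2 convolutional encoder and the remaining input bits are sent uncoded, so an M-PSK symbol of log2(M) channel bits carries log2(M) - 1 information bits at a fixed symbol rate.

```python
# Bandwidth-efficiency bookkeeping behind the quoted 16/32/48 kb/s rates.
# Assumption: pragmatic TCM with one rate-1/2 coded bit per symbol.
from math import log2

SYMBOL_RATE = 16_000  # symbols/s, fixed by the 25-kHz channel
                      # (coded QPSK carrying 16 kb/s => 1 info bit/symbol)

def info_rate(M):
    """Information rate for trellis-coded M-PSK under the assumption above."""
    info_bits_per_symbol = log2(M) - 1
    return SYMBOL_RATE * info_bits_per_symbol

print(info_rate(4))   # coded QPSK  -> 16000.0 b/s
print(info_rate(8))   # coded 8PSK  -> 32000.0 b/s
print(info_rate(16))  # coded 16PSK -> 48000.0 b/s
```

    Uncoded QPSK at the same 16 ksym/s carries 2 bits/symbol, i.e. 32 kb/s, matching the abstract's comparison point for coded 16 PSK at 48 kb/s in the same bandwidth.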

  2. HOTB: High precision parallel code for calculation of four-particle harmonic oscillator transformation brackets

    Science.gov (United States)

    Stepšys, A.; Mickevicius, S.; Germanas, D.; Kalinauskas, R. K.

    2014-11-01

    This new version of the HOTB program for calculation of the three- and four-particle harmonic oscillator transformation brackets provides some enhancements and corrections to the earlier version (Germanas et al., 2010) [1]. In particular, the new version allows calculations of harmonic oscillator transformation brackets to be performed in parallel using the MPI parallel communication standard. Moreover, intermediate calculations are now carried out in higher precision using GNU Quadruple Precision and the arbitrary-precision library FMLib [2]. A package of Fortran code is presented. The calculation time for large matrices can be significantly reduced using the effective parallel code. Use of higher-precision methods in intermediate calculations increases the stability of the algorithms and extends their validity to larger input values. Catalogue identifier: AEFQ_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFQ_v4_0.html Program obtainable from: CPC Program Library, Queen’s University of Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 Number of lines in programs, including test data, etc.: 1711 Number of bytes in distributed programs, including test data, etc.: 11667 Distribution format: tar.gz Program language used: FORTRAN 90 with MPI extensions for parallelism Computer: Any computer with a FORTRAN 90 compiler Operating system: Windows, Linux, FreeBSD, True64 Unix Has the code been vectorized or parallelized?: Yes, parallelism using MPI extensions. Number of CPUs used: up to 999 RAM (per CPU core): Depending on allocated binomial and trinomial matrices and use of precision; at least 500 MB Catalogue identifier of previous version: AEFQ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 181, Issue 2, (2010) 420-425 Does the new version supersede the previous version?
Yes Nature of problem: Calculation of matrices of three-particle harmonic oscillator brackets (3HOB) and four-particle harmonic oscillator brackets (4HOB) in a more

  3. Research on High-Frequency Combination Coding-Based SSVEP-BCIs and Its Signal Processing Algorithms

    OpenAIRE

    Feng Zhang; Chengcheng Han; Lili Li; Xin Zhang; Jun Xie; Yeping Li

    2015-01-01

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain-computer interface (BCI) systems. The new paradigm is High-Frequency Combination Coding-Based SSVEP (HFCC-SSVEP). The goal of this study is to increase the number of targets using fewer stimulation frequencies, while diminishing subjects’ fatigue and reducing the risk of photosensitive epileptic seizures. This paper investigated the HFCC-SSVEP high-frequency response (beyond 25 Hz) for 3 frequencies (25 H...

  4. Identification of novel non-coding small RNAs from Streptococcus pneumoniae TIGR4 using high-resolution genome tiling arrays

    Directory of Open Access Journals (Sweden)

    Swiatlo Edwin

    2010-06-01

    Full Text Available Abstract Background The identification of non-coding transcripts in human, mouse, and Escherichia coli has revealed their widespread occurrence and functional importance in both eukaryotic and prokaryotic life. In prokaryotes, studies have shown that non-coding transcripts participate in a broad range of cellular functions like gene regulation, stress and virulence. However, very little is known about non-coding transcripts in Streptococcus pneumoniae (pneumococcus), an obligate human respiratory pathogen responsible for significant worldwide morbidity and mortality. Tiling microarrays enable genome-wide mRNA profiling as well as identification of novel transcripts at high resolution. Results Here, we describe a high-resolution transcription map of the S. pneumoniae clinical isolate TIGR4 using genomic tiling arrays. Our results indicate that approximately 66% of the genome is expressed under our experimental conditions. We identified a total of 50 non-coding small RNAs (sRNAs) from the intergenic regions, of which 36 had no predicted function. Half of the identified sRNA sequences were found to be unique to the S. pneumoniae genome. We identified eight overrepresented sequence motifs among sRNA sequences that correspond to sRNAs in different functional categories. Tiling arrays also identified approximately 202 operon structures in the genome. Conclusions In summary, the pneumococcal operon structures and novel sRNAs identified in this study enhance our understanding of the complexity and extent of the pneumococcal 'expressed' genome. Furthermore, the results of this study open up new avenues of research for understanding the complex RNA regulatory network governing S. pneumoniae physiology and virulence.

  5. Forest edges have high conservation value for bird communities in mosaic landscapes.

    Science.gov (United States)

    Terraube, Julien; Archaux, Frédéric; Deconchat, Marc; van Halder, Inge; Jactel, Hervé; Barbaro, Luc

    2016-08-01

    A major conservation challenge in mosaic landscapes is to understand how trait-specific responses to habitat edges affect bird communities, including potential cascading effects on bird functions providing ecosystem services to forests, such as pest control. Here, we examined how bird species richness, abundance and community composition varied from interior forest habitats and their edges into adjacent open habitats, within a multi-regional sampling scheme. We further analyzed variations in Conservation Value Index (CVI), Community Specialization Index (CSI) and functional traits across the forest-edge-open habitat gradient. Bird species richness, total abundance and CVI were significantly higher at forest edges, while CSI peaked at interior open habitats, i.e., furthest from the forest edge. In addition, there were important variations in trait- and species-specific responses to forest edges among bird communities. Positive responses to forest edges were found for several forest bird species with unfavorable conservation status. These species were in general insectivores, understorey gleaners, cavity nesters and long-distance migrants, all traits that displayed higher abundance at forest edges than in forest interiors or adjacent open habitats. Furthermore, consistent with predictions, negative edge effects were recorded in some forest specialist birds and in most open-habitat birds, showing increasing densities from edges to interior habitats. We thus suggest that increasing landscape-scale habitat complexity would be beneficial to declining species living in mosaic landscapes combining small woodlands and open habitats. Edge effects between forests and adjacent open habitats may also favor bird functional guilds providing valuable ecosystem services to forests in longstanding fragmented landscapes.

  6. High qualitative and quantitative conservation of alternative splicing in Caenorhabditis elegans and Caenorhabditis briggsae

    DEFF Research Database (Denmark)

    Rukov, Jakob Lewin; Irimia, Manuel; Mørk, Søren

    2007-01-01

    Alternative splicing (AS) is an important contributor to proteome diversity and is regarded as an explanatory factor for the relatively low number of human genes compared with less complex animals. To assess the evolutionary conservation of AS and its developmental regulation, we have investigated...... that the quantitative regulation of isoform expression levels is an intrinsic part of most AS events. Moreover, our results indicate that AS contributes little to transcript variation in Caenorhabditis genes and that gene duplication may be the major evolutionary mechanism for the origin of novel transcripts in these 2...

  7. Calcified plaque resorptive status as determined by high-resolution ultrasound is predictive of successful conservative management of calcific tendinosis.

    Science.gov (United States)

    Lin, Chien-Hung; Chao, Hai-Lun; Chiou, Hong-Jen

    2012-08-01

    In patients with calcific tendinosis, the morphology of calcified plaques is associated with response to conservative management. We aimed to determine changes in pain and morphology of plaques in patients with calcific tendinosis and non-arc-shaped plaques identified by high-resolution ultrasonography who received only conservative treatment. A total of 33 patients with a mean age of 63.3±10.3 years were included. Pain scores at the time of the first and follow-up ultrasound were recorded, and the degree of plaque resolution was calculated. At follow-up, 90.9% (30 of 33) of patients reported improvement in pain, and 84.8% (28 of 33) of patients had more than 50% elimination of plaques. Most of the increased vascularity observed on color Doppler ultrasonography at the first visit had disappeared at follow-up. In patients with calcific tendinosis, non-arc-shaped plaques identified by high-resolution ultrasonography are likely to resolve, and conservative management is warranted. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. Calcified plaque resorptive status as determined by high-resolution ultrasound is predictive of successful conservative management of calcific tendinosis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Chien-Hung [Department of Diagnostic Radiology, Chi-Mei Medical Center, Yung Kang City, Tainan, Taiwan (China); Department of Health Care Administration, Chung-Hwa University of Medical Technology, Tainan, Taiwan (China); Chao, Hai-Lun [Department of Health Care Administration, Chung-Hwa University of Medical Technology, Tainan, Taiwan (China); Chiou, Hong-Jen, E-mail: hjchiou@vghtpe.gov.tw [Department of Radiology, Taipei Veterans General Hospital, National Yang Ming University, School of Medicine, and National Defense Medical Center, No. 201, Sec. 2, Shih-Pai Rd., Taipei 11217, Taiwan (China)

    2012-08-15

    Objective: In patients with calcific tendinosis, the morphology of calcified plaques is associated with response to conservative management. We aimed to determine changes in pain and morphology of plaques in patients with calcific tendinosis and non-arc-shaped plaques identified by high-resolution ultrasonography who received only conservative treatment. Methods: A total of 33 patients with a mean age of 63.3 ± 10.3 years were included. Pain scores at the time of the first and follow-up ultrasound were recorded, and the degree of plaque resolution was calculated. Results: At follow-up, 90.9% (30 of 33) of patients reported improvement in pain, and 84.8% (28 of 33) of patients had more than 50% elimination of plaques. Most of the increased vascularity observed on color Doppler ultrasonography at the first visit had disappeared at follow-up. Conclusions: In patients with calcific tendinosis, non-arc-shaped plaques identified by high-resolution ultrasonography are likely to resolve, and conservative management is warranted.

  9. Application of RS Codes in Decoding QR Code

    Institute of Scientific and Technical Information of China (English)

    Zhu Suxia(朱素霞); Ji Zhenzhou; Cao Zhiyan

    2003-01-01

    The QR Code is a 2-dimensional matrix code with high error correction capability. It employs RS codes to generate error correction codewords in encoding and to recover errors and damage in decoding. This paper presents several virtues of the QR Code, analyzes the RS decoding algorithm, and gives a software flow chart for decoding the QR Code with the RS decoding algorithm.
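    The encode/detect half of that pipeline fits in a short sketch. The following minimal Reed-Solomon encoder works over GF(2^8) with the QR Code's field polynomial 0x11d; it appends error-correction codewords and checks syndromes on receipt. A full QR decoder would go further and use nonzero syndromes to locate and correct the errors; the message bytes below are arbitrary illustrations.

```python
# Minimal Reed-Solomon encoding and error detection over GF(2^8),
# the field the QR Code uses (primitive polynomial 0x11d).

PRIM = 0x11d
EXP = [0] * 512  # anti-log table, doubled so EXP[LOG[a] + LOG[b]] never wraps
LOG = [0] * 256
_v = 1
for _i in range(255):
    EXP[_i] = _v
    LOG[_v] = _i
    _v <<= 1
    if _v & 0x100:
        _v ^= PRIM
for _i in range(255, 512):
    EXP[_i] = EXP[_i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def generator_poly(nsym):
    """g(x) = prod_{i=0}^{nsym-1} (x - alpha^i); minus is XOR in GF(2^8)."""
    g = [1]
    for i in range(nsym):
        q = [1, EXP[i]]
        r = [0] * (len(g) + 1)
        for ai, a in enumerate(g):
            for bi, b in enumerate(q):
                r[ai + bi] ^= gf_mul(a, b)
        g = r
    return g

def rs_encode(msg, nsym):
    """Systematic encoding: append the remainder of msg(x)*x^nsym mod g(x)."""
    gen = generator_poly(nsym)
    buf = list(msg) + [0] * nsym
    for i in range(len(msg)):
        coef = buf[i]
        if coef:
            for j in range(1, len(gen)):
                buf[i + j] ^= gf_mul(gen[j], coef)
    return list(msg) + buf[len(msg):]

def syndromes(codeword, nsym):
    """Evaluate the received polynomial at alpha^0 .. alpha^(nsym-1)."""
    out = []
    for i in range(nsym):
        s = 0
        for c in codeword:
            s = gf_mul(s, EXP[i]) ^ c  # Horner evaluation
        out.append(s)
    return out

data = [0x40, 0xD2, 0x75, 0x47]                 # arbitrary data codewords
cw = rs_encode(data, nsym=10)
assert all(s == 0 for s in syndromes(cw, 10))   # valid codeword: all zero
cw[2] ^= 0x55                                   # damage one byte
assert any(s != 0 for s in syndromes(cw, 10))   # damage detected
```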

  10. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes.

    Science.gov (United States)

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-10-01

    This study presents a fusion of data-driven and physics-driven methodologies for energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested over three extended time periods totalling several weeks of observations. These periods included quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed.

  11. The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy

    CERN Document Server

    Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F

    2010-01-01

    Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH), Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron-Emission-Tomography / Computed-Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...

  12. Low Complexity Approach for High Throughput Belief-Propagation based Decoding of LDPC Codes

    Directory of Open Access Journals (Sweden)

    BOT, A.

    2013-11-01

    Full Text Available The paper proposes a low-complexity belief-propagation (BP) based decoding algorithm for LDPC codes. Despite the iterative nature of the decoding process, the proposed algorithm provides both reduced complexity and improved BER performance compared with the classic min-sum (MS) algorithm generally used for hardware implementations. Linear approximations of the check-node update function are used in order to reduce the complexity of the BP algorithm. For this decoding approach, an FPGA-based hardware architecture is proposed, aiming to increase the decoder throughput. FPGA technology was chosen for the LDPC decoder implementation due to its parallel computation and reconfiguration capabilities. The obtained results show improvements in decoding throughput and BER performance compared with state-of-the-art approaches.
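    The trade-off this record describes, between the exact BP check-node rule and its cheaper min-sum simplification, can be illustrated with a small sketch (function names and the scaling factor are illustrative, not taken from the paper):

    ```python
    import math

    def check_update_exact(llrs):
        """Exact BP check-node update: 2*atanh(prod tanh(L/2))."""
        p = 1.0
        for l in llrs:
            p *= math.tanh(l / 2.0)
        return 2.0 * math.atanh(p)

    def check_update_min_sum(llrs, alpha=1.0):
        """Min-sum approximation: sign product times smallest magnitude.

        alpha < 1 gives the 'normalized' min-sum variant, a simple
        linear correction toward the exact BP value.
        """
        sign = 1.0
        for l in llrs:
            sign = -sign if l < 0 else sign
        return alpha * sign * min(abs(l) for l in llrs)
    ```

    Plain min-sum always overestimates the magnitude of the exact message, which is why linear (scaled or offset) corrections of the check-node update recover most of the BP performance at a fraction of the hardware cost.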

  13. Dynamics and Conservation Management of a Wooded Landscape under High Herbivore Pressure

    Directory of Open Access Journals (Sweden)

    Adrian C. Newton

    2013-01-01

    Full Text Available We present the use of a spatially explicit model of woodland dynamics (LANDIS-II) to examine the impacts of herbivory in the New Forest National Park, UK, in relation to its management for biodiversity conservation. The model was parameterized using spatial data and the results of two field surveys, and then tested against results from a third survey. Field surveys indicated that regeneration by tree species was widespread but occurred at low density, despite heavy browsing pressure. The model accurately predicted the abundance and richness of tree species. Over the duration of the simulations (300 yr), woodland area increased in all scenarios, with or without herbivory. While the increase in woodland area was most pronounced under a scenario of no herbivory, values increased by more than 70% even in the presence of heavy browsing pressure. Model projections provided little evidence for the conversion of woodland areas to either grassland or heathland; changes in woodland structure and composition were consistent with traditional successional theory. These results highlight the need for multiple types of intervention when managing successional landscape mosaics and demonstrate the value of landscape-scale modelling for evaluating the role of herbivory in conservation management.

  14. Proteomic Analysis of Pathogenic Fungi Reveals Highly Expressed Conserved Cell Wall Proteins

    Directory of Open Access Journals (Sweden)

    Jackson Champer

    2016-01-01

    Full Text Available We present a quantitative proteomics tally of the most commonly expressed conserved fungal proteins of the cytosol, the cell wall, and the secretome. Our goal was to identify fungi-typical proteins that do not share significant homology with human proteins; such fungal proteins are of interest for the development of vaccines or drug targets. Protein samples were derived from 13 fungal species, cultured in rich or minimal media; these included clinical isolates of Aspergillus, Candida, Mucor, Cryptococcus, and Coccidioides species. Proteomes were analyzed by quantitative MSE (Mass Spectrometry—Elevated Collision Energy). Several thousand proteins were identified and quantified in total across all fractions and culture conditions. The 42 most abundant proteins identified in fungal cell walls or supernatants shared little to no homology with human proteins. In contrast, all but five of the 50 most abundant cytosolic proteins had human homologs, with sequence identity averaging 59%. Proteomic comparisons of the secreted or surface-localized fungal proteins highlighted conserved homologs of the Aspergillus fumigatus 1,3-β-glucanosyltransferases (Bgt1, Gel1-4), Crf1, Ecm33, EglC, and others. The fact that Crf1 and Gel1 were previously shown to be promising vaccine candidates underlines the value of the proteomics data presented here.

  15. A MODIFIED UNEQUAL POWER ALLOCATION (UPA) SCHEME FOR PERFORMANCE ENHANCEMENT IN BIT REPETITION TURBO CODES IN HIGH SPEED DOWNLINK PACKET ACCESS (HSDPA) SYSTEM

    Directory of Open Access Journals (Sweden)

    B. BALAMURALITHARA

    2015-06-01

    Full Text Available In this paper, a modified optimal power allocation scheme for the different bits in the turbo encoder is proposed to improve the performance of turbo codes in the High Speed Downlink Packet Access (HSDPA) service. In a typical HSDPA turbo code, an encoder with a code rate of 1/3 is used with a bit repetition or puncturing scheme to achieve a code rate of 1/4. In this study, the author proposes a modified unequal power allocation (UPA) scheme to improve the performance of turbo codes in the HSDPA system. Simulation and performance-bound results for the proposed UPA scheme, for a frame length of N = 400 and code rate 1/4 with a Log-MAP decoder over an Additive White Gaussian Noise (AWGN) channel, were obtained and compared with typical turbo code systems using bit repetition or puncturing without UPA. The proposed bit repetition turbo code system with the modified UPA scheme showed better performance than the typical turbo code system without UPA, with a coding gain of 0.35 dB to 0.56 dB.
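    As a hedged illustration of the unequal power allocation idea (not the paper's actual optimization), the sketch below splits a fixed energy budget between the systematic and repeated/parity bits of a rate-1/4 code; the function name and the ratio parameter are invented for the example:

    ```python
    def upa_amplitudes(power_ratio, n_sys=1, n_par=3):
        """Split unit average energy per coded bit between bit classes.

        power_ratio = P_sys / P_par. With n_sys systematic and n_par
        parity/repeated bits per information bit (code rate 1/4), the
        average transmitted energy per coded bit is kept equal to 1.
        Returns the per-class signal amplitudes.
        """
        p_par = (n_sys + n_par) / (n_sys * power_ratio + n_par)
        p_sys = power_ratio * p_par
        return p_sys ** 0.5, p_par ** 0.5
    ```

    With power_ratio = 1 this reduces to equal power allocation; ratios above 1 boost the systematic bits at the expense of the parity bits while the total transmit power stays fixed, which is the constraint any UPA scheme must respect.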

  16. Status report on multigroup cross section generation code development for high-fidelity deterministic neutronics simulation system.

    Energy Technology Data Exchange (ETDEWEB)

    Yang, W. S.; Lee, C. H. (Nuclear Engineering Division)

    2008-05-16

    Under the fast reactor simulation program launched in April 2007, development of an advanced multigroup cross section generation code was initiated in July 2007, in conjunction with the development of the high-fidelity deterministic neutron transport code UNIC. The general objectives are to simplify the existing multi-step schemes and to improve the resolved and unresolved resonance treatments. Based on the review of current methods and the fact that they have been applied successfully to fast critical experiment analyses and fast reactor designs for the last three decades, the methodologies of the ETOE-2/MC²-2/SDX code system were selected as the starting set of methodologies for multigroup cross section generation for fast reactor analysis. As the first step toward coupling with the UNIC code and use in a parallel computing environment, the MC²-2 code was updated by modernizing the memory structure and replacing old data management package subroutines and functions with Fortran 90 based routines. Various modifications were also made in the ETOE-2 and MC²-2 codes to process the ENDF/B-VII.0 data properly. Using the updated ETOE-2/MC²-2 code system, the ENDF/B-VII.0 data was successfully processed for the major heavy and intermediate nuclides employed in sodium-cooled fast reactors. Initial verification tests of the MC²-2 libraries generated from ENDF/B-VII.0 data were performed by inter-comparison of twenty-one-group infinite-dilute total cross sections obtained from MC²-2, VIM, and NJOY. For almost all nuclides considered, MC²-2 cross sections agreed very well with those from VIM and NJOY. Preliminary validation tests of the ENDF/B-VII.0 libraries of MC²-2 were also performed using a set of sixteen fast critical benchmark problems. The deterministic results based on MC²-2/TWODANT calculations were in good agreement with MCNP solutions within ~0.25% Δρ, except for a few small LANL fast assemblies.

  17. Validation and Verification of MCNP6 Against Intermediate and High-Energy Experimental Data and Results by Other Codes

    CERN Document Server

    Mashnik, Stepan G

    2010-01-01

    MCNP6, the latest and most advanced LANL transport code, representing a recent merger of MCNP5 and MCNPX, has been validated and verified (V&V) against a variety of intermediate- and high-energy experimental data and against results from different versions of MCNPX and other codes. In the present work, we V&V MCNP6 using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.02 and LAQGSM03.03. We found that MCNP6 describes reasonably well various reactions induced by particles and nuclei at incident energies from 18 MeV to about 1 TeV per nucleon, measured on thin and thick targets, and agrees very well with similar results obtained with MCNPX and with calculations by CEM03.02, LAQGSM03.01 (03.03), INCL4 + ABLA, Bertini INC + Dresner evaporation, EPAX, ABRABLA, HIPSE, and AMD, used as stand-alone codes. Most of several computational bugs and more serious physics problems observed in MCNP6/X during our V...

  18. A Pulsed Coding Technique Based on Optical UWB Modulation for High Data Rate Low Power Wireless Implantable Biotelemetry

    Directory of Open Access Journals (Sweden)

    Andrea De Marcellis

    2016-10-01

    Full Text Available This paper reports on a pulsed coding technique based on optical ultra-wideband (UWB) modulation for wireless implantable biotelemetry systems, allowing a high data rate link while enabling significant power reduction compared to the state of the art. This optical data coding approach is suitable for emerging biomedical applications such as transcutaneous neural wireless communication systems. The overall architecture implementing this optical modulation technique employs a sub-nanosecond pulsed laser as the data transmitter and a small-sensitive-area photodiode as the data receiver. Moreover, it includes coding and decoding digital systems, as well as biasing and driving analogue circuits for laser pulse generation and photodiode signal conditioning. The complete system has been implemented on a Field-Programmable Gate Array (FPGA) and a prototype Printed Circuit Board (PCB) with discrete off-the-shelf components. By inserting a diffuser between the transmitter and the receiver to emulate skin/tissue, the system achieves a 128 Mbps data rate with a bit error rate below 10−9 and an estimated total power consumption of about 5 mW, corresponding to a power efficiency of 35.9 pJ/bit. These results could allow, for example, the transmission of an 800-channel neural recording interface sampled at 16 kHz with 10-bit resolution.
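    The quoted power efficiency follows directly from average power divided by data rate; a one-line check using the figures from the abstract (note that the rounded "about 5 mW" would give ~39 pJ/bit, so the reported 35.9 pJ/bit implies an average power nearer 4.6 mW):

    ```python
    def energy_per_bit_pj(power_mw, rate_mbps):
        """Energy per bit in picojoules: average power / data rate."""
        return power_mw * 1e-3 / (rate_mbps * 1e6) * 1e12

    # Rounded 5 mW at 128 Mbps -> ~39.1 pJ/bit
    # 4.6 mW at 128 Mbps      -> ~35.9 pJ/bit (the reported figure)
    ```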

  19. A versatile, bar-coded nuclear marker/reporter for live cell fluorescent and multiplexed high content imaging.

    Directory of Open Access Journals (Sweden)

    Irina Krylova

    Full Text Available The screening of large numbers of compounds or siRNAs is a mainstay of both academic and pharmaceutical research. Most screens test those interventions against a single biochemical or cellular output whereas recording multiple complementary outputs may be more biologically relevant. High throughput, multi-channel fluorescence microscopy permits multiple outputs to be quantified in specific cellular subcompartments. However, the number of distinct fluorescent outputs available remains limited. Here, we describe a cellular bar-code technology in which multiple cell-based assays are combined in one well after which each assay is distinguished by fluorescence microscopy. The technology uses the unique fluorescent properties of assay-specific markers comprised of distinct combinations of different 'red' fluorescent proteins sandwiched around a nuclear localization signal. The bar-code markers are excited by a common wavelength of light but distinguished ratiometrically by their differing relative fluorescence in two emission channels. Targeting the bar-code to cell nuclei enables individual cells expressing distinguishable markers to be readily separated by standard image analysis programs. We validated the method by showing that the unique responses of different cell-based assays to specific drugs are retained when three assays are co-plated and separated by the bar-code. Based upon those studies, we discuss a roadmap in which even more assays may be combined in a well. The ability to analyze multiple assays simultaneously will enable screens that better identify, characterize and distinguish hits according to multiple biologically or clinically relevant criteria. These capabilities also enable the re-creation of complex mixtures of cell types that is emerging as a central area of interest in many fields.
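    The ratiometric readout this record describes can be sketched in a few lines: each nucleus is assigned to whichever reference marker has the closest ratio of the two emission channels. The marker names and reference ratios below are hypothetical, purely for illustration:

    ```python
    def classify_by_ratio(ch1, ch2, references):
        """Assign a nucleus to the bar-code marker whose expected
        emission ratio ch1/ch2 is nearest the measured one.

        references: dict mapping marker name -> expected ch1/ch2 ratio.
        """
        ratio = ch1 / ch2
        return min(references, key=lambda name: abs(references[name] - ratio))

    # Hypothetical three-marker bar-code with distinct channel ratios
    markers = {"assay_A": 0.5, "assay_B": 1.0, "assay_C": 2.0}
    ```

    Because the markers share one excitation wavelength and differ only in their relative emission in two channels, this nearest-ratio rule is robust to overall intensity variation between cells.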

  20. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance are requiring lightweight video encoding with high coding efficiency and error-resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics...

  1. Impacts of Tropical Forest Disturbance Upon Avifauna on a Small Island with High Endemism: Implications for Conservation

    Directory of Open Access Journals (Sweden)

    Martin Thomas

    2010-01-01

    Full Text Available Tropical forests are rapidly being lost across Southeast Asia, and this is predicted to have severe implications for many of the region's bird species. However, relationships between forest disturbance and avifaunal assemblages remain poorly understood, particularly on small island ecosystems such as those found in the biodiversity 'hotspot' of Wallacea. This study examines how avifaunal richness varies across a disturbance gradient in a forest reserve on Buton Island, southeast Sulawesi. Particular emphasis is placed upon examining responses in endemic and red-listed species of high conservation importance. Results indicate that overall avian richness increases between primary and 30-year-old regenerating secondary forest, decreases through disturbed secondary forest, but is highest in cleared farmland. However, high species richness in farmland does not signify high species distinctiveness; bird community composition there differs significantly from that found in forest sites, and is poor in supporting forest specialists and endemic species. Certain large-bodied endemics such as the Knobbed Hornbill (Rhyticeros cassidix) appear to be sensitive to moderate disturbance, with populations occurring at greatest density within primary forest. However, overall endemic species richness, as well as that of endemic frugivores and insectivores, is similar in primary and secondary forest types. Results indicate that well-established secondary forest in particular has an important role in supporting species of high conservation importance, possessing community composition similar to that found in primary forest and supporting an equally high richness of endemic species.

  2. Microparticles: Facile and High-Throughput Synthesis of Functional Microparticles with Quick Response Codes (Small 24/2016).

    Science.gov (United States)

    Ramirez, Lisa Marie S; He, Muhan; Mailloux, Shay; George, Justin; Wang, Jun

    2016-06-01

    Microparticles carrying quick response (QR) barcodes are fabricated by J. Wang and co-workers on page 3259, using a massive coding of dissociated elements (MiCODE) technology. Each microparticle can bear a special custom-designed QR code that enables encryption or tagging with unlimited multiplexity, and the QR code can be easily read by cellphone applications. The utility of MiCODE particles in multiplexed DNA detection and microtagging for anti-counterfeiting is explored.

  3. Universal antibodies against the highly conserved influenza fusion peptide cross-neutralize several subtypes of influenza A virus

    Energy Technology Data Exchange (ETDEWEB)

    Hashem, Anwar M. [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Department of Microbiology, Faculty of Medicine, King Abdulaziz University, Jeddah (Saudi Arabia); Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON (Canada); Van Domselaar, Gary [National Microbiology Laboratory, Public Health Agency of Canada, Winnipeg, MB (Canada); Li, Changgui; Wang, Junzhi [National Institute for the Control of Pharmaceutical and Biological Products, Beijing (China); She, Yi-Min; Cyr, Terry D. [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Sui, Jianhua [Department of Cancer Immunology and AIDS, Dana-Farber Cancer Institute, Department of Medicine, Harvard Medical School, 44 Binney Street, Boston, MA 02115 (United States); He, Runtao [National Microbiology Laboratory, Public Health Agency of Canada, Winnipeg, MB (Canada); Marasco, Wayne A. [Department of Cancer Immunology and AIDS, Dana-Farber Cancer Institute, Department of Medicine, Harvard Medical School, 44 Binney Street, Boston, MA 02115 (United States); Li, Xuguang, E-mail: Sean.Li@hc-sc.gc.ca [Centre for Vaccine Evaluation, Biologics and Genetic Therapies Directorate, HPFB, Health Canada, Ottawa, ON (Canada); Department of Biochemistry, Microbiology and Immunology, University of Ottawa, Ottawa, ON (Canada)

    2010-12-10

    Research highlights: → The fusion peptide is the only universally conserved epitope in all influenza viral hemagglutinins. → Anti-fusion peptide antibodies are universal antibodies that cross-react with all influenza HA subtypes. → The universal antibodies cross-neutralize different influenza A subtypes. → The universal antibodies inhibit the fusion process between the viruses and the target cells. -- Abstract: The fusion peptide of influenza viral hemagglutinin plays a critical role in virus entry by facilitating membrane fusion between the virus and target cells. As the fusion peptide is the only universally conserved epitope in all influenza A and B viruses, it could be an attractive target for vaccine-induced immune responses. We previously reported that antibodies targeting the first 14 amino acids of the N-terminus of the fusion peptide could bind to virtually all influenza virus strains and quantify hemagglutinins in vaccines produced in embryonated eggs. Here we demonstrate that these universal antibodies bind to the viral hemagglutinins in native conformation presented in infected mammalian cell cultures and neutralize multiple subtypes of virus by inhibiting the pH-dependent fusion of viral and cellular membranes. These results suggest that this unique, highly conserved linear sequence in viral hemagglutinin is exposed sufficiently to be attacked by the antibodies during the course of infection and merits further investigation because of its potential importance in the protection against diverse strains of influenza viruses.

  4. Fluctuations of Conserved Quantities in High Energy Nuclear Collisions at RHIC

    CERN Document Server

    Luo, Xiaofeng

    2015-01-01

    Fluctuations of conserved quantities in heavy-ion collisions are used to probe the phase transition and the QCD critical point for the strongly interacting hot and dense nuclear matter. The STAR experiment has carried out moment analysis of net-proton (proxy for net-baryon (B)), net-kaon (proxy for net-strangeness (S)), and net-charge (Q). These measurements are important for understanding the quantum chromodynamics phase diagram. We present the analysis techniques used in the moment analysis by the STAR experiment and discuss the moments of net-proton and net-charge distributions from the first phase of the Beam Energy Scan program at the Relativistic Heavy Ion Collider.

  5. A highly conserved novel family of mammalian developmental transcription factors related to Drosophila grainyhead.

    Science.gov (United States)

    Wilanowski, Tomasz; Tuckfield, Annabel; Cerruti, Loretta; O'Connell, Sinead; Saint, Robert; Parekh, Vishwas; Tao, Jianning; Cunningham, John M; Jane, Stephen M

    2002-06-01

    The Drosophila transcription factor Grainyhead regulates several key developmental processes. Three mammalian genes, CP2, LBP-1a and LBP-9, have previously been identified as homologues of grainyhead (grh). We now report the cloning of two new mammalian genes, Mammalian grainyhead (MGR) and Brother-of-MGR (BOM), and one new Drosophila gene (dCP2) that rewrite the phylogeny of this family. We demonstrate that MGR and BOM are more closely related to grh, whereas CP2, LBP-1a and LBP-9 are descendants of the dCP2 gene. MGR shares the greatest sequence homology with grh, is expressed in tissue-restricted patterns more comparable to grh, and binds to and transactivates the promoter of the human Engrailed-1 gene, the mammalian homologue of the key grainyhead target gene, engrailed. This sequence and functional conservation indicates that the new mammalian members of this family play important developmental roles.

  6. Fluctuations of Conserved Quantities in High Energy Nuclear Collisions at RHIC

    Science.gov (United States)

    Luo, Xiaofeng

    2015-04-01

    Fluctuations of conserved quantities in heavy-ion collisions are used to probe the phase transition and the QCD critical point for the strongly interacting hot and dense nuclear matter. The STAR experiment has carried out moment analysis of net-proton (proxy for net-baryon (B)), net-kaon (proxy for net-strangeness (S)), and net-charge (Q). These measurements are important for understanding the quantum chromodynamics phase diagram. We present the analysis techniques used in the moment analysis by the STAR experiment and discuss the moments of net-proton and net-charge distributions from the first phase of the Beam Energy Scan program at the Relativistic Heavy Ion Collider.

  7. HyRec: A fast and highly accurate primordial hydrogen and helium recombination code

    CERN Document Server

    Ali-Haïmoud, Yacine

    2010-01-01

    We present a state-of-the-art primordial recombination code, HyRec, including all the physical effects that have been shown to significantly affect recombination. The computation of helium recombination includes simple analytic treatments of hydrogen continuum opacity in the He I 2¹P-1¹S line and the He I] 2³P-1¹S line, and treats feedback between these lines within the on-the-spot approximation. Hydrogen recombination is computed using the effective multilevel atom method, effectively accounting for an infinite number of excited states. We account for two-photon transitions from 2s and higher levels as well as frequency diffusion in Lyman-alpha with a full radiative transfer calculation. We present a new method to evolve the radiation field simultaneously with the level populations and the free electron fraction. These computations are sped up by taking advantage of the particular sparseness pattern of the equations describing the radiative transfer. The computation time for a full recombination history i...

  8. Turbulence statistics in a spectral element code: a toolbox for High-Fidelity Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Vinuesa, Ricardo [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden); Fick, Lambert [Argonne National Lab. (ANL), Argonne, IL (United States); Negi, Prabal [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden); Marin, Oana [Argonne National Lab. (ANL), Argonne, IL (United States); Merzari, Elia [Argonne National Lab. (ANL), Argonne, IL (United States); Schlatter, Phillip [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden)

    2017-02-01

    In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 × 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox allows the computation of mean-velocity components, the Reynolds-stress tensor, as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the toolbox supports turbulence statistics both in flows with one homogeneous direction (where the statistics are based on time averaging combined with averaging in the homogeneous direction), and in fully three-dimensional flows (with no periodic directions, where only time averaging is considered).
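    The averaging the toolbox performs for a flow with one homogeneous direction can be sketched in NumPy; the array layout and function name here are illustrative (the actual toolbox operates inside Nek5000):

    ```python
    import numpy as np

    def mean_and_shear_stress(u, v):
        """Statistics for a flow homogeneous in z, sampled as u[t, z, y, x].

        Averages over time (axis 0) and the homogeneous direction (axis 1),
        returning mean velocities and the Reynolds shear stress <u'v'>
        as functions of the inhomogeneous (y, x) cross-section.
        """
        U = u.mean(axis=(0, 1))
        V = v.mean(axis=(0, 1))
        uv = (u * v).mean(axis=(0, 1)) - U * V   # <uv> - <u><v> = <u'v'>
        return U, V, uv
    ```

    In the fully three-dimensional case the same decomposition applies, but only the time axis is averaged, so the statistics remain functions of all three spatial directions.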

  9. A highly conserved NF-κB-responsive enhancer is critical for thymic expression of Aire in mice.

    Science.gov (United States)

    Haljasorg, Uku; Bichele, Rudolf; Saare, Mario; Guha, Mithu; Maslovskaja, Julia; Kõnd, Karin; Remm, Anu; Pihlap, Maire; Tomson, Laura; Kisand, Kai; Laan, Martti; Peterson, Pärt

    2015-12-01

    Autoimmune regulator (Aire) has a unique expression pattern in thymic medullary epithelial cells (mTECs), in which it plays a critical role in the activation of tissue-specific antigens. The expression of Aire in mTECs is activated by receptor activator of nuclear factor κB (RANK) signaling; however, the molecular mechanism behind this activation is unknown. Here, we characterize a conserved noncoding sequence 1 (CNS1) containing two NF-κB binding sites upstream of the Aire coding region. We show that CNS1-deficient mice lack thymic expression of Aire and share several features of Aire-knockout mice, including downregulation of Aire-dependent genes, impaired terminal differentiation of the mTEC population, and reduced production of thymic Treg cells. In addition, we show that CNS1 is indispensable for RANK-induced Aire expression and that CNS1 is activated by NF-κB pathway complexes containing RelA. Together, our results indicate that CNS1 is a critical link between RANK signaling, NF-κB activation, and thymic expression of Aire. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); Xu, Guang-Hua [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); State Key Laboratory for Manufacturing Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)

    2015-03-10

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing subject fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). First, we studied the high-frequency (beyond 25 Hz) SSVEP response, with the stimulus presented on an LED; the SNR (signal-to-noise ratio) of the response beyond 40 Hz is very low and cannot be distinguished by traditional analysis methods. Second, we investigated the HFCC-SSVEP response (beyond 25 Hz) for three frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT) based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract the time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and a fixed number of sifting iterations (10) are used to overcome the end effect and the stopping-criterion problem, and generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, improving the information transfer rate (ITR) and the stability of the BCI system. Moreover, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally fatigue the subject and reduce safety hazards linked to photo-induced epileptic seizures, ensuring both system efficiency and safety. Three subjects were tested in order to verify the feasibility of the proposed method.
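    The n^n target count from frequency combination coding is easy to see by enumeration; a sketch using the three stimulation frequencies quoted in the abstract:

    ```python
    from itertools import product

    def combination_codes(freqs):
        """Enumerate HFCC-style target codes: each target is a length-n
        sequence of stimulation frequencies, so n base frequencies
        yield n**n distinguishable targets."""
        n = len(freqs)
        return list(product(freqs, repeat=n))

    # Three base frequencies -> 3**3 = 27 targets
    codes = combination_codes([25.0, 33.33, 40.0])
    ```

    With only three flicker frequencies, all of them above the fatigue- and seizure-relevant low-frequency band, the paradigm still distinguishes 27 targets, which is the point of the combination coding.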

  11. PhyloCSF: a comparative genomics method to distinguish protein coding and non-coding regions.

    Science.gov (United States)

    Lin, Michael F; Jungreis, Irwin; Kellis, Manolis

    2011-07-01

    As high-throughput transcriptome sequencing provides evidence for novel transcripts in many species, there is a renewed need for accurate methods to classify small genomic regions as protein coding or non-coding. We present PhyloCSF, a novel comparative genomics method that analyzes a multispecies nucleotide sequence alignment to determine whether it is likely to represent a conserved protein-coding region, based on a formal statistical comparison of phylogenetic codon models. We show that PhyloCSF's classification performance in 12-species Drosophila genome alignments exceeds that of all other methods we compared in a previous study. We anticipate that this method will be widely applicable as the transcriptomes of many additional species, tissues and subcellular compartments are sequenced, particularly in the context of ENCODE and modENCODE, and as interest grows in long non-coding RNAs, which are often initially recognized by their lack of protein-coding potential rather than by conserved RNA secondary structures. The Objective Caml source code and executables for GNU/Linux and Mac OS X are freely available at http://compbio.mit.edu/PhyloCSF. Contact: mlin@mit.edu; manoli@mit.edu.
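    At its core, the classification described here is a log-likelihood ratio test between two phylogenetic codon models; a toy stand-in for that decision rule (the per-column probabilities below are invented, not PhyloCSF's actual models):

    ```python
    import math

    def llr_score(col_probs_coding, col_probs_noncoding):
        """Sum of per-alignment-column log-likelihood ratios.

        Positive totals favor the coding model, negative totals the
        non-coding model; zero means the models are indistinguishable
        on this alignment.
        """
        return sum(math.log(pc / pn)
                   for pc, pn in zip(col_probs_coding, col_probs_noncoding))
    ```

    PhyloCSF replaces these toy probabilities with likelihoods of each alignment column under empirically fitted coding and non-coding codon substitution models, but the sign-of-the-sum decision is the same.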

  12. An adjoint method for a high-order discretization of deforming domain conservation laws for optimization of flow problems

    Science.gov (United States)

    Zahr, M. J.; Persson, P.-O.

    2016-12-01

    The fully discrete adjoint equations and the corresponding adjoint method are derived for a globally high-order accurate discretization of conservation laws on parametrized, deforming domains. The conservation law on the deforming domain is transformed into one on a fixed reference domain by the introduction of a time-dependent mapping that encapsulates the domain deformation and parametrization, resulting in an Arbitrary Lagrangian-Eulerian form of the governing equations. A high-order discontinuous Galerkin method is used to discretize the transformed equation in space, and a high-order diagonally implicit Runge-Kutta scheme is used for the temporal discretization. Quantities of interest that take the form of space-time integrals are discretized in a solver-consistent manner. The corresponding fully discrete adjoint method is used to compute exact gradients of quantities of interest along the manifold of solutions of the fully discrete conservation law. These quantities of interest and their gradients are used in the context of gradient-based PDE-constrained optimization. The adjoint method is used to solve two optimal shape and control problems governed by the isentropic, compressible Navier-Stokes equations. The first optimization problem seeks the energetically optimal trajectory of a 2D airfoil given a required initial and final spatial position. The optimization solver, driven by gradients computed via the adjoint method, reduced the total energy required to complete the specified mission by nearly an order of magnitude. The second optimization problem seeks the energetically optimal flapping motion and time-morphed geometry of a 2D airfoil given an equality constraint on the x-directed impulse generated on the airfoil. The optimization solver satisfied the impulse constraint to greater than 8 digits of accuracy and reduced the required energy by a factor of between 2 and 10, depending on the value of the impulse constraint, as compared to the nominal configuration.
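The pattern behind the fully discrete adjoint, differentiating the discrete time-stepping scheme itself and then sweeping a transposed recursion backwards to get exact gradients, can be sketched on a much simpler discretization than the paper's DG/DIRK scheme. A minimal example with forward Euler on a linear ODE, all matrices hypothetical:

```python
import numpy as np

# Fully discrete scheme u_{n+1} = u_n + dt*A(mu)*u_n, objective J = c.u_N.
# The pattern (differentiate the *discrete* scheme, sweep backwards) matches
# the paper; the operator A(mu) here is a hypothetical stand-in.

def forward(mu, u0, dt, N):
    A = np.array([[0.0, 1.0], [-mu, -0.1]])  # hypothetical parametrized operator
    us = [u0]
    for _ in range(N):
        us.append(us[-1] + dt * A @ us[-1])
    return us, A

def adjoint_gradient(mu, u0, c, dt, N):
    us, A = forward(mu, u0, dt, N)
    dA = np.array([[0.0, 0.0], [-1.0, 0.0]])  # dA/dmu
    lam = c.copy()                             # lambda_N = dJ/du_N
    grad = 0.0
    for n in range(N - 1, -1, -1):
        grad += lam @ (dt * dA @ us[n])        # accumulate dJ/dmu
        lam = lam + dt * A.T @ lam             # transposed recursion backwards
    return grad

u0 = np.array([1.0, 0.0]); c = np.array([1.0, 0.0])
mu, dt, N = 2.0, 0.01, 100
g_adj = adjoint_gradient(mu, u0, c, dt, N)

# Verify against a central finite difference of the objective.
J = lambda m: forward(m, u0, dt, N)[0][-1] @ c
eps = 1e-6
g_fd = (J(mu + eps) - J(mu - eps)) / (2 * eps)
```

The adjoint sweep costs one extra backward pass regardless of how many parameters mu contains, which is why it is the method of choice for PDE-constrained optimization.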

  13. The 'Brick Wall' radio loss approximation and the performance of strong channel codes for deep space applications at high data rates

    Science.gov (United States)

    Shambayati, Shervin

    2001-01-01

    In order to evaluate the performance of strong channel codes in the presence of imperfect carrier phase tracking for residual-carrier BPSK modulation, this paper develops an approximate 'brick wall' model that is independent of the channel code type at high data rates. It is shown that this approximation is reasonably accurate (less than 0.7 dB at low FERs for the (1784,1/6) code and less than 0.35 dB at low FERs for the (5920,1/6) code). Based on the approximation's accuracy, it is concluded that the effects of imperfect carrier tracking are more or less independent of the channel code type for strong channel codes. Therefore, the advantage that one strong channel code has over another with perfect carrier tracking translates to nearly the same advantage under imperfect carrier tracking conditions. This allows link designers to incorporate the projected performance of strong channel codes into their design tables without worrying about their behavior in the face of imperfect carrier phase tracking.

  14. Average gene length is highly conserved in prokaryotes and eukaryotes and diverges only between the two kingdoms.

    Science.gov (United States)

    Xu, Lin; Chen, Hong; Hu, Xiaohua; Zhang, Rongmei; Zhang, Ze; Luo, Z W

    2006-06-01

    The average length of genes in a eukaryote is larger than in a prokaryote, implying that the evolution of complexity is related to changes in gene length. Here, we show that although the average lengths of genes in prokaryotes and eukaryotes differ markedly, average gene length is highly conserved within each of the two kingdoms. This suggests that natural selection has set a strong limitation on gene elongation within each kingdom. Furthermore, average gene size adds another distinct characteristic for discriminating between the two kingdoms of organisms.

  15. On optimal control problem for conservation law modelling one class of highly re-entrant production systems

    Science.gov (United States)

    D'Apice, Ciro; Kogut, Peter I.

    2017-07-01

    We discuss the optimal control problem stated as the minimization in the L2-sense of the mismatch between the actual out-flux and a demand forecast for a hyperbolic conservation law that models a highly re-entrant production system. The output of the factory is described as a function of the work in progress and the position of the so-called push-pull point (PPP) where we separate the beginning of the factory employing a push policy from the end of the factory, which uses a pull policy.

  16. Hearing sensitivity in context: Conservation implications for a highly vocal endangered species

    Directory of Open Access Journals (Sweden)

    Megan A. Owen

    2016-04-01

    Hearing sensitivity is a fundamental determinant of a species' vulnerability to anthropogenic noise; however, little is known about the hearing capacities of most conservation-dependent species. When audiometric data are integrated with other aspects of a species' acoustic ecology, life history, and characteristic habitat topography and soundscape, predictions can be made regarding its probable vulnerability to the negative impacts of different types of anthropogenic noise. Here we used an adaptive psychoacoustic technique to measure hearing thresholds in the endangered giant panda, a species that uses acoustic communication to coordinate reproduction. Our results suggest that giant pandas have functional hearing into the ultrasonic range, with good sensitivity between 10.0 and 16.0 kHz and best sensitivity measured at 12.5–14.0 kHz. We estimated the lower and upper limits of functional hearing as 0.10 and 70.0 kHz, respectively. While these results suggest that panda hearing is similar to that of some other terrestrial carnivores, panda hearing thresholds above 14.0 kHz were significantly lower (i.e., more sensitive) than those of the polar bear, the only other bear species for which data are available. We discuss the implications of this divergence, as well as the relationship between hearing sensitivity and the spectral parameters of panda vocalizations. We suggest that these data, placed in context, can be used towards the development of a sensory-based model of noise disturbance for the species.

  17. On conservative models of "the pair-production anomaly" in blazar spectra at Very High Energies

    CERN Document Server

    Dzhatdoev, T A

    2015-01-01

    For some blazars, the gamma-ray absorption features due to pair production on the Extragalactic Background Light (EBL) are fainter than expected. The present work reviews the main models that could explain this paradox, with emphasis on conservative ones that do not invoke any new physics. Models intrinsic to the source allow a very hard primary spectrum but fail to explain the regular redshift dependence of the energy at which the anomaly begins. The model that includes a contribution from secondary photons produced by cosmic rays (CR) near the Earth seems to require a well-collimated CR beam, which is hard to achieve. Finally, the model with secondary photons produced in electromagnetic (EM) cascades initiated by primary gamma-rays is considered. In principle, it can decrease the statistical significance of the anomaly and, while requiring a quite low EGMF strength B, does not contradict most contemporary constraints on the B value. Additionally, it is shown that the recently observed correlati...

  18. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Science.gov (United States)

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficient vector as the header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficient vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficient matrix generation method that guarantees no linear dependency in the generated coefficient matrix. Using the proposed framework, each peer encapsulates one instead of n coefficient entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficient matrix using a small number of simple arithmetic operations, so peers sustain very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
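For reference, the RNC baseline that MATIN improves on can be sketched in a few lines: random coefficient vectors combine the source blocks, and a receiver recovers them by Gauss-Jordan elimination on the coefficient matrix. The toy below works over GF(2) rather than the GF(2^8) typical of deployed systems.

```python
import numpy as np

def encode(blocks, n_packets, rng):
    """Each coded packet = random GF(2) coefficient vector + combined payload."""
    coeffs = rng.integers(0, 2, size=(n_packets, len(blocks)), dtype=np.uint8)
    payloads = coeffs @ blocks % 2
    return coeffs, payloads

def decode(coeffs, payloads):
    """Gauss-Jordan elimination over GF(2); this per-generation matrix solve
    and the linear-dependency check are the costs MATIN is designed to avoid."""
    A = np.concatenate([coeffs, payloads], axis=1).astype(np.uint8)
    n = coeffs.shape[1]
    row = 0
    for col in range(n):
        piv = next((r for r in range(row, A.shape[0]) if A[r, col]), None)
        if piv is None:
            return None                      # coefficients linearly dependent
        A[[row, piv]] = A[[piv, row]]        # bring pivot row into place
        for r in range(A.shape[0]):
            if r != row and A[r, col]:
                A[r] ^= A[row]               # eliminate column in other rows
        row += 1
    return A[:n, n:]

rng = np.random.default_rng(1)
blocks = np.array([[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]], dtype=np.uint8)
recovered = None
while recovered is None:                     # redraw on a rank-deficient batch
    coeffs, payloads = encode(blocks, n_packets=6, rng=rng)
    recovered = decode(coeffs, payloads)
```

The redraw loop mimics a peer waiting for more innovative packets; MATIN's contribution is precisely to make that dependency check and redraw unnecessary.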

  20. Simple runtime high energy photon emission for ultra relativistic laser-plasma interaction in a PIC-code

    CERN Document Server

    Wallin, Erik; Marklund, Mattias

    2014-01-01

    We model the emission of high energy photons by relativistic particles in a plasma interacting with a super-intense laser. This is done in a particle-in-cell code, where the high frequency radiation normally cannot be resolved due to the unattainable demands it would place on the time and space resolution. A simple expression for the synchrotron radiation spectrum is used together with a Monte-Carlo method for the emission. We extend previous work by accounting for acceleration due to arbitrary fields, considering the particles to be in instantaneous circular motion due to an effective magnetic field. Furthermore, we implement noise reduction techniques and present estimates of the validity of the method. Finally, we perform a rigorous comparison to the mechanism of radiation reaction, finding the emitted energy in very good agreement with the radiation reaction loss.
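The runtime emission step amounts to drawing photon energies from a synchrotron spectral shape whenever a particle qualifies. A hedged sketch of that sampling step, using the common approximation S(x) proportional to x^(1/3)*exp(-x) to the synchrotron function (x = omega/omega_c) rather than the exact Bessel-function integral:

```python
import math, random

S_PEAK = (1 / 3) ** (1 / 3) * math.exp(-1 / 3)   # S(x) peaks at x = 1/3

def sample_photon(rng, x_max=20.0):
    """Rejection-sample x = omega/omega_c from S(x) = x^(1/3)*exp(-x)
    against a uniform envelope; the spectrum beyond x_max is negligible."""
    while True:
        x = rng.uniform(1e-9, x_max)
        if rng.uniform(0.0, S_PEAK) <= x ** (1 / 3) * math.exp(-x):
            return x

rng = random.Random(0)
samples = [sample_photon(rng) for _ in range(2000)]
mean_x = sum(samples) / len(samples)   # analytic mean of this shape is 4/3
```

In a PIC loop, x would then be rescaled by each particle's critical frequency omega_c computed from the effective magnetic field.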

  1. Sequence and expression pattern of pax-6 are highly conserved between zebrafish and mice.

    Science.gov (United States)

    Püschel, A W; Gruss, P; Westerfield, M

    1992-03-01

    Despite obvious differences in the patterns of early embryonic development, vertebrates share a number of developmental mechanisms and control genes, suggesting that they use similar genetic programs at some stages of development. To examine this idea, we isolated and characterized one such gene, pax-6, a member of the pax gene family, from the zebrafish Brachydanio rerio and determined the evolutionary conservation in the structure and expression of this gene by comparison to its homolog in mice. We found two alternatively spliced forms of the zebrafish pax-6 message. Sequence and expression pattern of the zebrafish pax-6 gene are remarkably similar to its murine homolog. pax-6 expression begins during early neurulation. A stripe of cells in the neuroectoderm, including the prospective diencephalon and a part of the telencephalon, expresses pax-6 as well as the hindbrain and the ventral spinal cord extending from the level of the first rhombomere to the posterior end of the CNS. During later development more limited regions of the brain including the eye, the olfactory bulb and the pituitary gland express pax-6. Cells at the midbrain-hindbrain junction express eng genes and are separated from the neighboring pax-6 regions by several cells that express neither gene, indicating a complex subdivision of this region. pax-6 expression appears during processes when cell-to-cell signalling is thought to be important, for example during induction of the eye and regionalization of the spinal cord and brain, suggesting that it may be one component mediating the response to inductive interactions.

  2. Mechanisms regulating GLUT4 transcription in skeletal muscle cells are highly conserved across vertebrates.

    Science.gov (United States)

    Marín-Juez, Rubén; Diaz, Mónica; Morata, Jordi; Planas, Josep V

    2013-01-01

    The glucose transporter 4 (GLUT4) plays a key role in glucose uptake in insulin target tissues. This transporter has been extensively studied in many species in terms of its function, expression and cellular traffic, and complex mechanisms are involved in its regulation at many different levels. However, studies investigating the transcription of the GLUT4 gene and its regulation are scarce. In this study, we have identified the GLUT4 gene in a teleost fish, the Fugu (Takifugu rubripes), and have cloned and characterized a functional promoter of this gene for the first time in a non-mammalian vertebrate. In silico analysis of the Fugu GLUT4 promoter identified potential binding sites for transcription factors such as SP1, C/EBP, MEF2, KLF, SREBP-1c and GC-boxes, as well as a CpG island, but failed to identify a TATA box. In vitro analysis revealed three transcription start sites, with the main one residing 307 bp upstream of the ATG codon. Deletion analysis determined that the core promoter was located between nucleotides -132/+94. By transfecting a variety of 5´ deletion constructs into L6 muscle cells we have determined that Fugu GLUT4 promoter transcription is regulated by insulin, PG-J2, a PPARγ agonist, and electrical pulse stimulation. Furthermore, our results suggest the involvement of motifs such as PPARγ/RXR and HIF-1α in the regulation of Fugu GLUT4 promoter activity by PPARγ and contractile activity, respectively. These data suggest that the characteristics and regulation of the GLUT4 promoter have been remarkably conserved during the evolution from fish to mammals, further evidencing the important role of GLUT4 in metabolic regulation in vertebrates.

  3. Conserved host response to highly pathogenic avian influenza virus infection in human cell culture, mouse and macaque model systems

    Directory of Open Access Journals (Sweden)

    McDermott Jason E

    2011-11-01

    Background: Understanding the host response to influenza virus infection will facilitate development of better diagnoses and therapeutic interventions. Several different experimental models have been used as a proxy for human infection, including cell cultures derived from human cells, mice, and non-human primates. Each of these systems has been studied extensively in isolation, but little effort has been directed toward systematically characterizing the conservation of host response on a global level beyond known immune signaling cascades. Results: In the present study, we employed a multivariate modeling approach to characterize and compare the transcriptional regulatory networks between these three model systems after infection with a highly pathogenic avian influenza virus of the H5N1 subtype. Using this approach we identified functions and pathways that display similar behavior and/or regulation, including the well-studied impact on the interferon response and the inflammasome. Our results also suggest a primary response role for airway epithelial cells in initiating hypercytokinemia, which is thought to contribute to the pathogenesis of H5N1 viruses. We further demonstrate that we can use a transcriptional regulatory model from the human cell culture data to make highly accurate predictions about the behavior of important components of the innate immune system in tissues from whole organisms. Conclusions: This is the first demonstration of a global regulatory network modeling conserved host response between in vitro and in vivo models.

  4. Landscape genetics informs mesohabitat preference and conservation priorities for a surrogate indicator species in a highly fragmented river system.

    Science.gov (United States)

    Lean, J; Hammer, M P; Unmack, P J; Adams, M; Beheregaray, L B

    2017-04-01

    Species with poor dispersal represent conservative benchmarks for biodiversity management because they provide insights into ecological processes influenced by habitat fragmentation that are less evident in more dispersive organisms. Here we used the poorly dispersing and threatened river blackfish (Gadopsis marmoratus) as a surrogate indicator system for assessing the effects of fragmentation in highly modified river basins and for prioritizing basin-wide management strategies. We combined individual-, population- and landscape-based approaches to analyze genetic variation in samples spanning the distribution of the species in Australia's Murray-Darling Basin, one of the world's most degraded freshwater systems. Our results indicate that G. marmoratus displays the hallmarks of severe habitat fragmentation, with notably scattered, small and demographically isolated populations with very low genetic diversity, a pattern found not only between regions and catchments but also between streams within catchments. By using hierarchically nested population sampling and assessing relationships between genetic uniqueness and genetic diversity across populations, we developed a spatial management framework that includes the selection of populations in need of genetic rescue. Landscape genetics provided an environmental criterion to identify associations between landscape features and ecological processes. Our results further our understanding of the impact that habitat quality and quantity have on habitat specialists with similarly low dispersal. They should also have practical applications for prioritizing both large- and small-scale conservation management actions for organisms inhabiting highly fragmented ecosystems.

  5. Evaluation of the Knowledge Level of High School Students in the João Pessoa City with Visual Impairment About Unified Chemistry and Mathematics Braille Code

    Directory of Open Access Journals (Sweden)

    João Batista M. de Resende Filho

    2013-06-01

    The Braille Code is very important in the educational process of students with visual impairment, because this writing system gives these students access to knowledge in a systematic, organized written form. The students need to know the symbols and the standards for their use, so that they can write, read and understand the contents. The Unified Chemistry and Mathematics Braille Codes are fundamentally important for these students to understand texts on these subjects. This study aimed to evaluate the knowledge level of high school students with visual impairment in the city of João Pessoa regarding the Unified Chemistry and Mathematics Braille Codes. The students showed relatively good knowledge of the symbols and rules of the Unified Mathematics Braille Code, in most cases having difficulty only in recognizing unusual symbols. The students' main difficulties were related to the Unified Chemistry Braille Code: they did not know many of its symbols and writing rules, presenting a low knowledge level.

  6. Dress Codes Blues: An Exploration of Urban Students' Reactions to a Public High School Uniform Policy

    Science.gov (United States)

    DaCosta, Kneia

    2006-01-01

    This qualitative investigation explores the responses of 22 U.S. urban public high school students when confronted with their newly imposed school uniform policy. Specifically, the study assessed students' appraisals of the policy along with compliance and academic performance. Guided by ecological human development perspectives and grounded in…

  7. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies this hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
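Unique decipherability itself, the baseline notion that coding partitions relax, is decidable for finite codes by the classical Sardinas-Patterson procedure, which chases "dangling suffixes" until one of them equals a codeword (not UD) or no new suffixes appear (UD):

```python
def is_uniquely_decipherable(code):
    """Sardinas-Patterson test for a finite code given as a set of strings."""
    code = set(code)
    # C_1: dangling suffixes left when one codeword is a proper prefix of another
    current = {v[len(u):] for u in code for v in code
               if u != v and v.startswith(u)}
    seen = set()
    while current:
        if current & code:          # a dangling suffix equals a codeword: not UD
            return False
        seen |= current
        nxt = set()
        for s in current:
            for c in code:
                if c != s and c.startswith(s):
                    nxt.add(c[len(s):])
                if s != c and s.startswith(c):
                    nxt.add(s[len(c):])
        current = nxt - seen        # finitely many suffixes, so this terminates
    return True

ud = is_uniquely_decipherable({"0", "01", "11"})      # suffix-free, hence UD
not_ud = is_uniquely_decipherable({"0", "01", "10"})  # "010" decodes two ways
```

Codes failing this test are exactly the ones for which the coding-partition machinery of the abstract becomes interesting.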

  8. GPU Implementation of Two-Dimensional Rayleigh-Benard Code with High Resolution and Extremely High Rayleigh Number

    Science.gov (United States)

    Gonzalez, C. M.; Sanchez, D. A.; Yuen, D. A.; Wright, G. B.; Barnett, G. A.

    2010-12-01

    As computational modeling became prolific throughout the physical sciences community, newer and more efficient ways of processing large amounts of data needed to be devised. One particular method arose in the form of using a graphics processing unit (GPU) for calculations. Computational scientists were attracted to the GPU as a computational tool as the performance, growth, and availability of GPUs increased over the past decade, and began to utilize the GPU as the sole workhorse for their brute-force calculations and modeling. GPUs, however, were not originally designed for this style of use, so putting them to scientific work was initially difficult; a lack of parallel programming routines was the main culprit. With time and a rise in popularity, NVIDIA released its Fermi architecture, which, used in conjunction with development tools such as CUDA, gave the programmer easy access to routines that made parallel programming with NVIDIA GPUs straightforward, including fast memory, double-precision support, and large amounts of global memory. Our model was based on a second-order, spatially accurate finite difference method and a third-order Runge-Kutta time-stepping scheme for studying 2D Rayleigh-Benard convection. The code extensively used CUBLAS routines for the heavy linear algebra calculations. The calculations were completed on a single GPU, the NVIDIA C2070 Fermi, which has 6 GB of global memory. The overall scientific goal of our work was to apply the Tesla C2070's computing potential to capture the onset of flow reversals at increasingly large Rayleigh numbers. Previous investigations were successful using a smaller grid size of 1000x1999 and a Rayleigh number of 10^9.

  9. Proteome-wide mapping of the Drosophila acetylome demonstrates a high degree of conservation of lysine acetylation

    DEFF Research Database (Denmark)

    Weinert, Brian T; Wagner, Sebastian A; Horn, Heiko

    2011-01-01

    were significantly more conserved than were nonacetylated lysines. Bioinformatics analysis using Gene Ontology terms suggested that the proteins with conserved acetylation control cellular processes such as protein translation, protein folding, DNA packaging, and mitochondrial metabolism. We found...

  10. Parameter analysis for a high-gain harmonic generation FEL using a recently developed 3D polychromatic code

    CERN Document Server

    Biedron, S G; Yu, L H

    2000-01-01

    One possible design for a fourth-generation light source is the high-gain harmonic generation (HGHG) free-electron laser (FEL). Here, a coherent seed with a wavelength at a subharmonic of the desired output radiation interacts with the electron beam in an energy-modulating section. This energy modulation is then converted into spatial bunching while traversing a dispersive section (a three-dipole chicane). The final step is passage through an undulator tuned to the desired higher harmonic output wavelength. The coherent seed serves to suppress and can be at a much lower subharmonic of the output radiation. Recently, a 3D code that includes multiple frequencies, multiple undulators (both in quantity and/or type), quadrupole magnets, and dipole magnets was developed to easily simulate HGHG. Here, a brief review of the HGHG theory, the code development, the Accelerator Test Facility's (ATF) HGHG FEL experimental parameters, and the parameter analysis from simulations of this specific experiment will be discussed...

  11. Phase-coded multi-pulse technique for ultrasonic high-order harmonic imaging of biological tissues in vitro.

    Science.gov (United States)

    Ma, Qingyu; Zhang, Dong; Gong, Xiufen; Ma, Yong

    2007-04-07

    Second or higher order harmonic imaging shows significant improvement in image clarity but suffers from low signal-to-noise ratio (SNR) compared with fundamental imaging. This paper presents a phase-coded multi-pulse technique to enhance the SNR of the desired high-order harmonic in ultrasonic imaging. In this technique, with N phase-coded pulses in the excitation, the received Nth harmonic signal is enhanced by 20 log10(N) dB compared with the single-pulse mode, whereas the fundamental and other harmonic components are efficiently suppressed to reduce image confusion. The principle of the technique is discussed theoretically, based on the theory of finite amplitude sound waves, and examined by measurements of the axial and lateral beam profiles as well as the phase shift of the harmonics. In experimental imaging of two biological tissue specimens, a plane piston source at 2 MHz was used to transmit a sequence of multiple pulses with equidistant phase shifts. The second to fifth harmonic images were obtained using this technique with N = 2 to 5 and compared with images obtained at the fundamental frequency. The results demonstrate that this higher-order harmonic technique provides better resolution and contrast in ultrasonic images.
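The cancellation behind the technique is easy to verify numerically: with N transmit phases 2*pi*k/N, the m-th harmonic of each echo inherits phase m*2*pi*k/N, so summing the N echoes cancels every harmonic except multiples of N while the N-th adds coherently (the 20*log10(N) dB gain quoted above). A toy echo model with hypothetical harmonic amplitudes:

```python
import numpy as np

N = 4                                   # number of phase-coded pulses
f0, fs, n = 2e6, 1e9, 5000              # 2 MHz fundamental, 5 us record
t = np.arange(n) / fs
amps = [1.0, 0.5, 0.25, 0.12, 0.06]     # harmonics 1..5, hypothetical

def echo(phase):
    """Received echo: fundamental plus harmonics, the m-th inheriting m*phase."""
    return sum(a * np.cos(m * (2 * np.pi * f0 * t + phase))
               for m, a in enumerate(amps, start=1))

# Average the N phase-coded receptions; only harmonics m = 0 (mod N) survive.
summed = sum(echo(2 * np.pi * k / N) for k in range(N)) / N

spec = np.abs(np.fft.rfft(summed))
freqs = np.fft.rfftfreq(n, d=1 / fs)

def peak(m):
    """Spectral magnitude at the m-th harmonic bin."""
    return spec[np.argmin(np.abs(freqs - m * f0))]
```

Here the record length is an integer number of fundamental periods, so the surviving 4th-harmonic line sits exactly on an FFT bin while the suppressed harmonics fall to numerical noise.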

  12. Highly conserved asparagine 82 controls the interaction of Na+ with the sodium-coupled neutral amino acid transporter SNAT2.

    Science.gov (United States)

    Zhang, Zhou; Gameiro, Armanda; Grewer, Christof

    2008-05-01

    The neutral amino acid transporter 2 (SNAT2), which belongs to the SLC38 family of solute transporters, couples the transport of amino acid to the cotransport of one Na(+) ion into the cell. Several polar amino acids are highly conserved within the SLC38 family. Here, we mutated three of these conserved amino acids, Asn(82) in the predicted transmembrane domain 1 (TMD1), Tyr(337) in TMD7, and Arg(374) in TMD8; and we studied the functional consequences of these modifications. The mutation of N82A virtually eliminated the alanine-induced transport current, as well as amino acid uptake by SNAT2. In contrast, the mutations Y337A and R374Q did not abolish amino acid transport. The K(m) of SNAT2 for its interaction with Na(+), K(Na(+)), was dramatically reduced by the N82A mutation, whereas the more conservative mutation N82S resulted in a K(Na(+)) that was in between SNAT2(N82A) and SNAT2(WT). These results were interpreted as a reduction of Na(+) affinity caused by the Asn(82) mutations, suggesting that these mutations interfere with the interaction of SNAT2 with the sodium ion. As a consequence of this dramatic reduction in Na(+) affinity, the apparent K(m) of SNAT2(N82A) for alanine was increased 27-fold compared with that of SNAT2(WT). Our results demonstrate a direct or indirect involvement of Asn(82) in Na(+) coordination by SNAT2. Therefore, we predict that TMD1 is crucial for the function of SLC38 transporters and that of related families.

  13. Integration of carbon conservation into sustainable forest management using high resolution satellite imagery: A case study in Sabah, Malaysian Borneo

    Science.gov (United States)

    Langner, Andreas; Samejima, Hiromitsu; Ong, Robert C.; Titin, Jupiri; Kitayama, Kanehiro

    2012-08-01

    Conservation of tropical forests is of outstanding importance for mitigation of climate change effects and preserving biodiversity. In Borneo most of the forests are classified as permanent forest estates and are selectively logged using conventional logging techniques, causing high damage to the forest ecosystems. Incorporation of sustainable forest management into climate change mitigation measures such as Reducing Emissions from Deforestation and Forest Degradation (REDD+) can help to avert further forest degradation by synergizing sustainable timber production with the conservation of biodiversity. In order to evaluate the efficiency of such initiatives, monitoring methods for forest degradation and above-ground biomass in tropical forests are urgently needed. In this study we developed an index using Landsat satellite data to describe the crown cover condition of lowland mixed dipterocarp forests. We showed that this index, combined with field data, can be used to estimate above-ground biomass using a regression model in two permanent forest estates in Sabah, Malaysian Borneo. Tangkulap represented a conventionally logged forest estate, while Deramakot has been managed in accordance with sustainable forestry principles. The results revealed that the conventional logging techniques used in Tangkulap between 1991 and 2000 decreased the above-ground biomass by an average of 6.0 t C/ha per year (5.2 to 7.0 t C/ha, 95% confidence interval), whereas the biomass in Deramakot increased by 6.1 t C/ha per year (5.3-7.2 t C/ha, 95% confidence interval) between 2000 and 2007 while under sustainable forest management. This indicates that sustainable forest management with reduced-impact logging helps to protect above-ground biomass. In absolute terms, a conservative amount of 10.5 t C/ha per year, as documented using the methodology developed in this study, can be attributed to the different management systems, which will be of interest when implementing REDD+ that

  14. Implementation of Energy Code Controls Requirements in New Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hart, Philip R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hatten, Mike [Solarc Energy Group, LLC, Seattle, WA (United States); Jones, Dennis [Group 14 Engineering, Inc., Denver, CO (United States); Cooper, Matthew [Group 14 Engineering, Inc., Denver, CO (United States)

    2017-03-24

    Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement, and verification is beyond the expertise of most building code officials, yet studies that measure the savings from energy codes assume they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high-impact controls requirements included in commercial energy codes are properly designed, commissioned and implemented in new buildings, and the degree to which these control requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to better understand their activities as they relate to energy-code-required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code-required control measures are being designed, commissioned, correctly implemented and functioning in new buildings. The third step was compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.

  15. Calculations of differential spacecraft charging in high and low Earth orbits using COULOMB-2 code

    Science.gov (United States)

    Novikov, Lev; Makletsov, Andrei; Sinolits, Vadim

    2016-07-01

    In this paper, we discuss the main physical quantities determining the principal features of spacecraft charging in high and low Earth orbits: characteristic values of the primary currents of magnetospheric plasma particles, peculiarities of the angular distributions of the various particle currents, and typical values of secondary emission currents for a number of spacecraft construction materials. Methods for computing the electrostatic potential distribution over a non-uniform, complex-shaped spacecraft surface, as used in the COULOMB-2 program package for high (GEO) and low (LEO) orbits, are described. The physical approximations necessary for calculating the primary plasma particle currents, which allow analytical expressions similar to the Langmuir current formulas to be used in the case of high surface charging, are discussed for both GEO and LEO. The electrostatic potential distribution over the spacecraft surface is determined by numerically solving the system of nonlinear algebraic equations corresponding to the established current balance on each of the discrete elements (2,000-5,000 elements) of the spacecraft surface. The analytical approach noted above yields the stationary potential distribution in a rather short computation time, which makes it possible to obtain results for a large number of orientations of the influencing factors. Typical electric potential distributions over the surfaces of modern GEO and LEO spacecraft are presented, and the principal features of these distributions, determined by the specific charging conditions in GEO and LEO, are discussed.
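The per-element current balance described above can be illustrated on a single surface element. The sketch below is not the COULOMB-2 algorithm; it uses a deliberately simplified, normalized Langmuir-type model (constant attracted-ion current, retarded-Maxwellian electron current, invented parameter values) and finds the floating potential by bisection:

```python
import math

def floating_potential(je0, ji0, te, lo=-50.0, hi=0.0, tol=1e-9):
    """Bisection solve of the current balance je0*exp(phi/te) = ji0
    for the (negative) floating potential phi of one surface element.
    Electron current: retarded Maxwellian (Langmuir-type form);
    ion current: constant -- a deliberately crude approximation."""
    f = lambda phi: ji0 - je0 * math.exp(phi / te)   # net current onto element
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:      # net positive current: potential must rise
            lo = mid
        else:               # net negative current: potential must fall
            hi = mid
    return 0.5 * (lo + hi)

# normalized, illustrative values (Te = 1, electron current 100x ion current)
phi = floating_potential(je0=1.0, ji0=0.01, te=1.0)
```

In a full multi-element solver, the per-element balances are coupled through the surface mesh, turning this scalar root-find into the system of nonlinear algebraic equations mentioned in the abstract.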

  16. Identifying putative breast cancer-associated long intergenic non-coding RNA loci by high density SNP array analysis

    Directory of Open Access Journals (Sweden)

    Zhengyu eJiang

    2012-12-01

    Full Text Available Recent high-throughput transcript discoveries have yielded a growing recognition of long intergenic non-coding RNAs (lincRNAs), a class of arbitrarily defined transcripts (>200 nt) that are primarily produced from the intergenic space. LincRNAs have been increasingly acknowledged for their expressional dynamics and likely functional associations with cancers. However, differential gene dosage of lincRNA genes between cancer genomes is less studied. Using the high-density Human Omni5-Quad BeadChips (Illumina), we investigated genomic copy number aberrations in a set of seven tumor-normal paired primary human mammary epithelial cells (HMECs) established from patients with invasive ductal carcinoma. This BeadChip platform includes a total of 2,435,915 SNP loci dispersed at an average interval of ~700 nt throughout the intergenic region of the human genome. We mapped annotated or putative lincRNA genes to a subset of 332,539 SNP loci, which were included in our analysis of lincRNA-associated copy number variations (CNVs). We identified 122 lincRNAs affected by somatic CNVs, with overlapping aberrations ranging from 0.14% to 100% of their length. LincRNA-associated aberrations were detected predominantly as copy number losses and clustered preferentially toward the ends of chromosomes. Interestingly, lincRNA genes appear to be much less susceptible to CNVs than either protein-coding or intergenic regions (CNV-affected segments: 1.8%, 37.5%, and 60.6%, respectively). In summary, our study established a novel approach utilizing high-resolution SNP arrays to identify lincRNA candidates that could be functionally linked to tumorigenesis, providing new strategies for the diagnosis and treatment of breast cancer.
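At its core, mapping CNV segments onto lincRNA loci and reporting the percent of each locus affected reduces to interval arithmetic. A minimal sketch with hypothetical coordinates and gene names (not the authors' pipeline):

```python
def overlap_fraction(gene, segment):
    """Fraction of a gene interval (start, end) covered by an
    aberrant copy-number segment, both as half-open coordinates."""
    start = max(gene[0], segment[0])
    end = min(gene[1], segment[1])
    return max(0, end - start) / (gene[1] - gene[0])

# hypothetical lincRNA loci and one copy-number-loss segment
lincrnas = {"lincA": (1_000, 5_000), "lincB": (8_000, 9_000)}
cnv_loss = (4_000, 12_000)

# report only loci touched by the aberration, with the affected fraction
affected = {name: overlap_fraction(iv, cnv_loss)
            for name, iv in lincrnas.items()
            if overlap_fraction(iv, cnv_loss) > 0}
```

Here `lincA` is 25% affected and `lincB` fully contained in the loss, mirroring the 0.14%-100% overlap range reported in the abstract.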

  17. High grade angiosarcoma fifteen years after breast conservation therapy with radiation therapy: A case report

    Directory of Open Access Journals (Sweden)

    William Boyan, Jr.

    2014-01-01

    CONCLUSION: Diagnosis of secondary breast angiosarcoma requires frequent follow-ups and a high index of suspicion. With mastectomy giving the best chance of successful treatment in these cases, early detection is crucial in this rare sequela.

  18. High Elevation Refugia for Bombus terricola (Hymenoptera: Apidae) Conservation and Wild Bees of the White Mountain National Forest

    Science.gov (United States)

    Tucker, Erika M.; Rehan, Sandra M.

    2017-01-01

    Many wild bee species are in global decline, yet much is still unknown about their diversity and contemporary distributions. National parks and forests offer unique areas of refuge important for the conservation of rare and declining species populations. Here we present the results of the first biodiversity survey of the bee fauna in the White Mountain National Forest (WMNF). More than a thousand specimens were collected from pan and sweep samples representing 137 species. Three species were recorded for the first time in New England and an additional seven species were documented for the first time in the state of New Hampshire. Four introduced species were also observed in the specimens collected. A checklist of the species found in the WMNF, as well as those found previously in Strafford County, NH, is included with new state records and introduced species noted as well as a map of collecting locations. Of particular interest was the relatively high abundance of Bombus terricola Kirby 1837 found in many of the higher elevation collection sites and the single specimen documented of Bombus fervidus (Fabricius 1798). Both of these bumble bee species are known to have declining populations in the northeast and are categorized as vulnerable on the International Union for Conservation of Nature’s Red List. PMID:28130453

  19. Low genetic diversity and high genetic differentiation in the critically endangered Omphalogramma souliei (Primulaceae): implications for its conservation

    Institute of Scientific and Technical Information of China (English)

    Yuan HUANG; Chang-Qin ZHANG; De-Zhu LI

    2009-01-01

    Omphalogramma souliei Franch. is an endangered perennial herb distributed only in alpine areas of SW China. ISSR markers were applied to determine the genetic variation and genetic structure of 60 individuals from three populations of O. souliei in NW Yunnan, China. Genetic diversity at the species level is low, with P = 42.5% (percentage of polymorphic bands) and Hsp = 0.1762 (total genetic diversity). However, a high level of genetic differentiation among populations was detected based on different measures (Nei's genetic diversity analysis: Gst = 0.6038; AMOVA analysis: Fst = 0.6797). The low level of genetic diversity within populations and the significant genetic differentiation among populations might be due to the mixed mating system of O. souliei, in which xenogamy predominates and autogamy plays a secondary role. Genetic drift due to small population size and limited current gene flow also contributed to the significant genetic differentiation. The assessment of genetic variation and differentiation in this endangered species provides important information for conservation on a genetic basis. Conservation strategies for this rare endemic species are proposed.
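Nei's Gst, one of the differentiation measures cited above, is defined as (Ht - Hs)/Ht, where Hs is the mean within-population expected heterozygosity and Ht the expected heterozygosity of the pooled populations. A toy single-locus sketch (invented allele frequencies, equal population sizes assumed):

```python
def expected_het(p):
    # expected heterozygosity of a biallelic locus at allele frequency p
    return 2 * p * (1 - p)

def gst(pop_freqs):
    """Nei's Gst = (Ht - Hs) / Ht for one biallelic locus,
    assuming equal population sizes (illustrative only)."""
    hs = sum(expected_het(p) for p in pop_freqs) / len(pop_freqs)
    p_bar = sum(pop_freqs) / len(pop_freqs)   # pooled allele frequency
    ht = expected_het(p_bar)
    return (ht - hs) / ht

# two strongly differentiated populations: nearly fixed for opposite alleles
g = gst([0.9, 0.1])
```

With frequencies 0.9 and 0.1, Hs = 0.18 and Ht = 0.5, giving Gst = 0.64, the same order of differentiation reported for O. souliei.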

  20. Ice-binding site of snow mold fungus antifreeze protein deviates from structural regularity and high conservation.

    Science.gov (United States)

    Kondo, Hidemasa; Hanada, Yuichi; Sugimoto, Hiroshi; Hoshino, Tamotsu; Garnham, Christopher P; Davies, Peter L; Tsuda, Sakae

    2012-06-12

    Antifreeze proteins (AFPs) are found in organisms ranging from fish to bacteria, where they serve different functions to facilitate survival of their host. AFPs that protect freeze-intolerant fish and insects from internal ice growth bind to ice using a regular array of well-conserved residues/motifs. Less is known about the role of AFPs in freeze-tolerant species, which might be to beneficially alter the structure of ice in or around the host. Here we report the 0.95-Å high-resolution crystal structure of a 223-residue secreted AFP from the snow mold fungus Typhula ishikariensis. Its main structural element is an irregular β-helix with six loops of 18 or more residues that lies alongside an α-helix. β-Helices have independently evolved as AFPs on several occasions and seem ideally structured to bind to several planes of ice, including the basal plane. A novelty of the β-helical fold is the nonsequential arrangement of loops that places the N- and C-termini inside the solenoid of β-helical coils. The ice-binding site (IBS), which could not be predicted from sequence or structure, was located by site-directed mutagenesis on the flattest surface of the protein. It is remarkable for its lack of regularity and its poor conservation in homologs from psychrophilic diatoms, bacteria, and other fungi.

  1. Catchment-scale conservation units identified for the threatened Yarra pygmy perch (Nannoperca obscura) in highly modified river systems.

    Directory of Open Access Journals (Sweden)

    Chris J Brauer

    Full Text Available Habitat fragmentation caused by human activities alters metapopulation dynamics and decreases biological connectivity through reduced migration and gene flow, leading to lowered levels of population genetic diversity and to local extinctions. The threatened Yarra pygmy perch, Nannoperca obscura, is a poor disperser found in small, isolated populations in wetlands and streams of southeastern Australia. Modifications to natural flow regimes in anthropogenically impacted river systems have recently reduced the amount of habitat for this species and likely further limited its opportunity to disperse. We employed highly resolving microsatellite DNA markers to assess genetic variation, population structure, and the spatial scale at which dispersal takes place across the distribution of this freshwater fish, and used this information to identify conservation units for management. The levels of genetic variation found for N. obscura are amongst the lowest reported for a fish species (mean heterozygosity of 0.318 and mean allelic richness of 1.92). We identified very strong population genetic structure, little to no evidence of recent migration among demes, and a minimum of 11 units for conservation management, hierarchically nested within four major genetic lineages. A combination of spatial analytical methods revealed hierarchical genetic structure corresponding with catchment boundaries and also demonstrated significant isolation by riverine distance. Our findings have implications for the national recovery plan of this species by demonstrating that N. obscura populations should be managed at a catchment level and highlighting the need to restore habitat and avoid further alteration of the natural hydrology.

  2. A high order special relativistic hydrodynamic code with space-time adaptive mesh refinement

    CERN Document Server

    Zanotti, Olindo

    2013-01-01

    We present a high order one-step ADER-WENO finite volume scheme with space-time adaptive mesh refinement (AMR) for the solution of the special relativistic hydrodynamics equations. By adopting a local discontinuous Galerkin predictor method, a high order one-step time discretization is obtained, with no need for Runge-Kutta sub-steps. This turns out to be particularly advantageous in combination with space-time adaptive mesh refinement, which has been implemented following a "cell-by-cell" approach. As in existing second order AMR methods, the present higher order AMR algorithm also features time-accurate local time stepping (LTS), where grids on different spatial refinement levels are allowed to use different time steps. We also compare two different Riemann solvers for the computation of the numerical fluxes at the cell interfaces. The new scheme has been validated over a sample of numerical test problems in one, two and three spatial dimensions, exploring its ability to resolve the propagation of relativ...
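Time-accurate local time stepping lets each refinement level advance with its own step while all levels resynchronize at coarse-step boundaries. The toy sketch below strips away the ADER-WENO and AMR machinery entirely and shows only the ratio-2 subcycling pattern, on a scalar ODE integrated with forward Euler:

```python
import math

# Two "refinement levels" integrate y' = -y from y(0) = 1 to t = 1.
# The fine level subcycles twice per coarse step, so both levels meet
# at the same synchronization times -- the essence of time-accurate
# local time stepping, with all spatial coupling omitted.
DT = 0.1
y_coarse = y_fine = 1.0
for _ in range(10):
    y_coarse += DT * (-y_coarse)        # one coarse step of size DT
    for _ in range(2):                  # two fine substeps of size DT/2
        y_fine += (DT / 2) * (-y_fine)

exact = math.exp(-1.0)
err_coarse = abs(y_coarse - exact)
err_fine = abs(y_fine - exact)          # roughly half the coarse error
```

In a real LTS scheme the levels additionally exchange fluxes at their interfaces at each synchronization time to preserve conservation; that coupling is deliberately absent here.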

  3. Raptor Codes Performance Analysis on WiMAX Technology with High Speed FFT/IFFT

    Directory of Open Access Journals (Sweden)

    Amit Kumar

    2012-04-01

    Full Text Available There are currently a large variety of wireless access networks, including the emerging vehicular ad hoc networks (VANETs). A large variety of applications utilizing these networks will demand features such as real-time operation, high availability, and in some cases even instantaneous high bandwidth. Therefore, it is imperative for network service providers to make the best possible use of the combined resources of available heterogeneous networks (wireless local area networks (WLANs), Universal Mobile Telecommunications Systems, VANETs, Worldwide Interoperability for Microwave Access (WiMAX), etc.) for connection support. When connections need to migrate between heterogeneous networks for performance and high-availability reasons, seamless vertical handoff (VHO) is a necessary first step. In the near future, vehicular and other mobile applications will be expected to have seamless VHO between heterogeneous access networks. Time-hopping ultra wideband (TH-UWB) and direct-sequence ultra wideband (DS-UWB) systems are among the standards proposed for UWB communications scenarios. A general unified mathematical approach has been proposed for calculating the bit error rate (BER) for both TH-UWB and DS-UWB systems in the presence of multiple-user interference and strong narrow-band interference in a multi-path scenario. Unlike many other mathematical models that provide upper or lower bounds for the BER, this model calculates exact BER values in given scenarios. A partial rake receiver has been chosen as the receiving terminal. The modified Saleh-Valenzuela channel model has been used in this analysis. The model can assess the effect of any given narrow-band interfering system.
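The exact-BER model for UWB cannot be reproduced from the abstract alone, but the generic sanity check such models are validated against is easy to show: a Monte-Carlo BER estimate for BPSK in AWGN, compared with the closed-form 0.5·erfc(√(Eb/N0)). All parameter values below are illustrative:

```python
import math
import random

def ber_bpsk_awgn(ebn0, nbits=200_000, seed=1):
    """Monte-Carlo bit error rate of BPSK over an AWGN channel at a
    given linear Eb/N0.  This is NOT the paper's UWB model -- just the
    standard baseline such simulations are checked against."""
    rng = random.Random(seed)
    sigma = math.sqrt(1.0 / (2.0 * ebn0))   # noise std for unit-energy symbols
    errors = 0
    for _ in range(nbits):
        bit = rng.randrange(2)
        tx = 1.0 if bit else -1.0           # antipodal mapping
        rx = tx + rng.gauss(0.0, sigma)
        if (rx > 0) != bool(bit):           # hard decision at zero threshold
            errors += 1
    return errors / nbits

theory = 0.5 * math.erfc(math.sqrt(4.0))    # closed form at Eb/N0 = 4 (6 dB)
sim = ber_bpsk_awgn(4.0)
```

With 200,000 bits the simulated BER lands within a few percent of the ~2.3e-3 theoretical value, which is the kind of agreement an exact analytical model is expected to reproduce without simulation.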

  4. Modeling Methods for the Main Switch of High Pulsed-Power Facilities Based on Transmission Line Code

    Science.gov (United States)

    Hu, Yixiang; Zeng, Jiangtao; Sun, Fengju; Wei, Hao; Yin, Jiahui; Cong, Peitian; Qiu, Aici

    2014-09-01

    Based on the transmission line code (TLCODE), a circuit model is developed here for the analysis of main switches in high pulsed-power facilities. Taking the structure of the ZR main switch as an example, a circuit model topology of the switch is proposed and, in particular, methods for calculating the dynamic inductance and resistance of the switching arc are described. Moreover, a set of closed equations used to calculate the various node voltages is theoretically derived and numerically discretized. Based on these discrete equations and a Matlab program, a simulation procedure is established for analyzing the ZR main switch. Voltages and currents at different key points are obtained and compared with those of a PSpice L-C model. The two models agree to within 0.1%, which verifies the effectiveness of the TLCODE model to a certain extent.
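The TLCODE equations themselves are not given in the abstract, but the kind of discretized node-voltage update a transmission-line code performs can be sketched with a lossless lumped LC ladder advanced by a leapfrog scheme (all element values invented; this is not the ZR switch model):

```python
import math

# Leapfrog update of a lossless LC ladder: voltages V live on nodes,
# currents I on the branches between them.  With per-section L = C = 1,
# a pulse travels approximately one section per unit time.
N = 400
L_sec = C_sec = 1.0
dt = 0.5                       # Courant number 0.5: stable
V = [0.0] * (N + 1)
I = [0.0] * N                  # I[i] flows from node i to node i+1

for n in range(300):           # advance to t = 150
    t = n * dt
    V[0] = math.exp(-((t - 15.0) / 5.0) ** 2)   # Gaussian drive at node 0
    for i in range(N):
        I[i] += dt / L_sec * (V[i] - V[i + 1])  # branch current update
    for i in range(1, N):
        V[i] += dt / C_sec * (I[i - 1] - I[i])  # node voltage update

peak = max(range(1, N), key=lambda i: V[i])
# the source peaked at t = 15, so at t = 150 the pulse sits near node 135
```

A real TLCODE model augments each node balance with the time-varying arc inductance and resistance, which makes the resulting algebraic system nonlinear, as described in the abstract.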

  5. Benchmarking of 3D space charge codes using direct phase space measurements from photoemission high voltage DC gun

    CERN Document Server

    Bazarov, Ivan V; Gulliford, Colwyn; Li, Yulin; Liu, Xianghong; Sinclair, Charles K; Soong, Ken; Hannon, Fay

    2008-01-01

    We present a comparison between space charge calculations and direct measurements of the transverse phase space for space charge dominated electron bunches after a high voltage photoemission DC gun followed by an emittance compensation solenoid magnet. The measurements were performed using a double-slit setup for a set of parameters such as charge per bunch and the solenoid current. The data is compared with detailed simulations using 3D space charge codes GPT and Parmela3D with initial particle distributions created from the measured transverse and temporal laser profiles. Beam brightness as a function of beam fraction is calculated for the measured phase space maps and found to approach the theoretical maximum set by the thermal energy and accelerating field at the photocathode.

  6. Benchmarking of 3D space charge codes using direct phase space measurements from photoemission high voltage dc gun

    Directory of Open Access Journals (Sweden)

    Ivan V. Bazarov

    2008-10-01

    Full Text Available We present a comparison between space charge calculations and direct measurements of the transverse phase space of space charge dominated electron bunches from a high voltage dc photoemission gun followed by an emittance compensation solenoid magnet. The measurements were performed using a double-slit emittance measurement system over a range of bunch charge and solenoid current values. The data are compared with detailed simulations using the 3D space charge codes GPT and Parmela3D. The initial particle distributions were generated from measured transverse and temporal laser beam profiles at the photocathode. The beam brightness as a function of beam fraction is calculated for the measured phase space maps and found to approach within a factor of 2 the theoretical maximum set by the thermal energy and the accelerating field at the photocathode.
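A transverse phase-space map measured with a double-slit system is typically reduced to a statistical (rms) emittance before brightness is computed. A minimal sketch of that reduction, applied to an invented four-point sample:

```python
import math

def rms_emittance(xs, xps):
    """Statistical (rms) emittance of a sampled phase space:
    eps = sqrt(<x^2><x'^2> - <x x'>^2), using centered second moments."""
    n = len(xs)
    mx = sum(xs) / n
    mxp = sum(xps) / n
    x2 = sum((x - mx) ** 2 for x in xs) / n
    xp2 = sum((xp - mxp) ** 2 for xp in xps) / n
    xxp = sum((x - mx) * (xp - mxp) for x, xp in zip(xs, xps)) / n
    return math.sqrt(x2 * xp2 - xxp ** 2)

# uncorrelated, unit-variance toy sample: emittance is exactly 1
eps = rms_emittance([1.0, -1.0, 1.0, -1.0], [1.0, 1.0, -1.0, -1.0])
```

In practice the double-slit data give intensity weights on an (x, x') grid, so each moment becomes a weighted sum, and brightness versus beam fraction follows by repeating the calculation over successively larger intensity cuts.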

  7. A preliminary neutronic evaluation of high temperature engineering test reactor using the SCALE6 code

    Science.gov (United States)

    Tanure, L. P. A. R.; Sousa, R. V.; Costa, D. F.; Cardoso, F.; Veloso, M. A. F.; Pereira, C.

    2014-02-01

    Neutronic parameters of some fourth-generation nuclear reactors have been investigated at the Departamento de Engenharia Nuclear/UFMG. Previous studies show the possibility of increasing the transmutation capabilities of these fourth-generation systems to achieve a significant reduction in the transuranic content of spent fuel. To validate the studies, a benchmark on core physics analysis related to the initial testing of the High Temperature Engineering Test Reactor, provided by the International Atomic Energy Agency (IAEA), was simulated using the Standardized Computer Analysis for Licensing Evaluation (SCALE) package. The CSAS6/KENO-VI control sequence and the 44-group ENDF/B-V cross-section neutron library were used to evaluate keff (the effective multiplication factor), and the result shows good agreement with the experimental value.

  8. Testing algebraic geometric codes

    Institute of Scientific and Technical Information of China (English)

    CHEN Hao

    2009-01-01

    Property testing was initially studied from various motivations in the 1990s. A code C ⊆ GF(r)^n is locally testable if there is a randomized algorithm which can distinguish with high probability the codewords from a vector essentially far from the code by accessing only a very small (typically constant) number of the vector's coordinates. The problem of testing codes was first studied by Blum, Luby and Rubinfeld and is closely related to probabilistically checkable proofs (PCPs). How to characterize locally testable codes is a complex and challenging problem. Local tests have been studied for Reed-Solomon (RS), Reed-Muller (RM), cyclic, dual of BCH, and the trace subcode of algebraic geometric codes. In this paper we give testers for algebraic geometric codes with linear parameters (as functions of dimensions). We also give a moderate condition under which the family of algebraic geometric codes cannot be locally testable.
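The classic example of such a local test is the Blum-Luby-Rubinfeld (BLR) linearity test, which queries a function at only three correlated points per trial. A sketch over GF(2)^n with two illustrative functions (these are not the algebraic geometric testers of the paper):

```python
import random

def blr_test(f, n, trials=200, seed=0):
    """Blum-Luby-Rubinfeld linearity test over GF(2)^n: accept iff
    f(x) ^ f(y) == f(x ^ y) holds for every sampled pair (x, y).
    A function far from all linear functions fails some trial with
    high probability."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = rng.getrandbits(n)
        y = rng.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False        # witnessed a violated linearity relation
    return True

n = 16
# parity of a fixed bit subset: a genuinely linear function over GF(2)^n
parity = lambda x: bin(x & 0b1010_1010_1010_1010).count("1") % 2
# majority of the bits: far from every linear function
majority = lambda x: int(bin(x).count("1") > n // 2)
```

The linear `parity` passes every trial, while `majority` is rejected after a handful of random queries, illustrating how a tester can reject far-from-code vectors while reading only a constant number of coordinates per trial.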

  9. Chinese remainder codes

    Institute of Scientific and Technical Information of China (English)

    ZHANG Aili; LIU Xiufeng

    2006-01-01

    Chinese remainder codes are constructed by applying weak block designs and the Chinese remainder theorem of ring theory. This new type of linear code takes the congruence classes in the congruence class ring R/I1 ∩ I2 ∩…∩ In as the information bits, embeds R/Ji into R/I1 ∩ I2 ∩…∩ In, and assigns the cosets of R/Ji, as a subring of R/I1 ∩ I2 ∩…∩ In, together with the cosets of R/Ji in R/I1 ∩ I2 ∩…∩ In, as check lines. Many code classes with high code rates exist among the Chinese remainder codes. Chinese remainder codes are an essential generalization of Sun Zi codes.
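The error-control idea behind codes built on the Chinese remainder theorem can be sketched over the integers: encode a message as its residues modulo several pairwise-coprime moduli, keep some moduli as redundancy, and flag any reconstruction that falls outside the legitimate range. This integer redundant-residue sketch (moduli chosen arbitrarily; requires Python 3.8+ for `pow(x, -1, m)`) is an analogue of, not the paper's, ring-theoretic construction:

```python
def crt(residues, moduli):
    """Garner-style CRT reconstruction, folding in one modulus at a time."""
    x, m_prod = 0, 1
    for r, m in zip(residues, moduli):
        # solve x + m_prod * t ≡ r (mod m) for t
        t = ((r - x) * pow(m_prod, -1, m)) % m
        x += m_prod * t
        m_prod *= m
    return x

MODULI = [7, 11, 13, 17, 19]     # pairwise coprime
M_INFO = 7 * 11 * 13             # first 3 moduli carry information: 0 <= m < 1001

def encode(m):
    return [m % q for q in MODULI]           # 2 redundant residues appended

def decode(word):
    m = crt(word, MODULI)
    return m if m < M_INFO else None         # out of range => error detected

word = encode(123)
corrupted = list(word)
corrupted[0] = (corrupted[0] + 1) % MODULI[0]   # single residue error
```

Any single corrupted residue forces the reconstructed value to differ from the message by a multiple of the product of the other four moduli (at least 17,017), pushing it out of the 0-1000 information range, so single errors are always detected.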

  10. Chinese Remainder Codes

    Institute of Scientific and Technical Information of China (English)

    张爱丽; 刘秀峰; 靳蕃

    2004-01-01

    Chinese Remainder Codes are constructed by applying weak block designs and the Chinese Remainder Theorem of ring theory. The new type of linear codes takes the congruence class in the congruence class ring R/I1∩I2∩…∩In for the information bit, embeds R/Ji into R/I1∩I2∩…∩In, and assigns the cosets of R/Ji as the subring of R/I1∩I2∩…∩In and the cosets of R/Ji in R/I1∩I2∩…∩In as check lines. There exist many code classes in Chinese Remainder Codes, which have high code rates. Chinese Remainder Codes are the essential generalization of Sun Zi Codes.

  11. Testing algebraic geometric codes

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Property testing was initially studied from various motivations in the 1990s. A code C ⊆ GF(r)^n is locally testable if there is a randomized algorithm which can distinguish with high probability the codewords from a vector essentially far from the code by accessing only a very small (typically constant) number of the vector's coordinates. The problem of testing codes was first studied by Blum, Luby and Rubinfeld and is closely related to probabilistically checkable proofs (PCPs). How to characterize locally testable codes is a complex and challenging problem. Local tests have been studied for Reed-Solomon (RS), Reed-Muller (RM), cyclic, dual of BCH, and the trace subcode of algebraic geometric codes. In this paper we give testers for algebraic geometric codes with linear parameters (as functions of dimensions). We also give a moderate condition under which the family of algebraic geometric codes cannot be locally testable.

  12. Identification of a highly conserved valine-glycine-phenylalanine amino acid triplet required for HIV-1 Nef function

    Directory of Open Access Journals (Sweden)

    Meuwissen Pieter J

    2012-04-01

    Full Text Available Abstract Background The Nef protein of HIV facilitates virus replication and disease progression in infected patients. This role as a pathogenesis factor depends on several genetically separable Nef functions that are mediated by interactions of highly conserved protein-protein interaction motifs with different host cell proteins. By studying the functionality of a series of nef alleles from clinical isolates, we identified a dysfunctional HIV group O Nef in which a highly conserved valine-glycine-phenylalanine (VGF) region, which links a preceding acidic cluster with the following proline-rich motif into an amphipathic surface, was deleted. In this study, we aimed to study the functional importance of this VGF region. Results The dysfunctional HIV group O8 nef allele was restored to the consensus sequence, and mutants of canonical (NL4.3, NA-7, SF2) and non-canonical (B2 and C1422) HIV-1 group M nef alleles were generated in which the amino acids of the VGF region were changed into alanines (VGF→AAA) and tested for their capacity to interfere with surface receptor trafficking, signal transduction and enhancement of viral replication and infectivity. We found the VGF motif, and each individual amino acid of this motif, to be critical for downregulation of MHC-I and CXCR4. Moreover, Nef's association with the cellular p21-activated kinase 2 (PAK2), the resulting deregulation of cofilin and inhibition of host cell actin remodeling, and targeting of Lck kinase to the trans-Golgi network (TGN) were affected as well. Of particular interest, VGF integrity was essential for Nef-mediated enhancement of HIV virion infectivity and HIV replication in peripheral blood lymphocytes. For targeting of Lck kinase to the TGN and viral infectivity, the phenylalanine of the triplet was especially essential. At the molecular level, the VGF motif was required for the physical interaction of the adjacent proline-rich motif with Hck. Conclusion Based on these findings, we

  13. Ancient exaptation of a CORE-SINE retroposon into a highly conserved mammalian neuronal enhancer of the proopiomelanocortin gene.

    Directory of Open Access Journals (Sweden)

    Andrea M Santangelo

    2007-10-01

    Full Text Available The proopiomelanocortin gene (POMC) is expressed in the pituitary gland and the ventral hypothalamus of all jawed vertebrates, producing several bioactive peptides that function as peripheral hormones or central neuropeptides, respectively. We have recently determined that mouse and human POMC expression in the hypothalamus is conferred by the action of two 5' distal and unrelated enhancers, nPE1 and nPE2. To investigate the evolutionary origin of the neuronal enhancer nPE2, we searched available vertebrate genome databases and determined that nPE2 is a highly conserved element in placentals, marsupials, and monotremes, whereas it is absent in nonmammalian vertebrates. Following an in silico paleogenomic strategy based on genome-wide searches for paralog sequences, we discovered that opossum and wallaby nPE2 sequences are highly similar to members of the superfamily of CORE-short interspersed nucleotide element (CORE-SINE) retroposons, in particular to MAR1 retroposons that are widely present in marsupial genomes. Thus, the neuronal enhancer nPE2 originated from the exaptation of a CORE-SINE retroposon in the lineage leading to mammals and remained under purifying selection in all mammalian orders for the last 170 million years. Expression studies performed in transgenic mice showed that two nonadjacent nPE2 subregions are essential to drive reporter gene expression into POMC hypothalamic neurons, providing the first functional example of an exapted enhancer derived from an ancient CORE-SINE retroposon. In addition, we found that this CORE-SINE family of retroposons is likely to still be active in American and Australian marsupial genomes and that several highly conserved exonic, intronic and intergenic sequences in the human genome originated from the exaptation of CORE-SINE retroposons. Together, our results provide clear evidence of the functional novelties that transposed elements contributed to their host genomes throughout evolution.

  14. Land use and conservation reserve program effects on the persistence of playa wetlands in the High Plains.

    Science.gov (United States)

    Daniel, Dale W; Smith, Loren M; Haukos, David A; Johnson, Lacrecia A; McMurry, Scott T

    2014-04-15

    Watershed cultivation and subsequent soil erosion remains the greatest threat to the service provisioning of playa wetlands in the High Plains. The U.S. Department of Agriculture's (USDA) Conservation Reserve Program (CRP) plants perennial vegetation cover on cultivated lands including playa watersheds, and therefore, the program influences sediment deposition and accumulation in playas. Our objective was to measure the effects of the CRP on sediment deposition by comparing sediment depth and present/historic size characteristics in 258 playas among three High-Plains subregions (northern, central, and southern) and the three dominant watershed types: cropland, CRP, and native grassland. Sediment depth and resultant volume loss for CRP playas were 40% and 57% lower than cropland playas, but 68% and 76% greater than playas in native grassland. Playas in CRP had remaining volumes exceeding those of cropland playas. Grassland playas had nearly three times more original playa volume and 122% greater wetland area than CRP playas. Overall, playas were larger in the south than other subregions. Sediment depth was also three times greater in the south than the north, which resulted in southern playas losing twice as much total volume as northern playas. However, the larger southern playas provide more remaining volume per playa than those in other subregions. The results of this study demonstrate the importance of proper watershed management in preserving playa wetland ecosystem service provisioning in the High Plains. Furthermore, we identify regional differences in playas that may influence management decisions and provide valuable insight to conservation practitioners trying to maximize wetland services with limited resources.

  15. High Re-Operation Rates Using Conserve Metal-On-Metal Total Hip Articulations

    DEFF Research Database (Denmark)

    Mogensen, S L; Jakobsen, Thomas; Christoffersen, Hardy;

    2016-01-01

    INTRODUCTION: Metal-on-metal hip articulations have been intensely debated after reports of adverse reactions and high failure rates. The aim of this study was to retrospectively evaluate the implants of a metal-on-metal total hip articulation (MOM THA) from a single manufacturer in a two...

  16. Comparison of three-dimensional optical coherence tomography and high resolution photography for art conservation studies.

    Science.gov (United States)

    Adler, Desmond C; Stenger, Jens; Gorczynska, Iwona; Lie, Henry; Hensick, Teri; Spronk, Ron; Wolohojian, Stephan; Khandekar, Narayan; Jiang, James Y; Barry, Scott; Cable, Alex E; Huber, Robert; Fujimoto, James G

    2007-11-26

    Gold punchwork and underdrawing in Renaissance panel paintings are analyzed using both three-dimensional swept source / Fourier domain optical coherence tomography (3D-OCT) and high resolution digital photography. 3D-OCT can generate en face images with micrometer-scale resolutions at arbitrary sectioning depths, rejecting out-of-plane light by coherence gating. Therefore 3D-OCT is well suited for analyzing artwork where a surface layer obscures details of interest. 3D-OCT also enables cross-sectional imaging and quantitative measurement of 3D features such as punch depth, which is beneficial for analyzing the tools and techniques used to create works of art. High volumetric imaging speeds are enabled by the use of a Fourier domain mode locked (FDML) laser as the 3D-OCT light source. High resolution infrared (IR) digital photography is shown to be particularly useful for the analysis of underdrawing, where the materials used for the underdrawing and paint layers have significantly different IR absorption properties. In general, 3D-OCT provides a more flexible and comprehensive analysis of artwork than high resolution photography, but also requires more complex instrumentation and data analysis.

  17. Significant Beneficial Association of High Dietary Selenium Intake with Reduced Body Fat in the CODING Study.

    Science.gov (United States)

    Wang, Yongbo; Gao, Xiang; Pedram, Pardis; Shahidi, Mariam; Du, Jianling; Yi, Yanqing; Gulliver, Wayne; Zhang, Hongwei; Sun, Guang

    2016-01-04

    Selenium (Se) is a trace element which plays an important role in adipocyte hypertrophy and adipogenesis. Some studies suggest that variations in serum Se may be associated with obesity. However, there are few studies examining the relationship between dietary Se and obesity, and findings are inconsistent. We aimed to investigate the association between dietary Se intake and a panel of obesity measurements with systematic control of major confounding factors. A total of 3214 subjects participated in the study. Dietary Se intake was determined from the Willett food frequency questionnaire. Body composition was measured using dual-energy X-ray absorptiometry. Obese men and women had the lowest dietary Se intake, being 24% to 31% lower than corresponding normal weight men and women, classified by both BMI and body fat percentage. Moreover, subjects with the highest dietary Se intake had the lowest BMI, waist circumference, and trunk, android, gynoid and total body fat percentages, with a clear dose-dependent inverse relationship observed in both gender groups. Furthermore, significant negative associations discovered between dietary Se intake and obesity measurements were independent of age, total dietary calorie intake, physical activity, smoking, alcohol, medication, and menopausal status. Dietary Se intake alone may account for 9%-27% of the observed variations in body fat percentage. The findings from this study strongly suggest that high dietary Se intake is associated with a beneficial body composition profile.

  18. A model for shear-band formation and high-explosive initiation in a hydrodynamics code

    Energy Technology Data Exchange (ETDEWEB)

    Kerrisk, J.F.

    1996-03-01

    This report describes work in progress to develop a shear band model for MESA-2D. The objective of this work is (1) to predict the formation of shear bands and their temperature in high explosive (HE) during a MESA-2D calculation, (2) to then assess whether the HE would initiate, and (3) to allow a detonation wave initiated from a shear band to propagate. This requires developing a model that uses average cell data to estimate the size and temperature of the narrow region (generally much narrower than the cell size) that is undergoing shear within the cell. The shear band temperature (rather than the average cell temperature) can be used to calculate the flow stress of the material in the cell or to calculate heat generation from reactive materials. Modifications have been made to MESA-2D to calculate shear band size and temperature, and to initiate HE detonation when conditions warrant. Two models have been used for shear-band size and temperature calculation, one based on an independent estimate of the shear band width and a second based on the temperature distribution around the shear band. Both models have been tested for calculations in which shear band formation occurs in steel. A comparison of the measured and calculated local temperature rise in a shear band has been made. A model for estimating the time to initiation of the HE based on the type of HE and the temperature distribution in a shear band has also been added to MESA-2D. Calculations of conditions needed to initiate HE in projectile-impact tests have been done and compared with experimental data. Further work is needed to test the model.

  19. Transcriptional enhancers in protein-coding exons of vertebrate developmental genes.

    Directory of Open Access Journals (Sweden)

    Deborah I Ritter

    Full Text Available Many conserved noncoding sequences function as transcriptional enhancers that regulate gene expression. Here, we report that protein-coding DNA also frequently contains enhancers functioning at the transcriptional level. We tested the enhancer activity of 31 protein-coding exons, which we chose based on strong sequence conservation between zebrafish and human, and occurrence in developmental genes, using a Tol2 transposable GFP reporter assay in zebrafish. For each exon we measured GFP expression in hundreds of embryos in 10 anatomies via a novel system that implements the voice-recognition capabilities of a cellular phone. We find that 24/31 (77%) exons drive GFP expression compared to a minimal promoter control, and 14/24 are anatomy-specific (expression in four anatomies or fewer). GFP expression driven by these coding enhancers frequently overlaps the anatomies where the host gene is expressed (60%), suggesting self-regulation. Highly conserved coding sequences and highly conserved noncoding sequences do not significantly differ in enhancer activity (coding: 24/31 vs. noncoding: 105/147) or tissue-specificity (coding: 14/24 vs. noncoding: 50/105). Furthermore, coding and noncoding enhancers display similar levels of the enhancer-related histone modification H3K4me1 (coding: 9/24 vs. noncoding: 34/81). Meanwhile, coding enhancers are over three times as likely to contain an H3K4me1 mark as other exons of the host gene. Our work suggests that developmental transcriptional enhancers do not discriminate between coding and noncoding DNA and reveals widespread dual functions in protein-coding DNA.
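    The abstract's claim that coding and noncoding enhancer rates (24/31 vs. 105/147) do not significantly differ can be sanity-checked with a pooled two-proportion z-test. This is an illustrative recomputation, not the authors' statistical method:

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Pooled two-proportion z statistic for comparing k1/n1 against k2/n2."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)  # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Enhancer activity: coding 24/31 vs. noncoding 105/147
z = two_proportion_z(24, 31, 105, 147)
print(round(z, 2))  # 0.68, well below 1.96: no significant difference
```

    The same helper applied to the tissue-specificity counts (14/24 vs. 50/105) likewise stays below the 5% critical value, consistent with the abstract's conclusion.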

  20. Evaluating Reduction of Sediment Pollution as a Strategy for Conservation of Coral Reefs in a High CO2 World

    Science.gov (United States)

    Maina, J. M.; de Moel, H.; Mora, C.; Ward, P.; Watson, J.

    2014-12-01

    One of the key strategies for coral reef conservation in a high CO2 world is reduction of sediment and nutrient pollution. However, the reduction of sediment is a complicated planning issue as a result of competing land uses, driven by the demands of food production and economic development, among others. Moreover, despite the significance of sedimentation as a threat to coral reefs, historical baselines and future estimates of sediment discharge on coral reefs remain poorly quantified. Therefore, the effectiveness of this strategy hinges upon (i) identifying the future sediment discharge on coral reefs relative to the historical baseline, and (ii) identifying spatially where sediment reduction actions are urgently needed and where they are likely to succeed. We provide this understanding by simulating sediment dynamics over historical and future time scales using models of land use and climate, for coastal watersheds adjacent to coral reefs globally.

  1. Discovery of highly conserved unique peanut and tree nut peptides by LC-MS/MS for multi-allergen detection.

    Science.gov (United States)

    Sealey-Voyksner, Jennifer; Zweigenbaum, Jerry; Voyksner, Robert

    2016-03-01

    Proteins unique to peanuts and various tree nuts have been extracted, subjected to trypsin digestion and analysis by liquid chromatography/quadrupole time-of-flight mass spectrometry, in order to find highly conserved peptides that can be used as markers to detect peanuts and tree nuts in food. The marker peptide sequences chosen were those found to be present in both native (unroasted) and thermally processed (roasted) forms of peanuts and tree nuts. Each peptide was selected by assuring its presence in food that was processed or unprocessed, its abundance for sensitivity, sequence size, and uniqueness for peanut and each specific variety of tree nut. At least two peptides were selected to represent peanut, almond, pecan, cashew, walnut, hazelnut, pine nut, Brazil nut, macadamia nut, pistachio nut, chestnut and coconut; to determine the presence of trace levels of peanut and tree nuts in food by a novel multiplexed LC-MS method.

  2. Innovative use of controlled availability fertilizers with high performance for intensive agriculture and environmental conservation

    Institute of Scientific and Technical Information of China (English)

    Sadao SHOJI

    2005-01-01

    A variety of slow release fertilizers, controlled release (availability) fertilizers (CAFs), and stability fertilizers have been developed in response to the serious drawbacks of conventional fertilizers since the early 1960s. Of these fertilizers, CAFs which are coated with resin are consumed in the largest quantity in the world. Focusing on CAFs with higher performance, the author discusses: 1) innovation of agro-technologies for various field crops, including new concepts of fertilizer application; 2) high yielding of field crops; 3) enhancing the quality and safety of farm products; and 4) controlling the adverse effects of intensive agriculture on the environment.

  3. Molecular evolution of vertebrate neurotrophins: co-option of the highly conserved nerve growth factor gene into the advanced snake venom arsenal.

    Directory of Open Access Journals (Sweden)

    Kartik Sunagar

    Full Text Available Neurotrophins are a diverse class of structurally related proteins, essential for neuronal development, survival, plasticity and regeneration. They are characterized by major family members, such as the nerve growth factors (NGF), brain-derived neurotrophic factors (BDNF) and neurotrophin-3 (NT-3), which have been demonstrated here to lack coding sequence variations and follow the regime of negative selection, highlighting their extremely important conserved role in vertebrate homeostasis. However, in stark contrast, venom NGF secreted as part of the chemical arsenal of the venomous advanced snake family Elapidae (and to a lesser extent Viperidae) has characteristics consistent with the typical accelerated molecular evolution of venom components. This includes a rapid rate of diversification under the significant influence of positive selection, with the majority of positively selected sites found in the secreted β-polypeptide chain (74%) and on the molecular surface of the protein (92%), while the core structural and functional residues remain highly constrained. Such focal mutagenesis generates active residues on the toxin molecular surface, which are capable of interacting with novel biological targets in prey to induce a myriad of pharmacological effects. We propose that caenophidian NGFs could participate in prey envenoming by causing a massive release of chemical mediators from mast cells to mount inflammatory reactions and increase vascular permeability, thereby aiding the spread of other toxins, and/or by acting as proapoptotic factors. Despite their presence in reptilian venom having been known for over 60 years, this is the first evidence that venom-secreted NGF follows the molecular evolutionary pattern of other venom components, and thus likely participates in prey envenomation.

  4. A highly conserved Poc1 protein characterized in embryos of the hydrozoan Clytia hemisphaerica: localization and functional studies.

    Directory of Open Access Journals (Sweden)

    Cécile Fourrage

    Full Text Available Poc1 (Protein of Centriole 1) proteins are highly conserved WD40 domain-containing centriole components, well characterized in the alga Chlamydomonas, the ciliated protozoan Tetrahymena, the insect Drosophila and in vertebrate cells including Xenopus and zebrafish embryos. Functions and localizations related to the centriole and ciliary axoneme have been demonstrated for Poc1 in a range of species. The vertebrate Poc1 protein has also been reported to show an additional association with mitochondria, including enrichment in the specialized "germ plasm" region of Xenopus oocytes. We have identified and characterized a highly conserved Poc1 protein in the cnidarian Clytia hemisphaerica. Clytia Poc1 mRNA was found to be strongly expressed in eggs and early embryos, showing a punctate perinuclear localization in young oocytes. Fluorescence-tagged Poc1 proteins expressed in developing embryos showed strong localization to centrioles, including basal bodies. Anti-human Poc1 antibodies decorated mitochondria in Clytia, as reported in human cells, but failed to recognise endogenous or fluorescent-tagged Clytia Poc1. Injection of specific morpholino oligonucleotides into Clytia eggs prior to fertilization to repress Poc1 mRNA translation interfered with cell division from the blastula stage, likely corresponding to when neosynthesis normally takes over from maternally supplied protein. Cell cycle lengthening and arrest were observed, phenotypes consistent with an impaired centriolar biogenesis or function. The specificity of the defects could be demonstrated by injection of synthetic Poc1 mRNA, which restored normal development. We conclude that in Clytia embryos, Poc1 has an essentially centriolar localization and function.

  5. Extension of the reactor dynamics code MGT-3D for pebble-bed and block-type high-temperature reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Dunfu

    2015-07-01

    The High Temperature Gas cooled Reactor (HTGR) is an improved, gas cooled nuclear reactor. It was chosen as one of the candidates of generation IV nuclear plants [1]. The reactor can shut down automatically because of negative reactivity feedback as temperature increases in design-basis accidents. It is graphite moderated and helium cooled. The residual heat can be transferred out of the reactor core by passive means such as conduction, convection, and thermal radiation during an accident. In this way, the fuel temperature does not exceed the limit at which major fission product release begins. In this thesis, the coupled neutronics and fluid mechanics code MGT-3D, used for the steady state and time-dependent simulation of HTGRs, is enhanced and validated [2]. The fluid mechanics part is validated against SANA experiments in steady state as well as transient cases. The fuel temperature calculation is optimized by solving the heat conduction equation of the coated particles. It is applied in the steady state and transient simulation of the PBMR, and the results are compared to the simulation with the old overheating model. New approaches to calculate the temperature profile of the fuel element of block-type HTGRs, and the calculation of the homogeneous conductivity of composite materials, are introduced. With these new developments, MGT-3D is able to simulate block-type HTGRs as well. This extended MGT-3D is used to simulate a cuboid ceramic block heating experiment in the NACOK-II facility. The extended MGT-3D is also applied to LOFC and DLOFC simulations of the GT-MHR. This is a fluid mechanics calculation with a given heat source, and the MGT-3D results are verified against the calculation results of other codes. The design of the Japanese HTTR is introduced. The deterministic simulation of the LOFC experiment of the HTTR is conducted with the Monte-Carlo code Serpent and MGT-3D, within the LOFC Project organized by OECD/NEA [3]. With Serpent the burnup

  6. Computation and Analysis of High Rocky Slope Safety in a Water Conservancy Project

    Directory of Open Access Journals (Sweden)

    Meng Yang

    2015-01-01

    Full Text Available An integrated method, covering actual monitoring analysis, a practical geological model, and a theoretical mathematical simulation model, is systematically proposed and successfully applied. The deformation characteristics of a unique high rocky slope were first analyzed from multiple angles and multiple layers, across varying elevations and distances. The arrangement of monitoring points was described and monitoring equipment was designed to form a complete monitoring system. The presently larger displacements were attributed to water erosion at the bottom of the slope and to seepage in its middle. A study of the temporal and spatial displacement patterns, covering multiple-point linkage effects with the water factor, supported this conclusion. To better extract useful information and deeper rules from the practical monitoring data, a geological model of the slope was constructed and rock mechanics parameters were investigated. Finally, a unique three-dimensional finite element model was applied to study the structural behavior using numerical simulations. A corresponding strength criterion was used to determine the safety coefficient for a selected typical section. Subsequently, an integrated three-dimensional finite element model of the slope and dam was developed and a more detailed deformation evolution mechanism was revealed. This study is expected to provide a powerful and systematic method for analyzing very high, important, and dangerous slopes.
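    The safety coefficient determined from a strength criterion is conventionally the ratio of resisting shear strength to driving shear stress along a slip surface. A minimal Mohr-Coulomb sketch of that ratio (all parameter values below are hypothetical, not taken from this study):

```python
import math

def safety_factor(cohesion, normal_stress, shear_stress, phi_deg):
    """Mohr-Coulomb factor of safety: available shear strength
    (c + sigma_n * tan(phi)) divided by the driving shear stress."""
    strength = cohesion + normal_stress * math.tan(math.radians(phi_deg))
    return strength / shear_stress

# Hypothetical values: c = 50 kPa, sigma_n = 200 kPa, tau = 120 kPa, phi = 30 deg
fs = safety_factor(50.0, 200.0, 120.0, 30.0)
print(round(fs, 2))  # 1.38 (> 1 indicates the section resists sliding)
```

    In a finite element analysis the same idea appears as the strength reduction method: c and tan(phi) are divided by a trial factor until the slope fails, and that factor is reported as the safety coefficient.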

  7. Training program for energy conservation in new-building construction. Volume I. Energy conservation technology: management and energy conservation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    A Model Code for Energy Conservation in New Building Construction was developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. This training manual is both an introduction to the need for energy conservation in buildings and a definition of the need for and the role of the enforcement official for energy conservation.

  8. Numerical Zooming Between a NPSS Engine System Simulation and a One-Dimensional High Compressor Analysis Code

    Science.gov (United States)

    Follen, Gregory; auBuchon, M.

    2000-01-01

    Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming between zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between a NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual test conducted prior to committing the design to hardware.

  9. Cross-Breeding Is Inevitable to Conserve the Highly Inbred Population of Puffin Hunter: The Norwegian Lundehund

    Science.gov (United States)

    Daverdin, Marc; Helfjord, Turid; Berg, Peer

    2017-01-01

    The Norwegian Lundehund is a highly endangered native dog breed. Low fertility and a high-frequency predisposition to intestinal disorders imply inbreeding depression. We assessed the genetic diversity of the Lundehund population from pedigree data and evaluated the potential of optimal contribution selection and cross-breeding in the long-term management of the Lundehund population. The current Norwegian Lundehund population is highly inbred and has lost 38.8% of the genetic diversity in the base population. Effective population size estimates varied between 13 and 82 depending on the method used. Optimal contribution selection alone facilitates no improvement in the current situation in the Lundehund due to the extremely high relatedness of the whole population. Addition of (replacement with) 10 breeding candidates of a foreign breed to 30 Lundehund breeders reduced the parental additive genetic relationship by 40–42% (48–53%). Immediate actions are needed to increase the genetic diversity in the current Lundehund population. The only option to secure the conservation of this rare breed is to introduce individuals from foreign breeds as breeding candidates. PMID:28107382
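    The effective-size estimates quoted above vary with the estimation method. One classical relation (Wright's) links the per-generation rate of inbreeding to effective size, Ne = 1/(2 ΔF). A sketch with hypothetical ΔF values chosen to bracket the reported range of 13–82:

```python
def effective_size_from_delta_f(delta_f):
    """Classical Wright relation: Ne = 1 / (2 * delta_F), where delta_F
    is the per-generation rate of inbreeding estimated from pedigrees."""
    return 1.0 / (2.0 * delta_f)

# Hypothetical per-generation inbreeding rates (not values from the paper):
print(round(effective_size_from_delta_f(0.038), 1))  # 13.2
print(round(effective_size_from_delta_f(0.006), 1))  # 83.3
```

    The spread illustrates why pedigree-based Ne estimates are so method-sensitive: a small change in the estimated ΔF translates into a several-fold change in Ne.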

  10. A study of fuel failure behavior in high burnup HTGR fuel. Analysis by STRESS3 and STAPLE codes

    Energy Technology Data Exchange (ETDEWEB)

    Martin, David G.; Sawa, Kazuhiro; Ueta, Shouhei; Sumita, Junya [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2001-05-01

    In current high temperature gas-cooled reactors (HTGRs), tri-isotropic coated fuel particles are employed as fuel. In the safety design of HTGR fuels, it is important to retain fission products within particles so that their release to the primary coolant does not exceed an acceptable level. From this point of view, the basic design criteria for the fuel are to minimize the failure fraction of as-fabricated fuel coating layers and to prevent significant additional fuel failures during operation. This report attempts to model fuel behavior in irradiation tests using the U.K. codes STRESS3 and STAPLE. Test results from the 91F-1A and HRB-22 capsule irradiation tests, which were carried out at the Japan Materials Testing Reactor of JAERI and at the High Flux Isotope Reactor of Oak Ridge National Laboratory, respectively, were employed in the calculation. The maximum burnup and fast neutron fluence were about 10% FIMA and 3 × 10^25 m^-2, respectively. The fuel for the irradiation tests was called high burnup fuel, whose target burnup and fast neutron fluence were higher than those of the first-loading fuel of the High Temperature Engineering Test Reactor. The calculation results demonstrated that if only mean fracture stress values of PyC and SiC are used in the calculation, it is not possible to predict any particle failures, where failure means that all three load-bearing layers have failed. By contrast, when statistical variations in the fracture stresses and particle specifications are taken into account, as is done in the STAPLE code, failures can be predicted. In the HRB-22 irradiation test, it was concluded that the first two particles which had failed were defective in some way, but that the third and fourth failures can be accounted for by the pressure vessel model. In the 91F-1A irradiation test, the result showed that 1 or 2 particles had failed towards the end of irradiation in the upper capsule and no particles failed in the lower capsule. (author)
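    The contrast between the mean-stress and statistical treatments can be illustrated with a toy Monte Carlo in the spirit of the STAPLE approach. All stresses, scales, and the Weibull modulus below are hypothetical illustration values, not data from the report:

```python
import random

random.seed(0)

def layer_fails(applied, scale, m):
    """Sample a Weibull-distributed fracture stress for one coating layer;
    the layer fails if the applied stress exceeds the sampled strength."""
    return applied > random.weibullvariate(scale, m)

def particle_fails(applied, scales, m):
    """A particle fails only when all three load-bearing layers fail."""
    return all(layer_fails(applied, s, m) for s in scales)

applied = 200.0                 # hypothetical applied stress (MPa)
scales = (300.0, 300.0, 500.0)  # hypothetical Weibull scale stresses (IPyC, OPyC, SiC)
m = 5                           # hypothetical Weibull modulus

n = 100_000
failures = sum(particle_fails(applied, scales, m) for _ in range(n))

# A mean-stress-only model predicts zero failures here, since the applied
# stress lies below every characteristic strength; the low-strength tail of
# the Weibull distributions nevertheless yields a small, nonzero fraction.
print(failures > 0)
```

    This is the qualitative point the report makes: statistical scatter in fracture stresses, not mean values, is what allows any particle failures to be predicted at all.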

  11. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

    Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV) is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001) SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further

  12. Reversible Self-Assembly of Hydrophilic Inorganic Polyelectrolytes into Highly Conservative, Vesicle-like Structures

    Science.gov (United States)

    Kistler, Melissa; Bhatt, Anish; Liu, Guang; Liu, Tianbo

    2007-03-01

    The hydrophilic polyoxometalate (POM) macroanions are inorganic polyelectrolytes which offer a direct connection between simple ions and organic polyelectrolytes. POM solutions are perfect model systems for studying polyelectrolyte solutions because they are identical in size, shape, mass and charges, with easily tunable charge density. Many types of POM macroanions are highly soluble but undergo reversible self-assembly to form uniform, stable, soft, single-layer vesicle-like ``blackberry'' structures containing >1000 individual POMs in dilute solutions. The driving force of the blackberry formation is likely counterion-mediated attraction (like-charge attraction). The blackberry size can be accurately controlled by solvent quality, or the charge density on macroions. Many unexpected phenomena have been observed in these novel systems. Blackberry structures may be analogous to virus shell structures formed by capsid proteins. References: Nature, 2003, 426, 59; JACS, 2002, 124, 10942; 2003, 125, 312; 2004, 126, 16690; 2005, 127, 6942; 2006, 128, 10103.

  13. A HIGH ORDER ADAPTIVE FINITE ELEMENT METHOD FOR SOLVING NONLINEAR HYPERBOLIC CONSERVATION LAWS

    Institute of Scientific and Technical Information of China (English)

    Zhengfu Xu; Jinchao Xu; Chi-Wang Shu

    2011-01-01

    In this note, we apply the h-adaptive streamline diffusion finite element method with a small mesh-dependent artificial viscosity to solve nonlinear hyperbolic partial differential equations, with the objective of achieving high order accuracy and mesh efficiency. We compute the numerical solution to a steady state Burgers equation and the solution to a converging-diverging nozzle problem. The computational results verify that, by suitably choosing the artificial viscosity coefficient and applying the adaptive strategy based on the a posteriori error estimate by Johnson et al., accuracy of order N^-3/2 can be obtained when continuous piecewise linear elements are used, where N is the number of elements.
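    A convergence rate like the N^-3/2 claimed above is typically verified by computing errors on two successively refined meshes and fitting the exponent, assuming error ≈ C · N^-p. A minimal sketch with hypothetical error values constructed to match that rate:

```python
import math

def observed_order(e_coarse, e_fine, n_coarse, n_fine):
    """Observed convergence rate p, assuming error ~ C * N**(-p)."""
    return math.log(e_coarse / e_fine) / math.log(n_fine / n_coarse)

# Hypothetical errors on meshes of 100 and 200 elements, consistent
# with the claimed O(N^-3/2) rate (doubling N scales error by 2^-1.5):
e1 = 1.0e-2
e2 = e1 * 2.0 ** -1.5
p = observed_order(e1, e2, 100, 200)
print(round(p, 2))  # 1.5
```

    In practice one repeats this over several refinement levels and checks that the observed p stabilizes near the theoretical rate.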

  14. The highly conserved Escherichia coli transcription factor YhaJ regulates aromatic compound degradation

    Directory of Open Access Journals (Sweden)

    Noa Palevsky

    2016-09-01

    Full Text Available The aromatic compound 2,4-dinitrotoluene (DNT), a common impurity in 2,4,6-trinitrotoluene (TNT) production, has been suggested as a tracer for the presence of TNT-based landmines due to its stability and high volatility. We have previously described an Escherichia coli bioreporter capable of detecting the presence of DNT vapors, harboring a fusion of the yqjF gene promoter to a reporter element. However, the DNT metabolite which is the direct inducer of yqjF has not yet been identified, nor has the regulatory mechanism of the induction been clarified. We demonstrate here that the YhaJ protein, a member of the LysR type family, acts as a transcriptional regulator of yqjF activation, as well as of a panel of additional E. coli genes. This group of genes shares a common sequence motif in their promoters, which is suggested here as a putative YhaJ-box. In addition, we have linked YhaJ to the regulation of quinol-like compound degradation in the cell, and identified yhaK as playing a role in the degradation of DNT.

  15. Two strains of Crocosphaera watsonii with highly conserved genomes are distinguished by strain-specific features

    Directory of Open Access Journals (Sweden)

    Shellie Roxanne Bench

    2011-12-01

    Full Text Available Unicellular nitrogen-fixing cyanobacteria are important components of marine phytoplankton. Although non-nitrogen-fixing marine phytoplankton generally exhibit high gene sequence and genomic diversity, gene sequences of natural populations and isolated strains of Crocosphaera watsonii, one of the two most abundant open ocean unicellular cyanobacteria groups, have been shown to be 98-100% identical. The low sequence diversity in Crocosphaera contrasts dramatically with sympatric species of Prochlorococcus and Synechococcus, and raises the question of how genome differences can explain observed phenotypic diversity among Crocosphaera strains. Here we show, through whole genome comparisons of two phenotypically different strains, that there are strain-specific sequences in each genome, and numerous genome rearrangements, despite exceptionally low sequence diversity in shared genomic regions. Some of the strain-specific sequences encode functions that explain observed phenotypic differences, such as exopolysaccharide biosynthesis. The pattern of strain-specific sequences distributed throughout the genomes, along with rearrangements in shared sequences, is evidence of significant genetic mobility that may be attributed to the hundreds of transposase genes found in both strains. Furthermore, such genetic mobility appears to be the main mechanism of strain divergence in Crocosphaera, which does not accumulate DNA microheterogeneity over the vast majority of its genome. The strain-specific sequences found in this study provide tools for future physiological studies, as well as genetic markers to help determine the relative abundance of phenotypes in natural populations.

  16. Genomic analysis of six new Geobacillus strains reveals highly conserved carbohydrate degradation architectures and strategies

    Directory of Open Access Journals (Sweden)

    Phillip eBrumm

    2015-05-01

    Full Text Available In this work we report the whole genome sequences of six new Geobacillus xylanolytic strains along with the genomic analysis of their capability to degrade carbohydrates. The six sequenced Geobacillus strains described here have a range of GC contents from 43.9% to 52.5% and clade with named Geobacillus species throughout the entire genus. We have identified a ~200 kb unique super-cluster in all six strains, containing five to eight distinct carbohydrate degradation clusters in a single genomic region, a feature not seen in other genera. The Geobacillus strains rely on a small number of secreted enzymes located within distinct clusters for carbohydrate utilization, in contrast to most biomass-degrading organisms which contain numerous secreted enzymes located randomly throughout the genomes. All six strains are able to utilize fructose, arabinose, xylose, mannitol, gluconate, xylan, and α-1,6-glucosides. The gene clusters for utilization of these seven substrates have identical organization and the individual proteins have a high percent identity to their homologs. The strains show significant differences in their ability to utilize inositol, sucrose, lactose, α-mannosides, α-1,4-glucosides and arabinan.

  17. Development of safety analysis codes and experimental validation for a very high temperature gas-cooled reactor Final report

    Energy Technology Data Exchange (ETDEWEB)

    Chang Oh

    2006-03-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperature to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR's higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. Research Objectives: As described above, a pipe break may lead to significant fuel damage and fission product release in the VHTR. The objectives of this Korean/United States collaboration were to develop and validate advanced computational methods for VHTR safety analysis.
The methods that have been developed are now
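The oxidation hazard described in this abstract is strongly temperature-dependent, which is why the VHTR's elevated temperatures make air ingress such a concern. As a minimal illustration (not part of the codes developed in this collaboration), an Arrhenius-type rate law sketches how sharply graphite oxidation accelerates with temperature; the rate constants below are illustrative placeholders, not validated values:

```python
import math

# Illustrative Arrhenius parameters for graphite oxidation (C + O2 -> CO/CO2).
# These are placeholder values for demonstration, not data from the VHTR analysis.
A = 1.0e5      # pre-exponential factor, 1/s (assumed)
E_A = 2.0e5    # activation energy, J/mol (assumed)
R = 8.314      # universal gas constant, J/(mol*K)

def oxidation_rate(temp_k: float) -> float:
    """Relative graphite oxidation rate at absolute temperature temp_k (kelvin)."""
    return A * math.exp(-E_A / (R * temp_k))

# Comparing the VHTR coolant temperature (~900 C) with the fuel temperature
# (~1250 C) shows the exponential sensitivity driving accelerated core heatup:
ratio = oxidation_rate(1250 + 273.15) / oxidation_rate(900 + 273.15)
print(ratio)  # the rate grows by roughly two orders of magnitude
```

With these assumed constants the rate increases by about a factor of one hundred between 900°C and 1250°C, which conveys why air ingress following depressurization can rapidly accelerate core heatup.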

  18. COLOR SUPERCONDUCTIVITY, INSTANTONS AND PARITY (NON?)-CONSERVATION AT HIGH BARYON DENSITY-VOLUME 5.

    Energy Technology Data Exchange (ETDEWEB)

    GYULASSY,M.

    1997-11-11

This one-day Riken BNL Research Center workshop was organized to follow up on the rapidly developing theoretical work on color superconductivity, instanton dynamics, and possible signatures of parity violation in strong interactions that was stimulated by the talk of Frank Wilczek during the Riken BNL September Symposium. The workshop was held on November 11, 1997 at the center with over 30 participants. The program consisted of four theory talks in the morning, followed in the afternoon by two talks by experimentalists and open discussion. Krishna Rajagopal (MIT) first reviewed the status of chiral condensate calculations at high baryon density within the instanton model and the percolation transition at moderate densities that restores chiral symmetry. Mark Alford (Princeton) then discussed the nature of the novel color superconducting diquark condensates. The main result was that the largest gap, on the order of 100 MeV, was found for the 0{sup +} condensate, with only a tiny gap, much less than an MeV, for the other possible 1{sup +} condensate. Thomas Schaefer (INT) gave a complete overview of instanton effects on correlators and showed independent calculations, in collaboration with Shuryak (SUNY) and Velkovsky (BNL), confirming the updated results of the Wilczek group (Princeton, MIT). Yang Pang (Columbia) addressed the general question of how the breaking of discrete symmetries by any condensate with suitable quantum numbers could be searched for experimentally, especially at the AGS through longitudinal {Lambda} polarization measurements. Nicholas Samios (BNL) reviewed the history of measurements of {Lambda} polarization and suggested specific kinematical variables for such analysis. Brian Cole (Columbia) showed recent E910 measurements of {Lambda} production at the AGS in nuclear collisions and focused on the systematic biases that must be considered when looking for small symmetry-breaking effects. 
Lively discussions led by Robert Jaffe (MIT) focused especially on speculations on the still

  19. Sharing code

    OpenAIRE

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and the Open Science Framework (OSF). GitHub is a mature, industry-standard tool but is not tailored to the needs of researchers. In comparison, OSF offers a one-stop solution for researchers, but much of its functionality is still under development. I conclude by listing lesser-known alternative tools for sharing code and materials.

  20. Assessing the significance of conserved genomic aberrations using high resolution genomic microarrays.

    Directory of Open Access Journals (Sweden)

    Mitchell Guttman

    2007-08-01

    Full Text Available Genomic aberrations recurrent in a particular cancer type can be important prognostic markers for tumor progression. Typically in early tumorigenesis, cells incur a breakdown of the DNA replication machinery that results in an accumulation of genomic aberrations in the form of duplications, deletions, translocations, and other genomic alterations. Microarray methods allow for finer mapping of these aberrations than has previously been possible; however, data processing and analysis methods have not taken full advantage of this higher resolution. Attention has primarily been given to analysis at the single-sample level, where multiple adjacent probes are necessarily used as replicates for the local region containing their target sequences. However, regions of concordant aberration can be short enough to be detected by only one, or very few, array elements. We describe a method called Multiple Sample Analysis for assessing the significance of concordant genomic aberrations across multiple experiments that does not require a priori definition of aberration calls for each sample. When multiple samples represent a class, our method exploits the replication across samples to detect concordant aberrations at much higher resolution than current single-sample approaches allow. Additionally, this method provides a meaningful approach to population-based questions, such as determining important regions for a cancer subtype of interest or determining regions of copy number variation in a population. In regions of significant concordance, Multiple Sample Analysis also produces high-resolution aberration calls for each individual sample. 
The approach is demonstrated on a dataset representing a challenging but important resource: breast tumors that have been formalin-fixed, paraffin-embedded, archived, and subsequently UV-laser capture microdissected and hybridized to two
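The general idea behind multi-sample concordance testing can be sketched simply (this is a simplified illustration under assumed statistics, not the actual Multiple Sample Analysis algorithm from this paper): score each probe by its mean log-ratio across samples, then calibrate a significance threshold by permuting probe order within each sample, so that a probe is called only where the samples agree more than chance allows.

```python
import random
import statistics

def concordance_scores(samples):
    """Per-probe mean log-ratio across samples (samples: equal-length probe vectors)."""
    n_probes = len(samples[0])
    return [statistics.mean(s[i] for s in samples) for i in range(n_probes)]

def permutation_threshold(samples, alpha=0.05, n_perm=500, seed=0):
    """Upper (1 - alpha) quantile of the max |score| under within-sample shuffling."""
    rng = random.Random(seed)
    maxima = sorted(
        max(abs(v) for v in concordance_scores(
            [rng.sample(s, len(s)) for s in samples]))
        for _ in range(n_perm))
    return maxima[int((1 - alpha) * n_perm) - 1]

# Toy data: 4 samples, 10 probes; probes 3-5 carry a concordant copy-number gain
# (+1 log-ratio) on top of Gaussian noise.
rng = random.Random(1)
samples = [[rng.gauss(0, 0.2) + (1.0 if 3 <= j <= 5 else 0.0)
            for j in range(10)] for _ in range(4)]
threshold = permutation_threshold(samples)
calls = [j for j, v in enumerate(concordance_scores(samples)) if abs(v) > threshold]
print(calls)  # indices of probes flagged as concordantly aberrant
```

Even when a single sample's measurement at one probe is too noisy to call on its own, agreement across the class sharpens the signal, which is the intuition behind detecting aberrations at single-probe resolution from multiple samples.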