WorldWideScience

Sample records for code doc informal

  1. Theory of Coding Informational Simulation.

    Science.gov (United States)

    1981-04-06

Possibility of Applying the Code Juice for the Construction of Combinatory Switches, by V. G. Yevstigneyev. Organization of the Structure of the... [remainder of the machine-translated OCR text, DOC 81024105, p. 97, is illegible]

  2. 75 FR 12496 - Proposed Information Collection; Comment Request; DOC National Environmental Policy Act...

    Science.gov (United States)

    2010-03-16

... From the Federal Register Online via the Government Publishing Office. DEPARTMENT OF COMMERCE, Office of the Secretary. Proposed Information Collection; Comment Request; DOC National Environmental Policy Act Environmental Questionnaire and Checklist. AGENCY: Office of the Secretary, Office of...

  3. Informal control code logic

    NARCIS (Netherlands)

    Bergstra, J.A.

    2010-01-01

    General definitions as well as rules of reasoning regarding control code production, distribution, deployment, and usage are described. The role of testing, trust, confidence and risk analysis is considered. A rationale for control code testing is sought and found for the case of safety critical

  4. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.
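
    For readers who want to reproduce the basic mechanics, a minimal Python sketch of generating a QR code label is shown below; it assumes the third-party qrcode and Pillow packages and uses a hypothetical URL, not one of the authors' tools or addresses.

        # Minimal sketch (not the authors' tool): encode a hypothetical URL as a QR label.
        import qrcode                              # third-party package, with Pillow installed

        url = "https://example.org/patient/12345/summary"   # placeholder resource
        img = qrcode.make(url)                     # build the QR symbol with default settings
        img.save("record_qr.png")                  # printable label that a phone camera can scan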

  5. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
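
    Since the book's coding-theory chapters cover Hamming codes and decoding in the presence of errors, a generic Hamming(7,4) sketch in Python (a textbook construction, not code from the book) illustrates single-error correction via the syndrome.

        # Hamming(7,4): encode 4 data bits, corrupt one bit, and correct it via the syndrome.
        # Generic textbook construction, shown for illustration only.

        def hamming74_encode(d):
            """d = [d1, d2, d3, d4] -> 7-bit codeword at positions (p1, p2, d1, p3, d2, d3, d4)."""
            d1, d2, d3, d4 = d
            p1 = d1 ^ d2 ^ d4          # parity over positions 3, 5, 7
            p2 = d1 ^ d3 ^ d4          # parity over positions 3, 6, 7
            p3 = d2 ^ d3 ^ d4          # parity over positions 5, 6, 7
            return [p1, p2, d1, p3, d2, d3, d4]

        def hamming74_correct(r):
            """Locate and flip a single-bit error; the syndrome equals the error position."""
            s1 = r[0] ^ r[2] ^ r[4] ^ r[6]   # checks positions 1, 3, 5, 7
            s2 = r[1] ^ r[2] ^ r[5] ^ r[6]   # checks positions 2, 3, 6, 7
            s3 = r[3] ^ r[4] ^ r[5] ^ r[6]   # checks positions 4, 5, 6, 7
            pos = s1 + 2 * s2 + 4 * s3       # 0 means no detectable error
            if pos:
                r[pos - 1] ^= 1
            return r

        word = hamming74_encode([1, 0, 1, 1])
        word[4] ^= 1                          # simulate a single channel error
        assert hamming74_correct(word) == hamming74_encode([1, 0, 1, 1])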

  6. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

Shannon's landmark paper 'A Mathematical Theory of Communication' [1] laid the foundation for communication and information theory as they are perceived today. In [1], Shannon considers two particular problems: one of efficiently describing a source that outputs a sequence of symbols each occurring with some ...
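
    A small Huffman-coding sketch in Python (with made-up symbol probabilities, not data from the article) shows the efficiency idea Shannon's framework makes precise: the average code length comes close to the source entropy.

        # Build a Huffman code for a toy source and compare average length with entropy.
        # The probabilities are illustrative only.
        import heapq
        from math import log2

        probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}

        # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(probs)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)   # the two least probable subtrees
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + code for s, code in c0.items()}
            merged.update({s: "1" + code for s, code in c1.items()})
            heapq.heappush(heap, (p0 + p1, counter, merged))
            counter += 1
        codes = heap[0][2]

        avg_len = sum(probs[s] * len(codes[s]) for s in probs)
        entropy = -sum(p * log2(p) for p in probs.values())
        print(codes, f"average length {avg_len:.2f} bits vs entropy {entropy:.2f} bits")

    For this toy source the average length works out to 1.75 bits per symbol against an entropy of about 1.74 bits.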

  7. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  8. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance in the same way...... the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation targeting to improve the side information of a single...

  9. Theory of information and coding. A mathematical framework for communication

    Energy Technology Data Exchange (ETDEWEB)

    McEliece, R.J.

    1977-01-01

This book is meant to be a self-contained introduction to the basic results in the theory of information and coding. The introduction gives an overview of the whole subject. Chapters in Part I (Information Theory) deal with entropy and mutual information, discrete memoryless channels and their capacity-cost functions, discrete memoryless sources and their rate-distortion functions, the Gaussian channel and source, the source-channel coding theorem, and advanced topics (the channel coding theorem, the source coding theorem). The chapters in Part II (Coding Theory) discuss linear codes; BCH, Goppa, and related codes; convolutional codes; variable-length source coding; and advanced topics (block codes, convolutional codes, a comparison of block and convolutional codes, source codes). 86 figures, 9 tables, 50 references. (RWR)

  10. Information theory and coding solved problems

    CERN Document Server

    Ivaniš, Predrag

    2017-01-01

This book offers a comprehensive overview of information theory and error control coding, using a different approach than that of the existing literature. The chapters are organized according to the Shannon system model, where one block affects the others. A relatively brief theoretical introduction is provided at the beginning of every chapter, including a few additional examples and explanations, but without any proofs, and a short overview of some aspects of abstract algebra is given at the end of the corresponding chapters. The characteristic complex examples with many illustrations and tables are chosen to provide detailed insights into the nature of the problem. Some limiting cases are presented to illustrate the connections with the theoretical bounds. The numerical values are carefully selected to provide in-depth explanations of the described algorithms. Although the examples in the different chapters can be considered separately, they are mutually connected and the conclusions for one considered proble...

  11. Reducing Computational Overhead of Network Coding with Intrinsic Information Conveying

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Pedersen, Morten V.

This paper investigates the possibility of intrinsic information conveying in network coding systems. The information is embedded into the coding vector by constructing the vector based on a set of predefined rules. This information can subsequently be retrieved by any receiver. The starting point...... is RLNC (Random Linear Network Coding) and the goal is to reduce the number of coding operations both at the coding and the decoding node, and at the same time remove the need for dedicated signaling messages. In a traditional RLNC system, coding operations take up significant computational resources and add...... to the overall energy consumption, which is particularly problematic for mobile battery-driven devices. In RLNC, coding is performed over a FF (Finite Field). We propose to divide this field into subfields, and let each subfield signify some information or state. In order to embed the information correctly......
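
    As background for this abstract, a baseline RLNC encoding step over GF(2) is sketched below in Python; the paper's contribution is to constrain how the coding vector is drawn so that the vector itself signals extra state, which this generic sketch deliberately does not implement.

        # One RLNC-coded packet over GF(2): XOR of the source packets selected by a random vector.
        # Illustrative baseline only; packet contents are arbitrary placeholders.
        import random
        from functools import reduce

        def rlnc_encode(packets):
            while True:
                vector = [random.randint(0, 1) for _ in packets]
                if any(vector):                      # an all-zero vector carries nothing
                    break
            selected = [p for p, bit in zip(packets, vector) if bit]
            payload = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), selected)
            return vector, payload                   # the coding vector travels with the payload

        generation = [b"ABCD", b"EFGH", b"IJKL"]     # equal-length source packets
        print(rlnc_encode(generation))

    A receiver that has collected enough such packets recovers the originals by Gaussian elimination over GF(2) on the coding vectors.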

  12. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side...... information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...... consists of a variable block size based Y, U and V component motion estimation and an adaptive weighted overlapped block motion compensation (OBMC). The proposal is tested and compared with the results of an executable DVC codec released by DISCOVER group (DIStributed COding for Video sERvices). RD...

  13. The duality of coding assessment information

    African Journals Online (AJOL)

    Erna Kinsey

    One of the greatest problems concerning outcomes that address knowledge, skills and values is to determine and qualify different types of assessment information. This article examines the dichotomy of determining or qualifying, i.e. grading or portraying assessment information. The article investigates, firstly, the setting of ...

  14. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    Science.gov (United States)

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
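
    For reference, the delta notation reported by these lab codes follows the standard definition of the stable carbon isotope ratio, expressed in per mil relative to an agreed standard (conventionally VPDB); this is the general convention, not a formula quoted from the report.

        \delta^{13}\mathrm{C}\ [\text{per mil}] =
          \left( \frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\mathrm{sample}}}
                      {\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\mathrm{standard}}} - 1 \right) \times 1000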

  15. [The Medical Information Systems Project clinical coding and surgeons: why should surgeons code and how?].

    Science.gov (United States)

    Bensadoun, H

    2001-02-01

The clinical coding system recently instituted in France, the PMSI (Projet de Médicalisation du Système d'Information), has become an unavoidable element in funding allocations for short-term private and public hospitalization centers. Surgeons must take this controversial medicoeconomic instrument into serious consideration. Coding is a tedious, time-consuming task but, like the hospitalization or surgery report, is an essential part of the discharge procedure. Coding can in the long run be used to establish pricing by pathology. Surgeons should learn the rules and the logic behind this coding system, which, not being based on a medical rationale, may be somewhat difficult to understand. Choosing the right main diagnosis and the comorbidity items is crucial. Quality homogeneous coding is essential if one expects the health authorities to make good use of the system. Our medical societies have a role to play in promoting and harmonizing the coding technique.

  16. On Coding of Scheduling Information in OFDM

    OpenAIRE

    Gunnarsson, Fredrik; Moosavi, Reza; Eriksson, Jonas; Larsson, Erik G.; Wiberg, Niklas; Frenger, Pål

    2009-01-01

    Control signaling strategies for scheduling information in cellular OFDM systems are studied. A single-cell multiuser system model is formulated that provides system capacity estimates accounting for the signaling overhead. Different scheduling granularities are considered, including the one used in the specifications for the 3G Long Term Evolution (LTE). A greedy scheduling method is assumed, where each resource is assigned to the user for which it can support the highest number of bits. The...
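
    The greedy rule described above (each resource block goes to the user that can carry the most bits on it) reduces to a one-line selection per resource; the rate table below is made-up illustration data, not values from the paper.

        # Greedy per-resource scheduling: assign each resource block to the best user.
        rates = {                     # rates[user][rb] = bits the user could carry on resource rb
            "u1": [120, 40, 300],
            "u2": [200, 80, 100],
            "u3": [150, 90, 250],
        }
        num_resources = 3
        allocation = {rb: max(rates, key=lambda u: rates[u][rb]) for rb in range(num_resources)}
        print(allocation)             # {0: 'u2', 1: 'u3', 2: 'u1'}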

  17. The duality of coding assessment information | Kotze | South African ...

    African Journals Online (AJOL)

    This article examines the dichotomy of determining or qualifying, i.e. grading or portraying assessment information. The article investigates, firstly, the setting of practical criteria and, secondly, the adequacy of assessment criteria in guiding the judgments of the assessor. A possible guide for coding assessment information is ...

  18. Locally decodable codes and private information retrieval schemes

    CERN Document Server

    Yekhanin, Sergey

    2010-01-01

    Locally decodable codes (LDCs) are codes that simultaneously provide efficient random access retrieval and high noise resilience by allowing reliable reconstruction of an arbitrary bit of a message by looking at only a small number of randomly chosen codeword bits. Local decodability comes with a certain loss in terms of efficiency - specifically, locally decodable codes require longer codeword lengths than their classical counterparts. Private information retrieval (PIR) schemes are cryptographic protocols designed to safeguard the privacy of database users. They allow clients to retrieve rec
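
    To give the flavour of PIR, the classic two-server XOR scheme (a standard textbook example, not necessarily one of the book's constructions) lets a client learn one database bit while each server, on its own, sees only a uniformly random query set.

        # Two-server XOR-based private information retrieval for a database of bits.
        import random

        def server_answer(db, query):
            """Each server returns the XOR of the database bits indexed by its query set."""
            ans = 0
            for j in query:
                ans ^= db[j]
            return ans

        db = [1, 0, 1, 1, 0, 0, 1, 0]            # database replicated at both servers
        i = 5                                     # index the client wants, kept private
        S = {j for j in range(len(db)) if random.random() < 0.5}
        answer_a = server_answer(db, S)           # query to server A: the random set S
        answer_b = server_answer(db, S ^ {i})     # query to server B: S with index i toggled
        assert answer_a ^ answer_b == db[i]       # the XOR of the answers reveals bit i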

  19. DocSaludMental en netvibes

    OpenAIRE

    Onís, Ricardo

    2011-01-01

DocSaludMental is a virtual desktop, built on the netvibes platform, that gives all professionals in the mental health network of the Principality of Asturias access to specialized and up-to-date information through content syndication.

  20. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Smith, Ralph [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Williams, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Figueroa, Victor [Sandia National Laboratories, Albuquerque, NM 87185 (United States)

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.

  1. Tetrahedral gray code for visualization of genome information.

    Science.gov (United States)

    Ichinose, Natsuhiro; Yada, Tetsushi; Gotoh, Osamu

    2014-01-01

    We propose a tetrahedral Gray code that facilitates visualization of genome information on the surfaces of a tetrahedron, where the relative abundance of each [Formula: see text]-mer in the genomic sequence is represented by a color of the corresponding cell of a triangular lattice. For biological significance, the code is designed such that the [Formula: see text]-mers corresponding to any adjacent pair of cells differ from each other by only one nucleotide. We present a simple procedure to draw such a pattern on the development surfaces of a tetrahedron. The thus constructed tetrahedral Gray code can demonstrate evolutionary conservation and variation of the genome information of many organisms at a glance. We also apply the tetrahedral Gray code to the honey bee (Apis mellifera) genome to analyze its methylation structure. The results indicate that the honey bee genome exhibits CpG overrepresentation in spite of its methylation ability and that two conserved motifs, CTCGAG and CGCGCG, in the unmethylated regions are responsible for the overrepresentation of CpG.

  2. Tetrahedral gray code for visualization of genome information.

    Directory of Open Access Journals (Sweden)

    Natsuhiro Ichinose

Full Text Available We propose a tetrahedral Gray code that facilitates visualization of genome information on the surfaces of a tetrahedron, where the relative abundance of each [Formula: see text]-mer in the genomic sequence is represented by a color of the corresponding cell of a triangular lattice. For biological significance, the code is designed such that the [Formula: see text]-mers corresponding to any adjacent pair of cells differ from each other by only one nucleotide. We present a simple procedure to draw such a pattern on the development surfaces of a tetrahedron. The thus constructed tetrahedral Gray code can demonstrate evolutionary conservation and variation of the genome information of many organisms at a glance. We also apply the tetrahedral Gray code to the honey bee (Apis mellifera) genome to analyze its methylation structure. The results indicate that the honey bee genome exhibits CpG overrepresentation in spite of its methylation ability and that two conserved motifs, CTCGAG and CGCGCG, in the unmethylated regions are responsible for the overrepresentation of CpG.

  3. Information-Theoretic Bounds and Approximations in Neural Population Coding.

    Science.gov (United States)

    Huang, Wentao; Zhang, Kechen

    2018-01-17

    While Shannon's mutual information has widespread applications in many disciplines, for practical applications it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality. This article focuses on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and approximation formulas that remain valid in high-dimensional spaces. We prove that optimizing the population density distribution based on these approximation formulas is a convex optimization problem that allows efficient numerical solutions. Numerical simulation results confirmed that our asymptotic formulas were highly accurate for approximating mutual information for large neural populations. In special cases, the approximation formulas are exactly equal to the true mutual information. We also discuss techniques of variable transformation and dimensionality reduction to facilitate computation of the approximations.
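
    For orientation, the quantity whose high-dimensional approximation the article addresses is ordinary mutual information; for a small discrete joint distribution it can be computed exactly, as in this sketch (the joint table is invented, not data from the study).

        # Exact mutual information I(X;Y) in bits for a small discrete joint distribution.
        from math import log2

        joint = {                                 # p(x, y) for a toy stimulus/response pair
            ("s0", "r0"): 0.30, ("s0", "r1"): 0.10,
            ("s1", "r0"): 0.15, ("s1", "r1"): 0.45,
        }
        px, py = {}, {}
        for (x, y), p in joint.items():           # marginals
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)
        print(f"I(X;Y) = {mi:.3f} bits")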

  4. Shannon information entropy in the canonical genetic code.

    Science.gov (United States)

    Nemzer, Louis R

    2017-02-21

The Shannon entropy measures the expected information value of messages. As with thermodynamic entropy, the Shannon entropy is only defined within a system that identifies at the outset the collections of possible messages, analogous to microstates, that will be considered indistinguishable macrostates. This fundamental insight is applied here for the first time to amino acid alphabets, which group the twenty common amino acids into families based on chemical and physical similarities. To evaluate these schemas objectively, a novel quantitative method is introduced based on the inherent redundancy in the canonical genetic code. Each alphabet is taken as a separate system that partitions the 64 possible RNA codons, the microstates, into families, the macrostates. By calculating the normalized mutual information, which measures the reduction in Shannon entropy conveyed by single nucleotide messages, groupings that best leverage this aspect of fault tolerance in the code are identified. The relative importance of properties related to protein folding - like hydropathy and size - and function, including side-chain acidity, can also be estimated. This approach allows the quantification of the average information value of nucleotide positions, which can shed light on the coevolution of the canonical genetic code with the tRNA-protein translation mechanism. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Eco-evolutionary dynamics, coding structure and the information threshold

    Directory of Open Access Journals (Sweden)

    Hogeweg Paulien

    2010-11-01

    Full Text Available Abstract Background The amount of information that can be maintained in an evolutionary system of replicators is limited by genome length, the number of errors during replication (mutation rate and various external factors that influence the selection pressure. To date, this phenomenon, known as the information threshold, has been studied (both genotypically and phenotypically in a constant environment and with respect to maintenance (as opposed to accumulation of information. Here we take a broader perspective on this problem by studying the accumulation of information in an ecosystem, given an evolvable coding structure. Moreover, our setup allows for individual based as well as ecosystem based solutions. That is, all functions can be performed by individual replicators, or complementing functions can be performed by different replicators. In this setup, where both the ecosystem and the individual genomes can evolve their structure, we study how populations cope with high mutation rates and accordingly how the information threshold might be alleviated. Results We observe that the first response to increased mutation rates is a change in coding structure. At moderate mutation rates evolution leads to longer genomes with a higher diversity than at high mutation rates. Thus, counter-intuitively, at higher mutation rates diversity is reduced and the efficacy of the evolutionary process is decreased. Therefore, moderate mutation rates allow for more degrees of freedom in exploring genotype space during the evolutionary trajectory, facilitating the emergence of solutions. When an individual based solution cannot be attained due to high mutation rates, spatial structuring of the ecosystem can accommodate the evolution of ecosystem based solutions. Conclusions We conclude that the evolutionary freedom (eg. the number of genotypes that can be reached by evolution is increasingly restricted by higher mutation rates. In the case of such severe mutation

  6. New Approaches to Coding Information using Inverse Scattering Transform

    Science.gov (United States)

    Frumin, L. L.; Gelash, A. A.; Turitsyn, S. K.

    2017-06-01

    Remarkable mathematical properties of the integrable nonlinear Schrödinger equation (NLSE) can offer advanced solutions for the mitigation of nonlinear signal distortions in optical fiber links. Fundamental optical soliton, continuous, and discrete eigenvalues of the nonlinear spectrum have already been considered for the transmission of information in fiber-optic channels. Here, we propose to apply signal modulation to the kernel of the Gelfand-Levitan-Marchenko equations that offers the advantage of a relatively simple decoder design. First, we describe an approach based on exploiting the general N -soliton solution of the NLSE for simultaneous coding of N symbols involving 4 ×N coding parameters. As a specific elegant subclass of the general schemes, we introduce a soliton orthogonal frequency division multiplexing (SOFDM) method. This method is based on the choice of identical imaginary parts of the N -soliton solution eigenvalues, corresponding to equidistant soliton frequencies, making it similar to the conventional OFDM scheme, thus, allowing for the use of the efficient fast Fourier transform algorithm to recover the data. Then, we demonstrate how to use this new approach to control signal parameters in the case of the continuous spectrum.

  7. Informed consent in human experimentation before the Nuremberg code.

    Science.gov (United States)

    Vollmann, J; Winau, R

    1996-12-07

    The issue of ethics with respect to medical experimentation in Germany during the 1930s and 1940s was crucial at the Nuremberg trials and related trials of doctors and public health officials. Those involved in horrible crimes attempted to excuse themselves by arguing that there were no explicit rules governing medical research on human beings in Germany during the period and that research practices in Germany were not different from those in allied countries. In this context the Nuremberg code of 1947 is generally regarded as the first document to set out ethical regulations in human experimentation based on informed consent. New research, however, indicates that ethical issues of informed consent in guidelines for human experimentation were recognised as early as the nineteenth century. These guidelines shed light on the still contentious issue of when the concepts of autonomy, informed consent, and therapeutic and non-therapeutic research first emerged. This issue assumes renewed importance in the context of current attempts to assess liability and responsibility for the abuse of people in various experiments conducted since the second world war in the United States, Canada, Russia, and other nations.

  8. Time and category information in pattern-based codes

    Directory of Open Access Journals (Sweden)

    Hugo G Eyherabide

    2010-11-01

    Full Text Available Sensory stimuli are usually composed of different features (the "what" appearing at irregular times (the "when". Neural responses often use spike patterns to represent sensory information. The "what" is hypothesised to be encoded in the identity of the elicited patterns (the pattern categories, and the "when", in the time positions of patterns (the pattern timing. However, this standard view is oversimplified. In the real world, the "what" and the "when" might not be separable concepts, for instance, if they are correlated in the stimulus. In addition, neuronal dynamics can condition the pattern timing to be correlated with the pattern categories. Hence, timing and categories of patterns may not constitute independent channels of information. In this paper, we assess the role of spike patterns in the neural code, irrespective of the nature of the patterns. We first define information-theoretical quantities that allow us to quantify the information encoded by different aspects of the neural response. We also introduce the notion of synergy/redundancy between time positions and categories of patterns. We subsequently establish the relation between the "what" and the "when" in the stimulus with the timing and the categories of patterns. To that aim, we quantify the mutual information between different aspects of the stimulus and different aspects of the response. This formal framework allows us to determine the precise conditions under which the standard view holds, as well as the departures from this simple case. Finally, we study the capability of different response aspects to represent the "what" and the "when" in the neural response.

  9. Methodology for coding the energy emergency management information system. [Facility ID's and energy codes

    Energy Technology Data Exchange (ETDEWEB)

    D' Acierno, J.; Hermelee, A.; Fredrickson, C.P.; Van Valkenburg, K.

    1979-11-01

    The coding methodology for creating facility ID's and energy codes from information existing in EIA data systems currently being mapped into the EEMIS data structure is presented. A comprehensive approach is taken to facilitate implementation of EEMIS. A summary of EIA data sources which will be a part of the final system is presented in a table showing the intersection of 19 EIA data systems with the EEMIS data structure. The methodology for establishing ID codes for EIA sources and the corresponding EEMIS facilities in this table is presented. Detailed energy code translations from EIA source systems to the EEMIS energy codes are provided in order to clarify the transfer of energy data from many EIA systems which use different coding schemes. 28 tables.

  10. A colorful origin for the genetic code: information theory, statistical mechanics and the emergence of molecular codes.

    Science.gov (United States)

    Tlusty, Tsvi

    2010-09-01

    The genetic code maps the sixty-four nucleotide triplets (codons) to twenty amino-acids. While the biochemical details of this code were unraveled long ago, its origin is still obscure. We review information-theoretic approaches to the problem of the code's origin and discuss the results of a recent work that treats the code in terms of an evolving, error-prone information channel. Our model - which utilizes the rate-distortion theory of noisy communication channels - suggests that the genetic code originated as a result of the interplay of the three conflicting evolutionary forces: the needs for diverse amino-acids, for error-tolerance and for minimal cost of resources. The description of the code as an information channel allows us to mathematically identify the fitness of the code and locate its emergence at a second-order phase transition when the mapping of codons to amino-acids becomes nonrandom. The noise in the channel brings about an error-graph, in which edges connect codons that are likely to be confused. The emergence of the code is governed by the topology of the error-graph, which determines the lowest modes of the graph-Laplacian and is related to the map coloring problem. (c) 2010 Elsevier B.V. All rights reserved.

  11. Factors influencing DOC leaching from terrestrial ecosystems: a database analysis

    Science.gov (United States)

    Camino Serrano, M.; Janssens, I.; Luyssaert, S.; Ciais, P.; Gielen, B.

    2012-04-01

The lateral transport of dissolved organic carbon (DOC) is an important process linking terrestrial and aquatic ecosystems. Neglecting these fluxes can lead to biased eddy covariance-based estimates of terrestrial ecosystem carbon sequestration. The necessity for integrating DOC leaching in carbon cycle models is thus clear, especially in view of future model development aiming at directly linking terrestrial, freshwater and ocean carbon cycles. However, to achieve this goal, more accurate information is needed in order to better understand and predict dissolved organic carbon dynamics. DOC concentrations mainly vary by geographical location, soil and vegetation types, topography, season and climate. Within this framework, we developed a database on DOC concentrations and fluxes with the aim of better understanding how those parameters determine DOC variations. This database compiles DOC concentrations and fluxes in soil solution and creeks at site or catchment level for different ecosystems around the world, but with special focus on the Northern Hemisphere and on peatland ecosystems. The database currently includes information from around 120 sites, gathered from published literature and datasets accessible on the internet. The database contains annual, seasonal and monthly data on DOC, dissolved inorganic carbon (DIC), dissolved organic nitrogen (DON) and dissolved inorganic nitrogen (DIN) and also includes other meta-data related to the site, such as land cover, soil properties, climate, annual water balance and other soil solution parameters. This compiled dataset allows us to study the influence of several physical factors that determine DOC production in soils. We will present the observed relationships between drivers, such as precipitation, drainage flows, soil pH, soil texture, and DOC concentration/DOC fluxes at different levels, ecosystem types, temporal scales (monthly versus annual or seasonal), and soil depths. The same relations will be analysed

  12. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist--user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material--reports, instructions, flow charts, listings of data, and other useful material and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data) and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, and any available auxiliary routines are also included. The abstract format was chosen to give to a potential code user several criteria for deciding whether or not he wishes to request the code package. (RWR)

  13. The Technique of Binary Code Decompilation and Its Application in Information Security Sphere

    Directory of Open Access Journals (Sweden)

    M. O. Shudrak

    2012-12-01

Full Text Available The authors describe a new technique of binary code decompilation and its possible applications in information security, such as software protection against reverse engineering and the analysis of code obfuscation in malware.

  14. supp27.doc

    Indian Academy of Sciences (India)

    Supplementary Information. Table S1 The geometric parameters of the title compound with B3LYP/6-31G** method, Bond Lengths (Å) and Bond Angles (°) ... Table S2 Calculated HOMO and LUMO Energies (au) and Energy Gaps of the title compound at B3LYP/6-31G** level ...

  15. supp16.doc

    Indian Academy of Sciences (India)

    SUPPLEMENTARY INFORMATION. Effect of external electric field on Cyclodextrin-Alcohol adducts: A DFT study. KUNDAN BARUAH and PRADIP KR. BHATTACHARYYA*. Department of Chemistry, Arya Vidyapeeth College, Guwahati 781 016, Assam, India. E-mail address: prdpbhatta@yahoo.com. Supplement Table S1: ...

  16. Publishing datasets with eSciDoc and panMetaDocs

    Science.gov (United States)

    Ulbricht, D.; Klump, J.; Bertelmann, R.

    2012-04-01

Currently, several research institutions worldwide undertake considerable efforts to have their scientific datasets published and to syndicate them to data portals as extensively described objects identified by a persistent identifier. This is done to foster the reuse of data, to make scientific work more transparent, and to create a citable entity that can be referenced unambiguously in written publications. GFZ Potsdam established a publishing workflow for file-based research datasets. Key software components are an eSciDoc infrastructure [1] and multiple instances of the data curation tool panMetaDocs [2]. The eSciDoc repository holds data objects and their associated metadata in container objects, called eSciDoc items. A key metadata element in this context is the publication status of the referenced data set. PanMetaDocs, which is based on PanMetaWorks [3], is a PHP-based web application that allows data to be described with any XML-based metadata schema. The metadata fields can be filled with static or dynamic content to reduce the number of fields that require manual entries to a minimum and to make use of contextual information in a project setting. Access rights can be applied to set the visibility of datasets to other project members, to allow collaboration on and notification about datasets (RSS), and to enable interaction with the internal messaging system inherited from panMetaWorks. When a dataset is to be published, panMetaDocs allows the publication status of the eSciDoc item to be changed from "private" to "submitted" and the dataset to be prepared for verification by an external reviewer. After quality checks, the item publication status can be changed to "published". This makes the data and metadata available through the internet worldwide. PanMetaDocs is developed as an eSciDoc application. It is an easy-to-use graphical user interface to eSciDoc items, their data and metadata. It is also an application supporting a DOI publication agent during the process of
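
    The publication-status workflow mentioned above ("private", then "submitted", then "published") can be pictured as a tiny state machine; the sketch below mirrors only the wording of the abstract and is not the actual panMetaDocs or eSciDoc API.

        # Hypothetical model of the dataset publication states described in the abstract.
        ALLOWED = {"private": {"submitted"}, "submitted": {"published"}, "published": set()}

        def advance(status, new_status):
            if new_status not in ALLOWED[status]:
                raise ValueError(f"cannot move from {status} to {new_status}")
            return new_status

        status = "private"
        status = advance(status, "submitted")   # dataset handed over for external review
        status = advance(status, "published")   # data and metadata become publicly visible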

  17. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Inc., Paso Robels, CA; Fluer, Larry (Fluer, Inc., Paso Robels, CA); Middleton, Bobby

    2009-03-01

The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  18. AR DOC: Augmented reality documentaries

    DEFF Research Database (Denmark)

    Vistisen, Peter

    2014-01-01

Augmented Reality Documentaries (AR DOC) is a 'small' Shareplay project (applied-for funds) exploring augmented reality cross-media solutions for creating engaging audience communication...... within the experience industry. The project has generated new knowledge about how physical and digital dissemination can be supported through Augmented Reality as a dissemination format.

  19. Order information coding in working memory: Review of behavioural studies and cognitive mechanisms

    Directory of Open Access Journals (Sweden)

    Barbara Dolenc

    2014-06-01

Full Text Available Executive processes, such as coding for sequential order, are of extreme importance for higher-order cognitive tasks. One of the significant questions is how order information is coded in working memory and what cognitive mechanisms and processes mediate it. The aim of this review paper is to summarize results of studies that explore whether order and item memory are two separable processes. Furthermore, we reviewed evidence for each of the proposed cognitive mechanisms that might mediate order processing. Previous behavioural and neuroimaging data suggest different representation and processing of item and order information in working memory. Both types of information are maintained and recalled separately, and this separation seems to hold for recognition as well as for recall. To explain the results of studies of order coding, numerous cognitive mechanisms have been proposed. We focused on four different mechanisms by which order information might be coded and retrieved, namely inter-item associations, direct coding, hierarchical coding and magnitude coding. Each of the mechanisms can explain some aspects of order information coding; however, none of them is able to explain all of the empirical findings. Due to its complex nature, it is not surprising that a single mechanism has difficulty accounting for all the behavioural data, and order memory may be more accurately characterized as the result of a set of mechanisms rather than a single one. Moreover, the findings raise the question of whether different types of memory for order information might exist.

  20. Stochasticity in Ca2+ increase in spines enables robust and sensitive information coding.

    Directory of Open Access Journals (Sweden)

    Takuya Koumura

Full Text Available A dendritic spine is a very small structure (∼0.1 µm3) of a neuron that processes input timing information. Why are spines so small? Here, we provide functional reasons; the size of spines is optimal for information coding. Spines code input timing information by the probability of Ca2+ increases, which makes robust and sensitive information coding possible. We created a stochastic simulation model of input timing-dependent Ca2+ increases in a cerebellar Purkinje cell's spine. Spines used probability coding of Ca2+ increases rather than amplitude coding for input timing detection via stochastic facilitation by utilizing the small number of molecules in a spine volume, where information per volume appeared optimal. Probability coding of Ca2+ increases in a spine volume was more robust against input fluctuation and more sensitive to input numbers than amplitude coding of Ca2+ increases in a cell volume. Thus, stochasticity is a strategy by which neurons robustly and sensitively code information.

  1. Transform domain Wyner-Ziv video coding with refinement of noise residue and side information

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2010-01-01

Distributed Video Coding (DVC) is a video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of side information at the decoder. This paper considers feedback channel based Transform Domain Wyner-Ziv (TDWZ) DVC. The coding efficiency of TDWZ video coding does not match that of conventional video coding yet, mainly due to the quality of side information and inaccurate noise estimation. In this context, a novel TDWZ video decoder with noise residue refinement (NRR) and side information refinement (SIR) is proposed. The proposed refinement schemes successively update the estimated noise residue for noise modeling and the side information frame quality during decoding. Experimental results show that the proposed decoder can improve the Rate-Distortion (RD) performance of a state-of-the-art Wyner-Ziv video codec for the set of test sequences.

  2. Optimal Index Codes for a Class of Multicast Networks with Receiver Side Information

    CERN Document Server

    Ong, Lawrence

    2012-01-01

This paper studies a special class of multicast index coding problems where a sender transmits messages to multiple receivers, each with some side information. Here, each receiver knows a unique message a priori, and there is no restriction on how many messages each receiver requests from the sender. For this class of multicast index coding problems, we obtain the optimal index code, i.e., the code of shortest length that the sender needs to send in order for all receivers to obtain their (respective) requested messages. This is the first class of index coding problems where the optimal index codes are found. In addition, linear index codes are shown to be optimal for this class of index coding problems.

  3. The Earth System Documentation (ES-DOC) Software Process

    Science.gov (United States)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supporting the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying API's; * Conducts code reviews.

  4. Information rates and power spectra of digital codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1982-01-01

    is expressed in terms of the rate distortion function for a memoryless finite alphabet source and mean-square error distortion measure. A class of simple dc-free power spectra is considered in detail, and a method for constructing Markov sources with such spectra is derived. It is found that these sequences......The encoding of independent data symbols as a sequence of discrete amplitude, real variables with given power spectrum is considered. The maximum rate of such an encoding is determined by the achievable entropy of the discrete sequence with the given constraints. An upper bound to this entropy...... have greater entropies than most codes with similar spectra that have been suggested earlier, and that they often come close to the upper bound. When the constraint on the power spectrum is replaced by a constraint On the variance of the sum of the encoded symbols, a stronger upper bound to the rate...

  5. Rethinking mobile delivery: using Quick Response codes to access information at the point of need.

    Science.gov (United States)

    Lombardo, Nancy T; Morrow, Anne; Le Ber, Jeanne

    2012-01-01

    This article covers the use of Quick Response (QR) codes to provide instant mobile access to information, digital collections, educational offerings, library website, subject guides, text messages, videos, and library personnel. The array of uses and the value of using QR codes to push customized information to patrons are explained. A case is developed for using QR codes for mobile delivery of customized information to patrons. Applications in use at the Libraries of the University of Utah will be reviewed to provide readers with ideas for use in their library. Copyright © Taylor & Francis Group, LLC

  6. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    Science.gov (United States)

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  7. Shannon Information and Power Law Analysis of the Chromosome Code

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2012-01-01

    Full Text Available This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different number of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. The knowledge of this data allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of Shannon entropy. The results are explored in the perspective of power law relationships allowing a quantitative evaluation of the DNA of the species.

  8. Change to an informal interview dress code improves residency applicant perceptions

    National Research Council Canada - National Science Library

    Hern, Jr, H Gene; Wills, Charlotte P; Johnson, Brian

    Residency interview apparel has traditionally been the dark business suit. We changed the interview dress code from a traditionally established unwritten 'formal' attire to an explicitly described 'informal' attire...

  9. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and accuracy of noise modeling. This paper considers...... side information frames. Clustering is introduced to capture cross band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side...... information for (LDPCA) decoding. The proposed decoder side techniques for side information and noise learning (SING) are integrated in a TDWZ scheme. On test sequences, the proposed SING codec robustly improves the coding efficiency of TDWZ DVC. For WZ frames using a GOP size of 2, up to 4dB improvement...

  10. Error Correcting Coding of Telemetry Information for Channel with Random Bit Inversions and Deletions

    Directory of Open Access Journals (Sweden)

    M. A. Elshafey

    2014-01-01

Full Text Available This paper presents a method of error-correcting coding of digital information. A feature of this method is the treatment of cases of bit inversion and bit skipping caused by a loss of synchronization between the receiving and transmitting devices or by other factors. The article gives a brief overview of the features, characteristics, and modern methods of constructing LDPC and convolutional codes, and considers a general model of the communication channel that takes into account the probability of bit inversion, deletion and insertion. The proposed coding scheme is based on a combination of LDPC coding and convolutional coding. A comparative analysis of the proposed combined coding scheme and a coding scheme containing only an LDPC coder is performed; both schemes have the same coding rate. Experiments were carried out on two models of communication channels at different probability values of bit inversion and deletion. The first model allows only random bit inversion, while the other allows both random bit inversion and deletion. In the experiments, the decoding delay of the convolutional coder is studied and analyzed, and the results of these experimental studies demonstrate the ability of the proposed coding scheme to improve the efficiency of recovery of data transmitted over a communication channel with noise that causes random bit inversions and deletions, without decreasing the coding rate.

  11. Integration of QR codes into an anesthesia information management system for resident case log management.

    Science.gov (United States)

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction amongst residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) of residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) of residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Investigation of Carbon Nanotubes Using the F-Term Code of Japanese Patent Information

    OpenAIRE

    Chen-Yuan Liu; Shenq-Yih Luo

    2007-01-01

    Patents contain much novel technological information. In this paper, the searching methods of the file index (FI) and F-term classification system developed by the Japan Patent Office (JPO) were employed to find patents containing information on carbon nanotube technology. All related patent data were searched for in the Intellectual Property Digital Library (IPDL). Moreover, using theme codes and term codes in the two-dimensional structure of the F-term list, we investigated and analyzed the...

  13. Coding With Action-dependent Side Information and Additional Reconstruction Requirements

    CERN Document Server

    Kittichokechai, Kittipong; Skoglund, Mikael

    2012-01-01

    Constrained lossy source coding and channel coding with side information problems which extend the classic Wyner-Ziv and Gel'fand-Pinsker problems are considered. Inspired by applications in sensor networking and control, we first consider lossy source coding with two-sided partial side information where the quality/availability of the side information can be influenced by a cost-constrained action sequence. A decoder reconstructs a source sequence subject to the distortion constraint, and at the same time, an encoder is additionally required to be able to estimate the decoder's reconstruction. Next, we consider the channel coding "dual" where the channel state is assumed to depend on the action sequence, and the decoder is required to decode both the transmitted message and channel input reliably. Implications on the fundamental limits of communication in discrete memoryless systems due to the additional reconstruction constraints are investigated. Single-letter expressions for the rate-distortion-cost funct...

  14. Information-Dispersion-Entropy-Based Blind Recognition of Binary BCH Codes in Soft Decision Situations

    Directory of Open Access Journals (Sweden)

    Yimeng Zhang

    2013-05-01

Full Text Available A method of blind recognition of the coding parameters for binary Bose-Chaudhuri-Hocquenghem (BCH) codes is proposed in this paper. We consider an intelligent communication receiver which can blindly recognize the coding parameters of the received data stream. The only knowledge is that the stream is encoded using binary BCH codes, while the coding parameters are unknown. The problem can be addressed in the context of non-cooperative communications or adaptive coding and modulation (ACM) for cognitive radio networks. The recognition processing includes two major procedures: code length estimation and generator polynomial reconstruction. A hard decision method has been proposed in previous literature. In this paper we propose a recognition approach for soft decision situations with Binary Phase Shift Keying modulation and Additive White Gaussian Noise (AWGN) channels. The code length is estimated by maximizing the root information dispersion entropy function, and then we search for the code roots to reconstruct the primitive and generator polynomials. By utilizing the soft output of the channel, the recognition performance is improved, and the simulations show the efficiency of the proposed algorithm.

  15. Frames, designs, and spherical codes in quantum information theory

    Science.gov (United States)

    Renes, Joseph M.

    Frame theory offers a lens through which to view a large portion of quantum information theory, providing an organizational principle to those topics in its purview. In this thesis, I cut a trail from foundational questions to practical applications, from the origin of the quantum probability rule to quantum cryptography, by way of a standard quantum measurement helpful in quantum tomography and representation of quantum theory. Before embarking, preparations are undertaken by outlining the relevant aspects of frame theory, particularly the characterization of generalized orthonormal bases in terms of physical quantum measurements, as well as several aesthetically appealing families of measurements, each possessing a high degree of symmetry. Much more than just elegant, though, these quantum measurements are found to be useful in many aspects of quantum information theory. I first consider the foundational question of justifying the quantum probability rule, showing that putting a probability valuation on generalized quantum measurements leads directly to the Born rule. Moreover, for qubits, the case neglected in the traditional formulation of Gleason's theorem, a symmetric three-outcome measurement called the trine is sufficient to impel the desired form. Keeping with foundational questions, I then turn to the problem of establishing a symmetric measurement capable of effortlessly rendering quantum theory in terms of classical probability theory. Numerical results provide an almost utterly convincing amount of evidence for this, justifying the subsequent study of its use in quantum tomography and detailed account of the properties of the reduction to probabilistic terms. Saving perhaps the most exciting topic for last, I make use of these aesthetic ensembles in the applied field of quantum cryptography. A large class of streamlined key distribution protocols may be cut from the cloth of these ensembles, and their symmetry affords them improved tolerance to

  16. Using QR codes to enable quick access to information in acute cancer care.

    Science.gov (United States)

    Upton, Joanne; Olsson-Brown, Anna; Marshall, Ernie; Sacco, Joseph

    2017-05-25

    Quick access to toxicity management information ensures timely access to steroids/immunosuppressive treatment for cancer patients experiencing immune-related adverse events, thus reducing length of hospital stays or avoiding hospital admission entirely. This article discusses a project to add a QR (quick response) code to a patient-held immunotherapy alert card. As QR code generation is free and the immunotherapy clinical management algorithms were already publicly available through the trust's clinical network website, the costs of integrating a QR code into the alert card, after printing, were low, while the potential benefits are numerous. Patient-held alert cards are widely used for patients receiving anti-cancer treatment, and this established standard of care has been modified to enable rapid access of information through the incorporation of a QR code.

  17. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    Science.gov (United States)

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, how QR codes differ from Web links, and how QR codes facilitate the distribution of educational content.

  18. Changes among Israeli Youth Movements: A Structural Analysis Based on Kahane's Code of Informality

    Science.gov (United States)

    Cohen, Erik H.

    2015-01-01

    Multi-dimensional data analysis tools are applied to Reuven Kahane's data on the informality of youth organizations, yielding a graphic portrayal of Kahane's code of informality. This structure helps address questions of whether the eight structural components exhaustively cover the field without redundancy. Further, the structure is used to…

  19. Optical encryption and QR codes: secure and noise-free information retrieval.

    Science.gov (United States)

    Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto

    2013-03-11

    We introduce for the first time the concept of an information "container" before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. In addition, the QR code can be read by smartphones, which are massively used devices. Additionally, the QR code adds another layer of security to the encryption benefits that the optical methods provide. The QR code is generated by means of freely available software. The concept development proves that speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, thus making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.

  20. What Information is Stored in DNA: Does it Contain Digital Error Correcting Codes?

    Science.gov (United States)

    Liebovitch, Larry

    1998-03-01

    The longest-term correlations in living systems are the information stored in DNA, which reflects the evolutionary history of an organism. The 4 bases (A,T,G,C) encode sequences of amino acids as well as locations of binding sites for proteins that regulate DNA. The fidelity of this important information is maintained by ANALOG error check mechanisms. When a single strand of DNA is replicated, the complementary base is inserted in the new strand. Sometimes a wrong base is inserted; it sticks out, disrupting the phosphate backbone. The new base is not yet methylated, so repair enzymes, which slide along the DNA, can tear out the wrong base and replace it with the right one. The bases in DNA form a sequence of 4 different symbols and so the information is encoded in a DIGITAL form. All the digital codes in our society (ISBN book numbers, UPC product codes, bank account numbers, airline ticket numbers) use error-checking codes, where some digits are functions of other digits to maintain the fidelity of transmitted information. Does DNA also utilize a DIGITAL error checking code to maintain the fidelity of its information and increase the accuracy of replication? That is, are some bases in DNA functions of other bases upstream or downstream? This raises the interesting mathematical problem: How does one determine whether some symbols in a sequence of symbols are a function of other symbols? It also bears on the issue of determining algorithmic complexity: What is the function that generates the shortest algorithm for reproducing the symbol sequence? The error-checking codes most used in our technology are linear block codes. We developed an efficient method to test for the presence of such codes in DNA. We coded the 4 bases as (0,1,2,3) and used Gaussian elimination, modified for modulus 4, to test if some bases are linear combinations of other bases. We used this method to analyze the base sequence in the genes from the lac operon and cytochrome C. We did not find
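
    The test described asks whether some base positions are fixed linear functions, modulo 4, of other positions. The paper uses Gaussian elimination adapted to modulus 4; the hedged sketch below conveys the same question by brute force over coefficients, which is only practical for very short windows and uses purely illustrative data.

        # Sketch of the question only, not the paper's mod-4 Gaussian elimination:
        # is position `target` of every aligned sequence a fixed affine function
        # (mod 4) of the remaining positions? Brute force over coefficients.
        from itertools import product

        BASE = {"A": 0, "T": 1, "G": 2, "C": 3}   # the (0,1,2,3) coding from the abstract

        def is_mod4_combination(sequences, target):
            rows = [[BASE[b] for b in s] for s in sequences]
            others = [j for j in range(len(rows[0])) if j != target]
            for c0, *cs in product(range(4), repeat=len(others) + 1):
                if all((c0 + sum(c * r[j] for c, j in zip(cs, others))) % 4 == r[target]
                       for r in rows):
                    return True
            return False

        # Toy data; a real test needs many sequences per window to be meaningful.
        print(is_mod4_combination(["ATGC", "GCAT", "TTAA", "CAGT"], target=3))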

  1. Investigation of Carbon Nanotubes Using the F-Term Code of Japanese Patent Information

    Directory of Open Access Journals (Sweden)

    Chen-Yuan Liu

    2007-05-01

    Full Text Available Patents contain much novel technological information. In this paper, the searching methods of the file index (FI) and F-term classification system developed by the Japan Patent Office (JPO) were employed to find patents containing information on carbon nanotube technology. All related patent data were searched for in the Intellectual Property Digital Library (IPDL). Moreover, using theme codes and term codes in the two-dimensional structure of the F-term list, we investigated and analyzed the technical features expressed by carbon nanotubes in related documents in Boolean operations.

  2. The use of ICF codes for information retrieval in rehabilitation research: an empirical study.

    Science.gov (United States)

    Sundar, Vidyalakshmi; Daumen, Marcia E; Conley, Daniel J; Stone, John H

    2008-01-01

    Rehabilitation research information can be obtained from various bibliographic sources. Nevertheless, search strategies and terminologies differ from one database to another, making it challenging for the novice user or users of multiple databases. This paper discusses a novel approach of using the International Classification of Functioning, Disability and Health (ICF) codes to retrieve rehabilitation research information. A crosswalk was created by mapping the Center for International Rehabilitation Research and Information Exchange's (CIRRIE) subject headings to the two-level ICF codes, and a search interface was developed (available at: http://cirrie.buffalo.edu/icf/crosswalk.php) so that users can input ICF codes instead of conventional subject headings. About 62% of all CIRRIE subject headings were mapped to equivalent ICF codes. Among the CIRRIE subject headings that were mapped, 43% were mapped to the Environmental Factors, followed by 34% mapped to the Activities and Participation component of the ICF. Although the ICF was not conceived or developed as a system of formal terminology, it can be used effectively for information retrieval in conjunction with an existing vocabulary. This paper describes the first attempt at implementing the use of the ICF for information retrieval.
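
    A hedged, illustrative fragment of the crosswalk idea (the codes and headings below are examples, not entries from the CIRRIE mapping): subject headings are keyed by two-level ICF codes so a search interface can accept an ICF code and forward the matching vocabulary to the database.

        # Illustrative crosswalk fragment; real mappings come from the CIRRIE thesaurus.
        crosswalk = {
            "d450": ["walking", "gait training"],               # Activities and Participation
            "e120": ["mobility aids", "assistive technology"],  # Environmental Factors
            "b280": ["pain", "pain management"],                # Body Functions
        }

        def headings_for(icf_code):
            """Subject headings to submit to the bibliographic database for an ICF code."""
            return crosswalk.get(icf_code.lower(), [])

        print(headings_for("d450"))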

  3. Optical information encryption based on incoherent superposition with the help of the QR code

    Science.gov (United States)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of QR code. This method is based on the concept of incoherent superposition which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted into two phase only masks analytically by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over the previous interference-based method, such as a higher security level, a better robustness against noise attack, a more relaxed work condition, and so on. Numerical simulation results and actual smartphone collected results are shown to validate our proposal.

  4. Neural Code-Neural Self-information Theory on How Cell-Assembly Code Rises from Spike Time and Neuronal Variability.

    Science.gov (United States)

    Li, Meng; Tsien, Joe Z

    2017-01-01

    A major stumbling block to cracking the real-time neural code is neuronal variability - neurons discharge spikes with enormous variability not only across trials within the same experiments but also in resting states. Such variability is widely regarded as a noise which is often deliberately averaged out during data analyses. In contrast to such a dogma, we put forth the Neural Self-Information Theory that neural coding is operated based on the self-information principle under which variability in the time durations of inter-spike-intervals (ISI), or neuronal silence durations, is self-tagged with discrete information. As the self-information processor, each ISI carries a certain amount of information based on its variability-probability distribution; higher-probability ISIs which reflect the balanced excitation-inhibition ground state convey minimal information, whereas lower-probability ISIs which signify rare-occurrence surprisals in the form of extremely transient or prolonged silence carry most information. These variable silence durations are naturally coupled with intracellular biochemical cascades, energy equilibrium and dynamic regulation of protein and gene expression levels. As such, this silence variability-based self-information code is completely intrinsic to the neurons themselves, with no need for outside observers to set any reference point as typically used in the rate code, population code and temporal code models. Moreover, temporally coordinated ISI surprisals across cell population can inherently give rise to robust real-time cell-assembly codes which can be readily sensed by the downstream neural clique assemblies. One immediate utility of this self-information code is a general decoding strategy to uncover a variety of cell-assembly patterns underlying external and internal categorical or continuous variables in an unbiased manner.
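
    The central quantity in this proposal is the self-information of each inter-spike interval (ISI) under its own variability distribution, -log2 p(ISI). A minimal numerical sketch with synthetic spike times follows; the binning choice is an assumption, not the paper's procedure.

        # Minimal sketch: self-information -log2 p(ISI) of each inter-spike interval
        # under its empirical distribution. Spike times are synthetic; bin count assumed.
        import numpy as np

        rng = np.random.default_rng(0)
        spike_times = np.sort(rng.uniform(0.0, 10.0, size=2000))   # seconds, synthetic
        isis = np.diff(spike_times)

        counts, edges = np.histogram(isis, bins=50)
        probs = counts / counts.sum()
        bin_idx = np.digitize(isis, edges[1:-1])        # map each ISI back to its bin

        self_info = -np.log2(probs[bin_idx])            # bits; rare ISIs carry more information
        print(f"median {np.median(self_info):.2f} bits, max {self_info.max():.2f} bits")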

  5. Codes, Costs, and Critiques: The Organization of Information in "Library Quarterly", 1931-2004

    Science.gov (United States)

    Olson, Hope A.

    2006-01-01

    This article reports the results of a quantitative and thematic content analysis of the organization of information literature in the "Library Quarterly" ("LQ") between its inception in 1931 and 2004. The majority of articles in this category were published in the first half of "LQ's" run. Prominent themes have included cataloging codes and the…

  6. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    Energy Technology Data Exchange (ETDEWEB)

    Blyth, Taylor S. [Pennsylvania State Univ., University Park, PA (United States); Avramova, Maria [North Carolina State Univ., Raleigh, NC (United States)

    2017-04-01

    The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four corresponding basic building processes, and the corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD-models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in sub-channel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data-files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.

  7. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    Science.gov (United States)

    Blyth, Taylor S.

    The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interests using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected in corresponding four building basic processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD-models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in sub-channel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data-files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.

  8. The Earth System Documentation (ES-DOC) project

    Science.gov (United States)

    Murphy, S.; Greenslade, M. A.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high quality tools and services in support of Earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system. Within this context ES-DOC leverages the emerging Common Information Model (CIM) metadata standard, which has supported the following projects: ** Coupled Model Inter-comparison Project Phase 5 (CMIP5); ** Dynamical Core Model Inter-comparison Project (DCMIP-2012); ** National Climate Predictions and Projections Platforms (NCPP) Quantitative Evaluation of Downscaling Workshop (QED-2013). This presentation will introduce the project to a wider audience and will demonstrate the current production level capabilities of the eco-system: ** An ESM documentation Viewer embeddable into any website; ** An ESM Questionnaire configurable on a project by project basis; ** An ESM comparison tool reusable across projects; ** An ESM visualization tool reusable across projects; ** A search engine for speedily accessing published documentation; ** Libraries for streamlining document creation, validation and publishing pipelines.

  9. Many Primary Care Docs May Miss Prediabetes

    Science.gov (United States)

    Many Primary Care Docs May Miss Prediabetes -- Fewer than 1 in ... MONDAY, July 24, 2017 (HealthDay News) -- Most primary care doctors can't identify all 11 risk factors ... https://medlineplus.gov/news/fullstory_167370.html

  10. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  11. Insights and issues with simulating terrestrial DOC loading of Arctic river networks

    Science.gov (United States)

    Kicklighter, David W.; Hayes, Daniel J.; McClelland, James W.; Peterson, Bruce J.; McGuire, A. David; Melillo, Jerry M.

    2013-01-01

    Terrestrial carbon dynamics influence the contribution of dissolved organic carbon (DOC) to river networks in addition to hydrology. In this study, we use a biogeochemical process model to simulate the lateral transfer of DOC from land to the Arctic Ocean via riverine transport. We estimate that, over the 20th century, the pan-Arctic watershed has contributed, on average, 32 Tg C/yr of DOC to river networks emptying into the Arctic Ocean with most of the DOC coming from the extensive area of boreal deciduous needle-leaved forests and forested wetlands in Eurasian watersheds. We also estimate that the rate of terrestrial DOC loading has been increasing by 0.037 Tg C/yr2 over the 20th century primarily as a result of climate-induced increases in water yield. These increases have been offset by decreases in terrestrial DOC loading caused by wildfires. Other environmental factors (CO2 fertilization, ozone pollution, atmospheric nitrogen deposition, timber harvest, agriculture) are estimated to have relatively small effects on terrestrial DOC loading to Arctic rivers. The effects of the various environmental factors on terrestrial carbon dynamics have both offset and enhanced concurrent effects on hydrology to influence terrestrial DOC loading and may be changing the relative importance of terrestrial carbon dynamics on this carbon flux. Improvements in simulating terrestrial DOC loading to pan-Arctic rivers in the future will require better information on the production and consumption of DOC within the soil profile, the transfer of DOC from land to headwater streams, the spatial distribution of precipitation and its temporal trends, carbon dynamics of larch-dominated ecosystems in eastern Siberia, and the role of industrial organic effluents on carbon budgets of rivers in western Russia.

  12. Uncertainty Quantification and Learning in Geophysical Modeling: How Information is Coded into Dynamical Models

    Science.gov (United States)

    Gupta, H. V.

    2014-12-01

    There is a clear need for comprehensive quantification of simulation uncertainty when using geophysical models to support and inform decision-making. Further, it is clear that the nature of such uncertainty depends on the quality of information in (a) the forcing data (driver information), (b) the model code (prior information), and (c) the specific values of inferred model components that localize the model to the system of interest (inferred information). Of course, the relative quality of each varies with geophysical discipline and specific application. In this talk I will discuss a structured approach to characterizing how 'Information', and hence 'Uncertainty', is coded into the structures of physics-based geophysical models. I propose that a better understanding of what is meant by "Information", and how it is embodied in models and data, can offer a structured (less ad-hoc), robust and insightful basis for diagnostic learning through the model-data juxtaposition. In some fields, a natural consequence may be to emphasize the a priori role of System Architecture (Process Modeling) over that of the selection of System Parameterization, thereby emphasizing the more creative aspect of scientific investigation - the use of models for Discovery and Learning.

  13. Genetic Code Evolution Reveals the Neutral Emergence of Mutational Robustness, and Information as an Evolutionary Constraint

    Directory of Open Access Journals (Sweden)

    Steven E. Massey

    2015-04-01

    Full Text Available The standard genetic code (SGC) is central to molecular biology and its origin and evolution is a fundamental problem in evolutionary biology, the elucidation of which promises to reveal much about the origins of life. In addition, we propose that study of its origin can also reveal some fundamental and generalizable insights into mechanisms of molecular evolution, utilizing concepts from complexity theory. The first is that beneficial traits may arise by non-adaptive processes, via a process of “neutral emergence”. The structure of the SGC is optimized for the property of error minimization, which reduces the deleterious impact of point mutations. Via simulation, it can be shown that genetic codes with error minimization superior to the SGC can emerge in a neutral fashion simply by a process of genetic code expansion via tRNA and aminoacyl-tRNA synthetase duplication, whereby similar amino acids are added to codons related to that of the parent amino acid. This process of neutral emergence has implications beyond that of the genetic code, as it suggests that not all beneficial traits have arisen by the direct action of natural selection; we term these “pseudaptations”, and discuss a range of potential examples. Secondly, consideration of genetic code deviations (codon reassignments) reveals that these are mostly associated with a reduction in proteome size. This code malleability implies the existence of a proteomic constraint on the genetic code, proportional to the size of the proteome (P), and that its reduction in size leads to an “unfreezing” of the codon-amino acid mapping that defines the genetic code, consistent with Crick’s Frozen Accident theory. The concept of a proteomic constraint may be extended to propose a general informational constraint on genetic fidelity, which may be used to explain, variously, differences in mutation rates in genomes with differing proteome sizes, differences in DNA repair capacity and genome

  14. Mutual information of sparsely coded associative memory with self-control and ternary neurons.

    Science.gov (United States)

    Bollé, D; Dominguez, D R; Amari, S

    2000-01-01

    The influence of a macroscopic time-dependent threshold on the retrieval dynamics of attractor associative memory models with ternary neurons {-1, 0, +1} is examined. If the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the memorized patterns in the model, adapting itself in the course of the time evolution, it guarantees an autonomous functioning of the model. Especially in the limit of sparse coding, it is found that this self-control mechanism considerably improves the quality of the fixed-point retrieval dynamics, in particular the storage capacity, the basins of attraction and the information content. The mutual information is shown to be the relevant parameter to study the retrieval quality of such sparsely coded models. Numerical results confirm these observations.

  15. Construction of Rate-Compatible LDPC Codes Utilizing Information Shortening and Parity Puncturing

    Directory of Open Access Journals (Sweden)

    Jones Christopher R

    2005-01-01

    Full Text Available This paper proposes a method for constructing rate-compatible low-density parity-check (LDPC) codes. The construction considers the problem of optimizing a family of rate-compatible degree distributions as well as the placement of bipartite graph edges. A hybrid approach that combines information shortening and parity puncturing is proposed. Local graph conditioning techniques for the suppression of error floors are also included in the construction methodology.
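
    As background for how shortening and puncturing trade off (a standard relationship, not a result specific to this construction): starting from a mother code of length n with k information bits, shortening s information bits and puncturing p parity bits gives an effective rate

        \[
        R_{\text{eff}} \;=\; \frac{k - s}{\,n - s - p\,},
        \]

    so shortening pushes the rate down while puncturing pushes it up, which is what lets a single mother code serve a whole family of rates.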

  16. Sensorineural hearing loss amplifies neural coding of envelope information in the central auditory system of chinchillas.

    Science.gov (United States)

    Zhong, Ziwei; Henry, Kenneth S; Heinz, Michael G

    2014-03-01

    People with sensorineural hearing loss often have substantial difficulty understanding speech under challenging listening conditions. Behavioral studies suggest that reduced sensitivity to the temporal structure of sound may be responsible, but underlying neurophysiological pathologies are incompletely understood. Here, we investigate the effects of noise-induced hearing loss on coding of envelope (ENV) structure in the central auditory system of anesthetized chinchillas. ENV coding was evaluated noninvasively using auditory evoked potentials recorded from the scalp surface in response to sinusoidally amplitude modulated tones with carrier frequencies of 1, 2, 4, and 8 kHz and a modulation frequency of 140 Hz. Stimuli were presented in quiet and in three levels of white background noise. The latency of scalp-recorded ENV responses was consistent with generation in the auditory midbrain. Hearing loss amplified neural coding of ENV at carrier frequencies of 2 kHz and above. This result may reflect enhanced ENV coding from the periphery and/or an increase in the gain of central auditory neurons. In contrast to expectations, hearing loss was not associated with a stronger adverse effect of increasing masker intensity on ENV coding. The exaggerated neural representation of ENV information shown here at the level of the auditory midbrain helps to explain previous findings of enhanced sensitivity to amplitude modulation in people with hearing loss under some conditions. Furthermore, amplified ENV coding may potentially contribute to speech perception problems in people with cochlear hearing loss by acting as a distraction from more salient acoustic cues, particularly in fluctuating backgrounds. Copyright © 2013 Elsevier B.V. All rights reserved.
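
    The stimuli described are sinusoidally amplitude-modulated (SAM) tones, s(t) = [1 + m sin(2π f_m t)] sin(2π f_c t). The sketch below generates one with the carrier and modulation frequencies named in the abstract; the sample rate, duration, depth and level calibration are assumed values, not the study's.

        # Hedged sketch of a SAM tone like the stimuli described; calibration is assumed.
        import numpy as np

        fs = 48_000                              # sample rate (assumed)
        t = np.arange(0, 0.5, 1 / fs)            # 500 ms stimulus (assumed)
        fc, fm, m = 4000.0, 140.0, 1.0           # 4 kHz carrier, 140 Hz modulation, full depth
        sam = (1 + m * np.sin(2 * np.pi * fm * t)) * np.sin(2 * np.pi * fc * t)
        sam /= np.abs(sam).max()                 # normalize before scaling to presentation level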

  17. Finding hot-spots and hot- moments of DOC concentrations in two streams feeding a drinking water reservoir

    Science.gov (United States)

    Oosterwoud, Marieke; Musolff, Andreas; Fleckenstein, Jan H.

    2014-05-01

    The flux of dissolved organic carbon (DOC) derived from soils is a significant term in terrestrial carbon budgets and, as a result, a dominant link between terrestrial and aquatic ecosystems. Since surface waters are a main source of drinking water, increasing DOC concentrations are a cause of concern across Europe, as they can transport contaminants and negatively affect drinking water treatment processes. Downstream DOC concentrations are the sum of headwater inputs, in combination with progressive downstream alterations by inflowing water with its own DOC concentrations. Better knowledge of spatial and temporal delivery of DOC in catchments is required to understand the mechanisms behind reported long-term changes in DOC fluxes from soils to surface waters. The aim of this study is to identify where and when increased DOC concentrations occur within two catchments in the Harz mountains that have different land use and feed a drinking water reservoir. The Hassel and Rappbode catchments are approximately equal in size. However, they differ in their land-use. The Hassel catchment has a considerable contribution of arable land compared to the Rappbode catchment, which is mainly forested. We combined standard synoptic sampling (biweekly) with high-frequency UV-Vis analysis of DOC concentrations and chemical composition in stream waters during one complete hydrological year. Through the synoptic sampling we obtained spatially detailed information about the (sub)catchments' DOC export, whereas the continuous UV-Vis measurements provided detailed information on the DOC-discharge behavior during different hydrological conditions. Results from the sampling and monitoring will be presented. We found DOC exports to be largest in the agriculturally dominated catchment. However, temporal variability in DOC export was higher than the spatial variability within both catchments. We presume that it is likely that DOC exports are mainly driven by inputs from the riparian soils

  18. Color information processing (coding and synthesis) with fractional Fourier transforms and digital holography.

    Science.gov (United States)

    Chen, Linfei; Zhao, Daomu

    2007-11-26

    In this paper, we propose a new method for color image coding and synthesis based on fractional Fourier transforms and wavelength multiplexing with digital holography. A color image is divided into three channels and each channel, in which the information is encrypted with different wavelength, fractional orders and random phase masks, is independently encrypted or synthesized. The system parameters are additional keys and this method would improve the security of information encryption. The images are fused or subtracted by phase shifting technique. The possible optical implementations for color image encryption and synthesis are also proposed with some simulation results that show the possibility of the proposed idea.

  19. Software package as an information center product. [Activities of Argonne Code Center

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M. K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems, reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables. (RWR)

  20. Nonlinear and threshold-dominated runoff generation controls DOC export in a small peat catchment

    Science.gov (United States)

    Birkel, C.; Broder, T.; Biester, H.

    2017-03-01

    We used a relatively simple two-layer, coupled hydrology-biogeochemistry model to simultaneously simulate streamflow and stream dissolved organic carbon (DOC) concentrations in a small lead and arsenic contaminated upland peat catchment in northwestern Germany. The model procedure was informed by an initial data mining analysis, in combination with regression relationships of discharge, DOC, and element export. We assessed the internal model DOC processing based on stream DOC hysteresis patterns and 3-hourly time step groundwater level and soil DOC data for two consecutive summer periods in 2013 and 2014. The parsimonious model (i.e., few calibrated parameters) showed the importance of nonlinear and rapid near-surface runoff generation mechanisms that caused around 60% of simulated DOC load. The total load was high even though these pathways were only activated during storm events on average 30% of the monitoring time—as also shown by the experimental data. Overall, the drier period 2013 resulted in increased nonlinearity but exported less DOC (115 kg C ha⁻¹ yr⁻¹ ± 11 kg C ha⁻¹ yr⁻¹) compared to the equivalent but wetter period in 2014 (189 kg C ha⁻¹ yr⁻¹ ± 38 kg C ha⁻¹ yr⁻¹). The exceedance of a critical water table threshold (-10 cm) triggered a rapid near-surface runoff response with associated higher DOC transport connecting all available DOC pools and subsequent dilution. We conclude that the combination of detailed experimental work with relatively simple, coupled hydrology-biogeochemistry models not only allowed the model to be internally constrained but also provided important insight into how DOC and tightly coupled pollutants or trace elements are mobilized.

  1. Relative importance of multiple factors on terrestrial loading of DOC to Arctic river networks

    Energy Technology Data Exchange (ETDEWEB)

    Kicklighter, David W. [Ecosystem Center, The; Hayes, Daniel J [ORNL; Mcclelland, James W [University of Texas; Peterson, Bruce [Marine Biological Laboratory; Mcguire, David [University of Alaska; Melillo, Jerry [Marine Biological Laboratory

    2014-01-01

    Terrestrial carbon dynamics influence the contribution of dissolved organic carbon (DOC) to river networks in addition to controlling carbon fluxes between the land surface and the atmosphere. In this study, we use a biogeochemical process model to simulate the lateral transfer of DOC from land to the Arctic Ocean via riverine transport. We estimate that the pan-arctic watershed has contributed, on average, 32 Tg C/yr of DOC to the Arctic Ocean over the 20th century with most coming from the extensive area of boreal deciduous needle-leaved forests and forested wetlands in Eurasian watersheds. We also estimate that the rate of terrestrial DOC loading has been increasing by 0.037 Tg C/yr2 over the 20th century primarily as a result of increases in air temperatures and precipitation. These increases have been partially compensated by decreases in terrestrial DOC loading caused by wildfires. Other environmental factors (CO2 fertilization, ozone pollution, atmospheric nitrogen deposition, timber harvest, agriculture) are estimated to have relatively small effects on terrestrial DOC loading to arctic rivers. The effects of the various environmental factors on terrestrial carbon dynamics have both compensated and enhanced concurrent effects on hydrology to influence terrestrial DOC loading. Future increases in riverine DOC concentrations and export may occur from warming-induced increases in terrestrial DOC production associated with enhanced microbial metabolism and the exposure of additional organic matter from permafrost degradation along with decreases in water yield associated with warming-induced increases in evapotranspiration. Improvements in simulating terrestrial DOC loading to pan-arctic rivers in the future will require better information on the spatial distribution of precipitation and its temporal trends, carbon dynamics of larch-dominated ecosystems in eastern Siberia, and the role of industrial organic effluents on carbon budgets of rivers in western

  2. Satellite communications in Canada: A DOC perspective

    Science.gov (United States)

    Stursberg, Richard

    The role of the Canadian government and, in particular, of the Department of Communications (DOC) in the evolution and growth of the Canadian communications satellite industry is discussed. Activities by DOC which affect communications technology include the following: (1) DOC undertakes research and development of enabling technologies; (2) promotes the use and diffusion of these technologies through applications development; (3) negotiates spectrum and orbit arrangements in the domestic and international arena; (4) assists in the promotion and marketing of Canadian technologies abroad; and (5) has overall responsibility of telecommunications policy including development of standards and regulations. A brief description is provided of global factors which are expected to affect technology and applications development in the near future. Strategic program reviews undertaken by the Satellite Communications Application Program, the research and development program, and the Government Telecommunications Agency are described.

  3. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 to 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  4. Technique for using a geometry and visualization system to monitor and manipulate information in other codes

    Science.gov (United States)

    Dickens, Thomas P.

    1992-01-01

    A technique was developed to allow the Aero Grid and Paneling System (AGPS), a geometry and visualization system, to be used as a dynamic real-time geometry monitor, manipulator, and interrogator for other codes. This technique involves the direct connection of AGPS with one or more external codes through the use of Unix pipes. AGPS has several commands that control communication with the external program. The external program uses several special subroutines that allow simple, direct communication with AGPS. The external program creates AGPS command lines and transmits them over the pipes or communicates on a subroutine level. AGPS executes the commands, displays graphics/geometry information, and transmits the required solutions back to the external program. The basic ideas discussed in this paper could easily be implemented in other graphics/geometry systems currently in use or under development.
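
    A generic sketch of the pattern described, not AGPS's actual command set or executable (both are hypothetical here): the external code writes command lines down a pipe to the geometry tool and reads replies back, instead of linking against it directly.

        # Generic pipe-driven command pattern; "agps", its flag and its commands are hypothetical.
        import subprocess

        tool = subprocess.Popen(
            ["agps", "-server"],              # launch the geometry/visualization tool
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True, bufsize=1,
        )

        def send_command(line):
            """Transmit one command line over the pipe and return the tool's one-line reply."""
            tool.stdin.write(line + "\n")
            tool.stdin.flush()
            return tool.stdout.readline().strip()

        print(send_command("CREATE CURVE C1 0,0,0 1,0,0"))   # hypothetical AGPS-style command
        print(send_command("QUERY LENGTH C1"))
        tool.stdin.close()
        tool.wait()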

  5. LDPC product coding scheme with extrinsic information for bit patterned media recording

    Directory of Open Access Journals (Sweden)

    Seongkwon Jeong

    2017-05-01

    Full Text Available Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next generation storage system to achieve an areal density beyond 1 Tb/in². Each recording bit is stored in a fabricated magnetic island and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in², the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur. ITI and ISI degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows an improved bit error rate performance compared to that in which one LDPC code is used.

  6. LDPC product coding scheme with extrinsic information for bit patterned media recording

    Science.gov (United States)

    Jeong, Seongkwon; Lee, Jaejin

    2017-05-01

    Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next generation storage system to achieve an areal density beyond 1 Tb/in². Each recording bit is stored in a fabricated magnetic island and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in², the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur. ITI and ISI degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows an improved bit error rate performance compared to that in which one LDPC code is used.

  7. What Does It Take to Produce Interpretation? Informational, Peircean, and Code-Semiotic Views on Biosemiotics

    Energy Technology Data Exchange (ETDEWEB)

    Brier, Soren; Joslyn, Cliff A.

    2013-04-01

    This paper presents a critical analysis of code-semiotics, which we see as the latest attempt to create a paradigmatic foundation for solving the question of the emergence of life and consciousness. We view code semiotics as an attempt to revise the empirical scientific Darwinian paradigm, and to go beyond the complex systems, emergence, self-organization, and informational paradigms, and also the selfish gene theory of Dawkins and the Peircean pragmaticist semiotic theory built on the simultaneous types of evolution. As such it is a new and bold attempt to use semiotics to solve the problems created by the evolutionary paradigm’s commitment to produce a theory of how to connect the two sides of the Cartesian dualistic view of physical reality and consciousness in a consistent way.

  8. Change to an informal interview dress code improves residency applicant perceptions.

    Science.gov (United States)

    Hern, H Gene; Wills, Charlotte P; Johnson, Brian

    2015-01-01

    Residency interview apparel has traditionally been the dark business suit. We changed the interview dress code from a traditionally established unwritten 'formal' attire to an explicitly described 'informal' attire. We sought to assess if the change in dress code attire changed applicants' perceptions of the residency program or decreased costs. The authors conducted an anonymous survey of applicants applying to one emergency medicine residency program during two application cycles ending in 2012 and 2013. Applicants were asked if the change in dress code affected their perception of the program, comfort level, overall costs and how it affected their rank lists. We sent the survey to 308 interviewed applicants over two years. Of those, 236 applicants completed the survey for a combined response rate of 76.6% (236/308). Among respondents, 85.1% (200 of 235) stated they appreciated the change; 66.7% (154 of 231) stated the change caused them to worry more about what to wear. Males were more uncomfortable than females due to the lack of uniformity on the interview day (18.5% of males [25/135] vs. 7.4% of females [7/95], collapsed results p-value 0.008). A total of 27.7% (64/231) agreed that the costs were less overall. The change caused 50 of 230 (21.7%) applicants to rank the program higher on their rank list and only one applicant to rank the program lower. A change to a more informal dress code resulted in more comfort and fewer costs for applicants to a single residency program. The change also resulted in some applicants placing the program higher on their rank order list.

  9. The role of stochasticity in an information-optimal neural population code

    Science.gov (United States)

    Stocks, N. G.; Nikitin, A. P.; McDonnell, M. D.; Morse, R. P.

    2009-12-01

    In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate coding paradigm) a population of neurons that each displays Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise; in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture and, hence, code that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
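
    A hedged Monte Carlo sketch of the first model family only (a McCulloch-Pitts pooling array), estimating the mutual information between a common input and the population spike count as the noise standard deviation varies. The identical thresholds, bin counts and sample sizes are assumptions for illustration; this is not the paper's optimisation over neural parameters.

        # Estimate I(X; population count) for N threshold units with independent Gaussian noise.
        import numpy as np

        rng = np.random.default_rng(1)

        def mi_estimate(sigma, N=15, samples=200_000, x_bins=40):
            x = rng.normal(size=samples)                           # common input signal
            noise = rng.normal(scale=sigma, size=(samples, N))     # independent noise per unit
            count = ((x[:, None] + noise) > 0).sum(axis=1)         # population output: 0..N
            edges = np.quantile(x, np.linspace(0, 1, x_bins + 1)[1:-1])
            xi = np.digitize(x, edges)                             # discretize x for the estimate
            joint = np.zeros((x_bins, N + 1))
            np.add.at(joint, (xi, count), 1.0)
            joint /= joint.sum()
            px = joint.sum(axis=1, keepdims=True)
            pc = joint.sum(axis=0, keepdims=True)
            nz = joint > 0
            return float((joint[nz] * np.log2(joint[nz] / (px @ pc)[nz])).sum())

        for sigma in (0.1, 0.5, 1.0, 2.0):
            print(f"sigma={sigma}: I(X; count) ~ {mi_estimate(sigma):.3f} bits")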

  10. Quality optimized medical image information hiding algorithm that employs edge detection and data coding.

    Science.gov (United States)

    Al-Dmour, Hayat; Al-Ani, Ahmed

    2016-04-01

    The present work has the goal of developing a secure medical imaging information system based on a combined steganography and cryptography technique. It attempts to securely embed a patient's confidential information into his/her medical images. The proposed information security scheme conceals coded Electronic Patient Records (EPRs) into medical images in order to protect the EPRs' confidentiality without affecting the image quality and particularly the Region of Interest (ROI), which is essential for diagnosis. The secret EPR data is converted into ciphertext using a private symmetric encryption method. Since the Human Visual System (HVS) is less sensitive to alterations in sharp regions compared to uniform regions, a simple edge detection method has been introduced to identify and embed in edge pixels, which will lead to an improved stego image quality. In order to increase the embedding capacity, the algorithm embeds a variable number of bits (up to 3) in edge pixels based on the strength of edges. Moreover, to increase the efficiency, two message coding mechanisms have been utilized to enhance the ±1 steganography. The first one, which is based on Hamming code, is simple and fast, while the other, which is known as the Syndrome Trellis Code (STC), is more sophisticated as it attempts to find a stego image that is close to the cover image through minimizing the embedding impact. The proposed steganography algorithm embeds the secret data bits into the Region of Non Interest (RONI), where, due to its importance, the ROI is preserved from modification. The experimental results demonstrate that the proposed method can embed a large amount of secret data without leaving a noticeable distortion in the output image. The effectiveness of the proposed algorithm is also proven using an efficient steganalysis technique. The proposed medical imaging information system proved to be capable of concealing EPR data and producing imperceptible stego images with minimal
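
    The Hamming-based message coding mentioned above is, in its textbook form, matrix embedding: three message bits are hidden in seven cover bits (for example, least significant bits of selected edge pixels) while changing at most one of them. The sketch below shows that generic trick only; it is not the paper's edge-adaptive ±1 or STC scheme.

        # Generic Hamming (7,4) matrix embedding: hide 3 bits in 7 cover bits, flip <= 1 bit.
        import numpy as np

        # Parity-check matrix of the (7,4) Hamming code: columns are 1..7 in binary.
        H = np.array([[int(b) for b in format(i, "03b")] for i in range(1, 8)]).T  # shape (3, 7)

        def embed(cover7, msg3):
            """Return 7 stego bits whose syndrome equals the 3 message bits."""
            cover7, msg3 = np.array(cover7) % 2, np.array(msg3) % 2
            syndrome = (H @ cover7) % 2
            diff = syndrome ^ msg3
            stego = cover7.copy()
            if diff.any():                                   # flip the single position whose
                pos = int("".join(map(str, diff)), 2) - 1    # H-column equals the difference
                stego[pos] ^= 1
            return stego

        def extract(stego7):
            return (H @ np.array(stego7)) % 2

        stego = embed([1, 0, 1, 1, 0, 0, 1], [1, 0, 1])
        print(stego, extract(stego))                         # extract() recovers [1, 0, 1]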

  11. Radiocarbon in marine dissolved organic carbon (DOC)

    NARCIS (Netherlands)

    Clercq, M. le; Plicht, J. van der; Meijer, H.A.J.; Baar, H.J.W. de

    Dissolved Organic Carbon (DOC) plays an important role in the ecology and carbon cycle in the ocean. Analytical problems with concentration and isotope ratio measurements have hindered its study. We have constructed a new analytical method based on supercritical oxidation for the determination of

  12. Documenting CMIP6 with ES-DOC

    Science.gov (United States)

    Greenslade, Mark

    2017-04-01

    The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is improvements in the documentation experience and broadening the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding sub-mip objectives, reviewing citations, exploring component properties of configured models, visualizing inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. During this PICO session scientists will be walked through demonstrations of various aspects of the ES-DOC eco-system. Documentation will be created & published, viewed and compared. Following the walk-throughs scientists will realize that documenting their work has been greatly simplified and provides concrete benefits.

  13. The eDoc-Server Project Building an Institutional Repository for the Max Planck Society

    CERN Document Server

    Beier, Gerhard

    2004-01-01

    With the eDoc-Server the Heinz Nixdorf Center for Information Management in the Max Planck Society (ZIM) provides the research institutes of the Max Planck Society (MPS) with a platform to disseminate, store, and manage their scientific output. Moreover, eDoc serves as a tool to facilitate and promote open access to scientific information and primary sources. Since its introduction in October 2002 eDoc has gained high visibility within the MPS. It has been backed by strong institutional commitment to open access as documented in the 'Berlin Declaration on Open Access to the Data of the Sciences and Humanities', which was initiated by the MPS and found large support among major research organizations in Europe. This paper will outline the concept as well as the current status of the eDoc-Server, providing an example for the development and introduction of an institutional repository in a multi-disciplinary research organization.

  14. LangDoc: Bibliographic Infrastructure for Linguistic Typology

    Directory of Open Access Journals (Sweden)

    Harald Hammarström

    2011-06-01

    Full Text Available The present paper describes the ongoing project (LangDoc) to build a bibliography website for linguistic typology, with a near-complete database of references to documents that contain descriptive data on the languages of the world. This is intended to provide typologists with a more precise and comprehensive way to search for information on languages, and for the specific kind of information that they are interested in. The annotation scheme devised is a trade-off between annotation effort and search desiderata. The end goal is a website with browse, search, update, new items subscription and download facilities, which can hopefully be enriched by spontaneous collaborative efforts.

  15. Change to an Informal Interview Dress Code Improves Residency Applicant Perceptions

    Directory of Open Access Journals (Sweden)

    Hern, H. Gene Jr.

    2014-12-01

    Full Text Available Introduction: Residency interview apparel has traditionally been the dark business suit. We changed the interview dress code from a traditionally established unwritten ‘formal’ attire to an explicitly described ‘informal’ attire. We sought to assess if the change in dress code attire changed applicants’ perceptions of the residency program or decreased costs. Methods: The authors conducted an anonymous survey of applicants applying to one emergency medicine residency program during two application cycles ending in 2012 and 2013. Applicants were asked if the change in dress code affected their perception of the program, comfort level, overall costs and how it affected their rank lists. Results: We sent the survey to 308 interviewed applicants over two years. Of those, 236 applicants completed the survey for a combined response rate of 76.6% (236/308). Among respondents, 85.1% (200 of 235) stated they appreciated the change; 66.7% (154 of 231) stated the change caused them to worry more about what to wear. Males were more uncomfortable than females due to the lack of uniformity on the interview day (18.5% of males [25/135] vs. 7.4% of females [7/95], collapsed results p-value 0.008). A total of 27.7% (64/231) agreed that the costs were less overall. The change caused 50 of 230 (21.7%) applicants to rank the program higher on their rank list and only one applicant to rank the program lower. Conclusion: A change to a more informal dress code resulted in more comfort and fewer costs for applicants to a single residency program. The change also resulted in some applicants placing the program higher on their rank order list. [West J Emerg Med. 2015;16(1):127-132.]

  16. Vertigo and the processing of vestibular information: A review in the context of predictive coding.

    Science.gov (United States)

    Klingner, Carsten M; Axer, Hubertus; Brodoehl, Stefan; Witte, Otto W

    2016-12-01

    This article investigates the processing of vestibular information by interpreting current experimental knowledge in the framework of predictive coding. We demonstrate that this theoretical framework gives us insights into several important questions regarding specific properties of the vestibular system. Specifically, we discuss why the vestibular network is more spatially distributed than other sensory networks, why a mismatch in the vestibular system is more clinically disturbing than in other sensory systems, why the vestibular system is only marginally affected by most cerebral lesions, and whether there is a primary vestibular cortex. The use of predictive coding as a theoretical framework further points to some problems with the current interpretation of results gained from vestibular stimulation studies. In particular, we argue that cortical responses to vestibular stimuli cannot be interpreted in the same way as responses of other sensory modalities. Finally, we discuss the implications of the new insights, hypotheses and problems identified in this review for further directions of research on vestibular information processing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Models of neural networks temporal aspects of coding and information processing in biological systems

    CERN Document Server

    Hemmen, J; Schulten, Klaus

    1994-01-01

    Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of freedom. This capacity of a neural network rests on the fact that the neuronal action potential is a short, say 1 ms, spike, localized in space and time. Spatial as well as temporal correlations of activity may represent different states of a network. In particular, temporal correlations of activity may express that neurons process the same "object" of, for example, a visual scene by spiking at the very same time. The traditional description of a neural network through a firing rate, the famous S-shaped curve, presupposes a wide time window of, say, at least 100 ms. It thus fails to exploit the capacity to "bind" sets of coherently firing neurons for the purpose of both scene segmentation and figure-ground segregatio...

  18. DNA as a Binary Code: How the Physical Structure of Nucleotide Bases Carries Information

    Science.gov (United States)

    McCallister, Gary

    2005-01-01

    The DNA triplet code also functions as a binary code. Because double-ring compounds cannot bind to double-ring compounds in the DNA code, the sequence of bases classified simply as purines or pyrimidines can encode for smaller groups of possible amino acids. This is an intuitive approach to teaching the DNA code. (Contains 6 figures.)
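    The purine/pyrimidine reading described above is easy to make concrete. The following is a minimal illustrative sketch (ours, not the article's); treating purines as 1 and pyrimidines as 0 is an arbitrary labelling choice.

```python
# Sketch: read a DNA sequence as a purine/pyrimidine binary string.
PURINES = {"A", "G"}      # double-ring bases
PYRIMIDINES = {"C", "T"}  # single-ring bases

def to_binary(seq: str) -> str:
    """Map each base to 1 (purine) or 0 (pyrimidine); the 0/1 labels are arbitrary."""
    bits = []
    for base in seq.upper():
        if base in PURINES:
            bits.append("1")
        elif base in PYRIMIDINES:
            bits.append("0")
        else:
            raise ValueError(f"unexpected base: {base}")
    return "".join(bits)

print(to_binary("ATGGCA"))  # -> 101101 (A, G are purines; T, C are pyrimidines)
```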

  19. Improved Side Information Generation for Distributed Video Coding by Exploiting Spatial and Temporal Correlations

    Directory of Open Access Journals (Sweden)

    Ye Shuiming

    2009-01-01

    Full Text Available Distributed video coding (DVC) is a video coding paradigm allowing low complexity encoding for emerging applications such as wireless video surveillance. Side information (SI) generation is a key function in the DVC decoder, and plays a central role in determining the performance of the codec. This paper proposes an improved SI generation method for DVC, which exploits both spatial and temporal correlations in the sequences. Partially decoded Wyner-Ziv (WZ) frames, based on initial SI obtained by motion-compensated temporal interpolation, are exploited to improve the performance of the whole SI generation. More specifically, an enhanced temporal frame interpolation is proposed, including motion vector refinement and smoothing, optimal compensation mode selection, and a new matching criterion for motion estimation. The improved SI technique is also applied to a new hybrid spatial and temporal error concealment scheme to conceal errors in WZ frames. Simulation results show that the proposed scheme can achieve up to 1.0 dB improvement in rate distortion performance in WZ frames for video with high motion, when compared to state-of-the-art DVC. In addition, both the objective and perceptual qualities of the corrupted sequences are significantly improved by the proposed hybrid error concealment scheme, outperforming both spatial and temporal concealments alone.
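    To make the starting point of such a scheme concrete, the sketch below implements plain block-based motion-compensated temporal interpolation between two key frames, the kind of initial SI the paper then refines; the block size, search range and cost function are illustrative choices of ours, not the paper's refined method.

```python
import numpy as np

def mcti_side_information(prev, nxt, block=8, search=4):
    """Estimate the missing middle frame from two key frames.

    Assumes frame dimensions are multiples of the block size.
    """
    h, w = prev.shape
    si = np.zeros((h, w), dtype=np.float64)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            ref = nxt[by:by + block, bx:bx + block].astype(np.float64)
            best_dy, best_dx, best_cost = 0, 0, np.inf
            # Motion search: where did this block come from in the previous key frame?
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = prev[y:y + block, x:x + block].astype(np.float64)
                    cost = np.abs(cand - ref).sum()
                    if cost < best_cost:
                        best_cost, best_dy, best_dx = cost, dy, dx
            # The WZ frame sits midway between the key frames, so halve the vector
            # and average the two motion-compensated blocks.
            y0 = min(max(by + best_dy // 2, 0), h - block)
            x0 = min(max(bx + best_dx // 2, 0), w - block)
            comp = prev[y0:y0 + block, x0:x0 + block].astype(np.float64)
            si[by:by + block, bx:bx + block] = 0.5 * (comp + ref)
    return si.astype(prev.dtype)

# Shape-only usage example on random "key frames".
prev = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
nxt = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(mcti_side_information(prev, nxt).shape)  # (64, 64)
```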

  20. "There are too many, but never enough": qualitative case study investigating routine coding of clinical information in depression.

    Directory of Open Access Journals (Sweden)

    Kathrin Cresswell

    Full Text Available BACKGROUND: We sought to understand how clinical information relating to the management of depression is routinely coded in different clinical settings and the perspectives of and implications for different stakeholders with a view to understanding how these may be aligned. MATERIALS AND METHODS: Qualitative investigation exploring the views of a purposefully selected range of healthcare professionals, managers, and clinical coders spanning primary and secondary care. RESULTS: Our dataset comprised 28 semi-structured interviews, a focus group, documents relating to clinical coding standards and participant observation of clinical coding activities. We identified a range of approaches to coding clinical information including templates and order entry systems. The challenges inherent in clearly establishing a diagnosis, identifying appropriate clinical codes and possible implications of diagnoses for patients were particularly prominent in primary care. Although a range of managerial and research benefits were identified, there were no direct benefits from coded clinical data for patients or professionals. Secondary care staff emphasized the role of clinical coders in ensuring data quality, which was at odds with the policy drive to increase real-time clinical coding. CONCLUSIONS: There was overall no evidence of clear-cut direct patient care benefits to inform immediate care decisions, even in primary care where data on patients with depression were more extensively coded. A number of important secondary uses were recognized by healthcare staff, but the coding of clinical data to serve these ends was often poorly aligned with clinical practice and patient-centered considerations. The current international drive to encourage clinical coding by healthcare professionals during the clinical encounter may need to be critically examined.

  1. Feature-selective Attention in Frontoparietal Cortex: Multivoxel Codes Adjust to Prioritize Task-relevant Information.

    Science.gov (United States)

    Jackson, Jade; Rich, Anina N; Williams, Mark A; Woolgar, Alexandra

    2017-02-01

    Human cognition is characterized by astounding flexibility, enabling us to select appropriate information according to the objectives of our current task. A circuit of frontal and parietal brain regions, often referred to as the frontoparietal attention network or multiple-demand (MD) regions, are believed to play a fundamental role in this flexibility. There is evidence that these regions dynamically adjust their responses to selectively process information that is currently relevant for behavior, as proposed by the "adaptive coding hypothesis" [Duncan, J. An adaptive coding model of neural function in prefrontal cortex. Nature Reviews Neuroscience, 2, 820-829, 2001]. Could this provide a neural mechanism for feature-selective attention, the process by which we preferentially process one feature of a stimulus over another? We used multivariate pattern analysis of fMRI data during a perceptually challenging categorization task to investigate whether the representation of visual object features in the MD regions flexibly adjusts according to task relevance. Participants were trained to categorize visually similar novel objects along two orthogonal stimulus dimensions (length/orientation) and performed short alternating blocks in which only one of these dimensions was relevant. We found that multivoxel patterns of activation in the MD regions encoded the task-relevant distinctions more strongly than the task-irrelevant distinctions: The MD regions discriminated between stimuli of different lengths when length was relevant and between the same objects according to orientation when orientation was relevant. The data suggest a flexible neural system that adjusts its representation of visual objects to preferentially encode stimulus features that are currently relevant for behavior, providing a neural mechanism for feature-selective attention.

  2. Iterative Multiview Side Information for Enhanced Reconstruction in Distributed Video Coding

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Distributed video coding (DVC) is a new paradigm for video compression based on the information theoretical results of Slepian and Wolf (SW) and Wyner and Ziv (WZ). DVC entails low-complexity encoders as well as separate encoding of correlated video sources. This is particularly attractive for multiview camera systems in video surveillance and camera sensor network applications, where low complexity is required at the encoder. In addition, the separate encoding of the sources implies no communication between the cameras in a practical scenario. This is an advantage since communication is time and power consuming and requires complex networking. In this work, different intercamera estimation techniques for side information (SI) generation are explored and compared in terms of estimation quality, complexity, and rate distortion (RD) performance. Further, a technique called iterative multiview side information (IMSI) is introduced, where the final SI is used in an iterative reconstruction process. The simulation results show that IMSI significantly improves the RD performance for video with significant motion and activity. Furthermore, DVC outperforms AVC/H.264 Intra for video with average and low motion but it is still inferior to the Inter No Motion and Inter Motion modes.

  3. What determines the informativeness of firms' explanations for deviations from the Dutch corporate governance code?

    NARCIS (Netherlands)

    Hooghiemstra, R.B.H.

    2012-01-01

    The comply-or-explain principle is a common feature of corporate governance codes. While prior studies investigated compliance with corporate governance codes as well as the effects of compliance on firm behaviour and performance, explanations for deviations from a corporate governance code remain

  4. CFC (Comment-First-Coding)--A Simple yet Effective Method for Teaching Programming to Information Systems Students

    Science.gov (United States)

    Sengupta, Arijit

    2009-01-01

    Programming courses have always been a difficult part of an Information Systems curriculum. While we do not train Information Systems students to be developers, understanding how to build a system always gives students an added perspective to improve their system design and analysis skills. This teaching tip presents CFC (Comment-First-Coding)--a…
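    As a toy illustration of the comment-first idea (the task and function are ours, not the teaching tip's), the comment skeleton is written first and each step is then filled in beneath its comment:

```python
def average_grade(scores):
    # 1. Validate the input: an empty list has no average.
    if not scores:
        raise ValueError("no scores given")
    # 2. Sum all scores.
    total = sum(scores)
    # 3. Divide by the number of scores to get the mean.
    return total / len(scores)

print(average_grade([70, 85, 91]))  # 82.0
```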

  5. Rank Order Coding: a Retinal Information Decoding Strategy Revealed by Large-Scale Multielectrode Array Retinal Recordings.

    Science.gov (United States)

    Portelli, Geoffrey; Barrett, John M; Hilgen, Gerrit; Masquelier, Timothée; Maccione, Alessandro; Di Marco, Stefano; Berdondini, Luca; Kornprobst, Pierre; Sernagor, Evelyne

    2016-01-01

    How a population of retinal ganglion cells (RGCs) encodes the visual scene remains an open question. Going beyond individual RGC coding strategies, results in salamander suggest that the relative latencies of a RGC pair encode spatial information. Thus, a population code based on this concerted spiking could be a powerful mechanism to transmit visual information rapidly and efficiently. Here, we tested this hypothesis in mouse by recording simultaneous light-evoked responses from hundreds of RGCs, at pan-retinal level, using a new generation of large-scale, high-density multielectrode array consisting of 4096 electrodes. Interestingly, we did not find any RGCs exhibiting a clear latency tuning to the stimuli, suggesting that in mouse, individual RGC pairs may not provide sufficient information. We show that a significant amount of information is encoded synergistically in the concerted spiking of large RGC populations. Thus, the RGC population response described with relative activities, or ranks, provides more relevant information than classical independent spike count- or latency-based codes. In particular, we report for the first time that when considering the relative activities across the whole population, the wave of first stimulus-evoked spikes is an accurate indicator of stimulus content. We show that this coding strategy coexists with classical neural codes, and that it is more efficient and faster. Overall, these novel observations suggest that already at the level of the retina, concerted spiking provides a reliable and fast strategy to rapidly transmit new visual scenes.
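    The core idea of a rank-based readout can be sketched in a few lines; this is our own illustration, not the study's analysis code, and the tie-breaking and treatment of silent cells are arbitrary choices.

```python
import numpy as np

def rank_order_code(first_spike_latencies):
    """Return the rank of each cell in the wave of first stimulus-evoked spikes."""
    latencies = np.asarray(first_spike_latencies, dtype=float)
    order = np.argsort(latencies)          # cell indices, earliest spike first
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(latencies))
    return ranks

# Latencies in ms for 5 hypothetical cells (np.inf = no spike).
latencies = [12.0, 35.5, 8.2, np.inf, 20.1]
print(rank_order_code(latencies))  # [1 3 0 4 2]
```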

  6. Analog Coding.

    Science.gov (United States)

    CODING, ANALOG SYSTEMS, INFORMATION THEORY, DATA TRANSMISSION SYSTEMS, TRANSMITTER RECEIVERS, WHITE NOISE, PROBABILITY, ERRORS, PROBABILITY DENSITY FUNCTIONS, DIFFERENTIAL EQUATIONS, SET THEORY, COMPUTER PROGRAMS

  7. Distributed multi-hypothesis coding of depth maps using texture motion information and optical flow

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Rakêt, Lars Lau

    2013-01-01

    Distributed Video Coding (DVC) is a video coding paradigm allowing a shift of complexity from the encoder to the decoder. Depth maps are images enabling the calculation of the distance of an object from the camera, which can be used in multiview coding in order to generate virtual views, but also in single view coding for motion detection or image segmentation. In this work, we address the problem of depth map video DVC encoding in a single-view scenario. We exploit the motion of the corresponding texture video which is highly correlated with the depth maps. In order to extract the motion...

  8. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
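    For readers unfamiliar with the procedure, the following small sketch shows the standard Richardson extrapolation estimate of observed order of accuracy and discretization error from three systematically refined grids; it is a generic illustration, not the VIVID code itself, and the sample values are made up.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p for a constant grid refinement ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_estimate(f_medium, f_fine, r, p):
    """Extrapolated 'exact' value and the estimated error of the fine-grid solution."""
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    return f_exact, abs(f_exact - f_fine)

# Hypothetical grid-convergence data for a second-order scheme, refinement ratio r = 2.
f3, f2, f1 = 1.040, 1.010, 1.0025   # coarse, medium, fine solutions
p = observed_order(f3, f2, f1, r=2.0)
print(round(p, 3))                   # 2.0
print(richardson_estimate(f2, f1, 2.0, p))  # (1.0, 0.0025)
```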

  9. Spatial coding of ordinal information in short- and long-term memory

    Directory of Open Access Journals (Sweden)

    Veronique eGinsburg

    2015-01-01

    Full Text Available The processing of numerical information induces a spatial response bias: Faster responses to small numbers with the left hand and faster responses to large numbers with the right hand. Most theories agree that long-term representations underlie this so-called SNARC effect (Spatial Numerical Association of Response Codes; Dehaene, Bossini, & Giraux, 1993). However, a spatial response bias was also observed with the activation of temporary position-space associations in working memory (ordinal position effect; van Dijck & Fias, 2011). Items belonging to the beginning of a memorized sequence are responded to faster with the left hand, while items at the end of the sequence are responded to faster with the right hand. The theoretical possibility was put forward that the SNARC effect is an instance of the ordinal position effect, with the empirical consequence that the SNARC effect and the ordinal position effect cannot be observed simultaneously. In two experiments we falsify this claim by demonstrating that the SNARC effect and the ordinal position effect are not mutually exclusive. Consequently, this suggests that the SNARC effect and the ordinal position effect result from the activation of different representations. We conclude that spatial response biases can result from the activation of both pre-existing positions in long-term memory and from temporary space associations in working memory at the same time.

  10. Stereo side information generation in low-delay distributed stereo video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Forchhammer, Søren

    2012-01-01

    Distributed Video Coding (DVC) is a technique that allows shifting the computational complexity from the encoder to the decoder. One of the core elements of the decoder is the creation of the Side Information (SI), which is a hypothesis of what the to-be-decoded frame looks like. Much work on DVC has been carried out: often the decoder can use future and past frames in order to obtain the SI exploiting the time redundancy. Other work has addressed a Multiview scenario; exploiting the frames coming from cameras close to the one we are decoding (usually a left and right camera) it is possible to create SI exploiting the inter-view spatial redundancy. A careful fusion of the two SI should be done in order to use the best part of each SI. In this work we study a Stereo Low-Delay scenario using only two views. Due to the delay constraint we use only past frames of the sequence we are decoding...

  11. Google Docs: an experience in collaborative work in the University

    Directory of Open Access Journals (Sweden)

    Vanesa DELGADO BENITO

    2013-01-01

    Full Text Available The educational environment offers multiple reasons to make use of the new possibilities that Information and Communication Technologies (ICT) provide as an educational resource. The educational experience presented here was carried out in the subject New Technologies applied to Education, which forms part of the study plans for primary school teachers at the University of Burgos (UBU), and whose main goal is to facilitate the acquisition of generic ICT competences for working online. To reach this goal, we fostered the students' active learning, moving from individual to collective work. At first, students were given a text to read and review individually. After that, groups were created to work on the document cooperatively, online, using the office tool Google Docs. After sharing and editing the document, every group made a multimedia presentation bundling all of their contributions. Finally, the presentations made by all of the groups were made public. When the practical part of the course was done, the students answered a short questionnaire about their initial knowledge of Google Docs, their command of the tool, and its didactic usefulness. It is worth noting that 75% of the class did not know the application before the course and that, after using it, 92% said they would use it in their future educational and professional work. This educational experience has been very satisfactory for students and professors alike.

  12. Distributed source coding of video with non-stationary side-information

    NARCIS (Netherlands)

    Meyer, P.F.A.; Westerlaken, R.P.; Klein Gunnewiek, R.; Lagendijk, R.L.

    2005-01-01

    In distributed video coding, the complexity of the video encoder is reduced at the cost of a more complex video decoder. Using the principles of Slepian and Wolf, video compression is then carried out using channel coding principles, under the assumption that the video decoder can temporally predict

  13. DOC Molecule Transporter and Transformation in Marine Microbes

    Science.gov (United States)

    Zheng, Q.; Jiao, N.

    2016-02-01

    The ocean acts as a "sink" of atmospheric CO2, thus mitigating global warming. The recognized biological mechanism for this sink is the "biological pump", which is based on the photosynthetic fixation of CO2 and subsequent carbon export driven mainly by sinking of particulate organic carbon (POC). The ocean possesses a huge dissolved organic carbon (DOC) pool which accounts for about 95% of the total remaining organic carbon. The majority of this DOC pool is recalcitrant to biological degradation and can persist in the water column for thousands of years, constituting carbon sequestration in the ocean. Recently a new concept has been proposed to address this matter, the "microbial carbon pump (MCP)", which refers to the microbial processes that transform labile DOC (LDOC) to recalcitrant DOC (RDOC). The transformation of DOC is carried out by marine microbes, and the pathways and rates of DOC transformation determine the fate and the amount of carbon converted ultimately to CO2 or RDOC. The DOC pool consists of thousands of organic carbon compounds with different biological turnover rates, biological availabilities, and biogeochemical features. While microbial processes modify the composition of the DOC pool, the availability of DOC compounds to microbes shapes microbial diversity and community structure. For instance, the Roseobacter clade and SAR11 clade are dominant bacterial groups in relatively eutrophic and oligotrophic waters respectively. Each clade has different strategies for carbon utilization, and thus different responses to and impacts on the DOC pool in the ocean.

  14. Video coding for 3D-HEVC based on saliency information

    Science.gov (United States)

    Yu, Fang; An, Ping; Yang, Chao; You, Zhixiang; Shen, Liquan

    2016-11-01

    As an extension of High Efficiency Video Coding (HEVC), 3D-HEVC has been widely researched under the impetus of the new generation coding standard in recent years. Compared with H.264/AVC, its compression efficiency is doubled while keeping the same video quality. However, its higher encoding complexity and longer encoding time are not negligible. To reduce the computational complexity and guarantee the subjective quality of virtual views, this paper presents a novel video coding method for 3D-HEVC based on saliency information, which is an important part of the Human Visual System (HVS). First of all, the relationship between the current coding unit and its adjacent units is used to adjust the maximum depth of each largest coding unit (LCU) and determine the SKIP mode reasonably. Then, according to the saliency information of each frame image, the texture and its corresponding depth map will be divided into three regions, that is, salient area, middle area and non-salient area. Afterwards, different quantization parameters will be assigned to different regions to conduct low complexity coding. Finally, the compressed video will generate new view point videos through the renderer tool. As shown in our experiments, the proposed method saves more bit rate than other approaches and achieves up to 38% encoding time reduction without subjective quality loss in compression or rendering.
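    A minimal sketch of the region-wise quantization idea is given below; the saliency thresholds and QP offsets are our own illustrative values, not the parameters used in the paper.

```python
import numpy as np

def region_qp_map(saliency, base_qp=32, hi_thresh=0.66, lo_thresh=0.33):
    """Per-pixel QP map: finer QP in salient areas, coarser where saliency is low."""
    qp = np.full(saliency.shape, base_qp, dtype=int)
    qp[saliency >= hi_thresh] = base_qp - 4   # salient area: better quality
    qp[saliency < lo_thresh] = base_qp + 4    # non-salient area: save bits
    return qp

saliency = np.random.rand(36, 64)             # normalized saliency map in [0, 1)
print(np.unique(region_qp_map(saliency)))     # e.g. [28 32 36]
```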

  15. Creating Tomorrow's Technologists: Contrasting Information Technology Curriculum in North American Library and Information Science Graduate Programs against Code4lib Job Listings

    Science.gov (United States)

    Maceli, Monica

    2015-01-01

    This research study explores technology-related course offerings in ALA-accredited library and information science (LIS) graduate programs in North America. These data are juxtaposed against a text analysis of several thousand LIS-specific technology job listings from the Code4lib jobs website. Starting in 2003, as a popular library technology…

  16. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    Science.gov (United States)

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to meet this end. Our objective is to explore the potential barriers that exist for high quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high quality coding, in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  17. Secret information reconciliation based on punctured low-density parity-check codes for continuous-variable quantum key distribution

    Science.gov (United States)

    Jiang, Xue-Qin; Huang, Peng; Huang, Duan; Lin, Dakai; Zeng, Guihua

    2017-02-01

    Achieving information-theoretic security with practical complexity is of great interest to continuous-variable quantum key distribution in the postprocessing procedure. In this paper, we propose a reconciliation scheme based on punctured low-density parity-check (LDPC) codes. Compared to the well-known multidimensional reconciliation scheme, the present scheme has lower time complexity. Especially when the chosen punctured LDPC code achieves the Shannon capacity, the proposed reconciliation scheme can remove the information that has been leaked to an eavesdropper in the quantum transmission phase. Therefore, there is no information leaked to the eavesdropper after the reconciliation stage. This indicates that the privacy amplification algorithm of the postprocessing procedure is no longer needed after the reconciliation process. These features lead to a higher secret key rate, optimal performance, and availability for the involved quantum key distribution scheme.
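    The effect of puncturing on code rate, which is what makes such a scheme tunable, can be stated in one line; the sketch below is a generic back-of-the-envelope illustration with made-up code parameters, not the paper's construction.

```python
def punctured_rate(n, k, p):
    """Rate of an (n, k) code after puncturing p of its n transmitted symbols."""
    if p >= n - k:
        raise ValueError("too many punctured symbols for this mother code")
    return k / (n - p)

# Hypothetical rate-1/2 mother code: n = 10000, k = 5000.
for p in (0, 1000, 2000):
    print(p, round(punctured_rate(10000, 5000, p), 3))
# 0 0.5, 1000 0.556, 2000 0.625
```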

  18. Changes in DOC treatability: Indications of compositional changes in DOC trends

    Science.gov (United States)

    Worrall, F.; Burt, T. P.

    2009-03-01

    This study considers long-term records of the nature of water colour and coincident water quality and quantity in order to test hypotheses about increases in the release of dissolved organic carbon from peat soils across the northern hemisphere. The study focuses upon the treatment ratio, i.e. the ratio of the coagulant dose required to the water colour of the incoming water, and compares this ratio to possible explanatory variables: pH, conductivity, water temperature, river flow and alkalinity. The study shows that: the annual average increase in treatment ratio is just under 6.5% over a six-year period; there is a long-term increase in the treatment ratio that is independent of changes in river flow, pH, conductivity, water temperature and alkalinity, so a real change in DOC composition is occurring; there is a seasonal cycle in treatment ratio that is also independent of the available water and climate variables; and the upward trend in treatment ratio is declining with time over the study period. The observed trends in DOC composition are consistent with an explanation of increasing DOC concentration and flux based upon changes in flow and temperature, but are not consistent with present explanations based upon changes in atmospheric deposition or upon drought, unless the effects of the drought are short-lived (1-2 years).

  19. Investigations of PAA degradation in aqueous solutions: Impacts of water hardness, salinity and DOC

    Science.gov (United States)

    Peracetic acid (PAA) is used in aquaculture under various conditions for disinfection purposes. However, there is a lack of information about its environmental fate. Therefore, the impact of water hardness, salinity, and dissolved organic carbon (DOC) on PAA degradation within 5 hours was investigat...

  20. Sean Michaletz Directors Post Doc Fellow Report

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Cathy Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-30

    Predicting climate change effects on plant function is a central challenge of global change biology and a primary mission of DOE. Although increasing temperatures and drought have been associated with reduced growth and increased mortality of plants, accurate prediction of such responses is limited by a lack of process-based theory linking climate and whole-plant physiology. This inability to predict forest mortality can cause significant biases in climate forecasts. One way forward is metabolic scaling theory (MST), which proposes that physiologic rates – from cells to the globe – are governed by the rates of resource distribution through vascular networks and the kinetics of resource utilization by metabolic reactions. MST has traditionally not considered rates of resource acquisition from organism-environment interactions, but it has an ideal mechanistic basis for doing so. As a first step towards integrating these processes, Sean has extended MST to characterize effects of temperature and precipitation on plant growth and ecosystem production. Sean’s post doc fellowship aimed to address a remaining shortcoming in that the new theory does not yet consider the physical processes of resource acquisition, and thus cannot mechanistically predict plant performance in a changing climate.

  1. 76 FR 9210 - Draft DOC National Aquaculture Policy

    Science.gov (United States)

    2011-02-16

    ... DOC National Aquaculture Policy AGENCY: Commerce. ACTION: Notice of availability of draft aquaculture... draft national aquaculture policy that supports sustainable aquaculture in the United States. The intent of the policy is to guide DOC's actions and decisions on aquaculture and to provide a national...

  2. Rare diseases in ICD11: making rare diseases visible in health information systems through appropriate coding.

    Science.gov (United States)

    Aymé, Ségolène; Bellet, Bertrand; Rath, Ana

    2015-03-26

    Because of their individual rarity, genetic diseases and other types of rare diseases are under-represented in healthcare coding systems; this contributes to a lack of ascertainment and recognition of their importance for healthcare planning and resource allocation, and prevents clinical research from being performed. Orphanet was given the task to develop an inventory of rare diseases and a classification system which could serve as a template to update International terminologies. When the World Health Organization (WHO) launched the revision process of the International Classification of Diseases (ICD), a Topic Advisory Group for rare diseases was established, managed by Orphanet and funded by the European Commission. So far 5,400 rare diseases listed in the Orphanet database have an endorsed representation in the foundation layer of ICD-11, and are thus provided with a unique identifier in the Beta version of ICD-11, which is 10 times more than in ICD10. A rare disease linearization is also planned. The current beta version is open for public consultation and comments, and to be used for field testing. The adoption by the World Health Assembly is planned for 2017. The overall revision process was carried out with very limited means considering its scope, ambition and strategic significance, and experienced significant hurdles and setbacks. The lack of funding impacted the level of professionalism that could be attained. The contrast between the initially declared goals and the currently foreseen final product is disappointing. In the context of uncertainty around the outcome of the field testing and the potential willingness of countries to adopt this new version, the European Commission Expert Group on Rare Diseases adopted in November 2014 a recommendation for health care coding systems to consider using ORPHA codes in addition to ICD10 codes for rare diseases having no specific ICD10 codes. The Orphanet terminology, classifications and mappings with other

  3. Information rates of probabilistically shaped coded modulation for a multi-span fiber-optic communication system with 64QAM

    Science.gov (United States)

    Fehenberger, Tobias

    2018-02-01

    This paper studies probabilistic shaping in a multi-span wavelength-division multiplexing optical fiber system with 64-ary quadrature amplitude modulation (QAM) input. In split-step fiber simulations and via an enhanced Gaussian noise model, three figures of merit are investigated, which are signal-to-noise ratio (SNR), achievable information rate (AIR) for capacity-achieving forward error correction (FEC) with bit-metric decoding, and the information rate achieved with low-density parity-check (LDPC) FEC. For the considered system parameters and different shaped input distributions, shaping is found to decrease the SNR by 0.3 dB yet simultaneously increase the AIR by up to 0.4 bit per 4D-symbol. The information rates of LDPC-coded modulation with shaped 64QAM input are improved by up to 0.74 bit per 4D-symbol, which is larger than the shaping gain when considering AIRs. This increase is attributed to the reduced coding gap of the higher-rate code that is used for decoding the nonuniform QAM input.
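    As a rough illustration of how such a shaped input is generated, the sketch below builds a Maxwell-Boltzmann distribution over a 64QAM constellation; the shaping parameter and normalization are arbitrary example values, not those used in the paper.

```python
import numpy as np

def mb_shaped_64qam(nu=0.05):
    """Return the 64QAM points and a Maxwell-Boltzmann distribution over them."""
    levels = np.arange(-7, 8, 2)                            # -7, -5, ..., 7
    points = np.array([complex(i, q) for i in levels for q in levels])
    probs = np.exp(-nu * np.abs(points) ** 2)               # favour low-energy points
    probs /= probs.sum()
    return points, probs

points, probs = mb_shaped_64qam()
entropy = -(probs * np.log2(probs)).sum()
print(round(entropy, 2))                              # < 6 bits/symbol: entropy traded for energy
print(round((probs * np.abs(points) ** 2).sum(), 2))  # average symbol energy
```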

  4. Energy Efficiency Requirements in Building Codes, Energy Efficiency Policies for New Buildings. IEA Information Paper

    Energy Technology Data Exchange (ETDEWEB)

    Laustsen, Jens

    2008-03-15

    The aim of this paper is to describe and analyse current approaches to encourage energy efficiency in building codes for new buildings. Based on this analysis, the paper enumerates policy recommendations for enhancing how energy efficiency is addressed in building codes and other policies for new buildings. This paper forms part of the IEA work for the G8 Gleneagles Plan of Action. These recommendations reflect the study of different policy options for increasing energy efficiency in new buildings and examination of other energy efficiency requirements in standards or building codes, such as energy efficiency requirements for major renovation or refurbishment. In many countries, energy efficiency of buildings falls under the jurisdiction of the federal states. Different standards cover different regions or climatic conditions and different types of buildings, such as residential or simple buildings, commercial buildings and more complicated high-rise buildings. There are many different building codes in the world and the intention of this paper is not to cover all codes on each level in all countries. Instead, the paper details different regions of the world and different approaches to standards. In this paper we also evaluate good practices based on local traditions. This project does not seek to identify one best practice amongst the building codes and standards. Instead, different types of codes and different parts of the regulation have been illustrated together with examples of how they have been successfully addressed. To complement this discussion of efficiency standards, this study illustrates how energy efficiency can be improved through such initiatives as efficiency labelling or certification, very best practice buildings with extremely low- or no-energy consumption and other policies to raise buildings' energy efficiency beyond minimum requirements. When referring to the energy saving potentials for buildings, this study uses the analysis of recent IEA

  5. Dissolved Organic Carbon in Marginal, Damaged Peatlands: Using 14C to Understand DOC Losses

    Science.gov (United States)

    Luscombe, D.; Grand-Clement, E.; Garnett, M.; Anderson, K.; Gatis, N.; Benaud, P.; Brazier, R.

    2013-12-01

    in an area of shallow peat (ca. 20-30 cm depth) drained by a medium-sized ditch (50 x 50 cm). Samples of DOC from stream water were taken at low and high flow during 3 separate rain events in Winter-Spring 2013 using automatic pump samplers. Samples of DOC in pore water were taken 2 m away from the ditch at 5 and 15 cm depth on two occasions. Finally, matching bulk peat samples were collected at 5 and 15 cm depth. Intensive monitoring data also provide information on water table depth and level in streams. A neighbouring pristine peat area was used as a control, and DOC pore water and bulk peat soil samples were taken at 5, 15 and 45 cm depth on two occasions. Preliminary results show that DOC lost in streams at high flow contains a greater contribution of bomb-14C compared to that at low flow (107 and 101% modern, respectively). Stream water DOC at low flow had a 14C concentration lower than that in pore water at both 5 and 15 cm depth (105 and 102% modern, respectively), suggesting that low flow stream water DOC is predominantly older than that found in pore water at depth.

  6. Vein Pattern Recognition Using Chain Codes, Spatial Information and Skeleton Fusing

    OpenAIRE

    Hartung, Daniel; Pflug, Anika; Busch, Christoph

    2012-01-01

    Vein patterns are a unique attribute of each individual and can therefore be used as a biometric characteristic. Exploiting the specific near-infrared light absorption properties of blood, the vein capture procedure is convenient and allows contact-less sensors. We propose a new chain-code-based feature encoding method, using spatial and orientation properties of vein patterns. The proposed comparison method has been evaluated in a series of different experiments in single and ...
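    As a primitive such a method builds on, the sketch below computes an 8-directional Freeman chain code of a pixel path; it is our own illustration, not the proposed encoding, and the direction numbering is one common convention.

```python
# Direction indices: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE (y grows upward).
STEP_TO_CODE = {
    (1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
    (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7,
}

def chain_code(path):
    """Encode a sequence of 8-connected (x, y) points as Freeman directions."""
    codes = []
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        codes.append(STEP_TO_CODE[(x1 - x0, y1 - y0)])
    return codes

# A short skeleton segment heading east, then north-east, then north.
print(chain_code([(0, 0), (1, 0), (2, 1), (2, 2)]))  # [0, 1, 2]
```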

  7. Wyner-Ziv Coding of Depth Maps Exploiting Color Motion Information

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    of depth maps exploiting corresponding color information is proposed. Due to the high correlation of the motion in color and corresponding depth videos, motion information from the decoded color signal can effectively be exploited to generate accurate side information for the depth signal, allowing...

  8. 75 FR 29531 - Agency Information Collection Activities; Proposed Collection; Comment Request; Clean Watersheds...

    Science.gov (United States)

    2010-05-26

    ... will not know your identity or contact information unless you provide it in the body of your comment... burden saved by switching from a census to a sampling approach. What is the next step in the process for... Management. [FR Doc. 2010-12651 Filed 5-25-10; 8:45 am] BILLING CODE 6560-50-P ...

  9. Fly photoreceptors demonstrate energy-information trade-offs in neural coding.

    Directory of Open Access Journals (Sweden)

    Jeremy E Niven

    2007-04-01

    Full Text Available Trade-offs between energy consumption and neuronal performance must shape the design and evolution of nervous systems, but we lack empirical data showing how neuronal energy costs vary according to performance. Using intracellular recordings from the intact retinas of four flies, Drosophila melanogaster, D. virilis, Calliphora vicina, and Sarcophaga carnaria, we measured the rates at which homologous R1-6 photoreceptors of these species transmit information from the same stimuli and estimated the energy they consumed. In all species, both information rate and energy consumption increase with light intensity. Energy consumption rises from a baseline, the energy required to maintain the dark resting potential. This substantial fixed cost, approximately 20% of a photoreceptor's maximum consumption, causes the unit cost of information (ATP molecules hydrolysed per bit) to fall as information rate increases. The highest information rates, achieved at bright daylight levels, differed according to species, from approximately 200 bits s^-1 in D. melanogaster to approximately 1,000 bits s^-1 in S. carnaria. Comparing species, the fixed cost, the total cost of signalling, and the unit cost (cost per bit) all increase with a photoreceptor's highest information rate to make information more expensive in higher performance cells. This law of diminishing returns promotes the evolution of economical structures by severely penalising overcapacity. Similar relationships could influence the function and design of many neurons because they are subject to similar biophysical constraints on information throughput.
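    The fixed-cost argument can be illustrated with a toy calculation (the numbers below are made up for illustration and are not the measured values): with a constant maintenance cost plus a cost proportional to rate, the ATP spent per bit falls as the information rate rises.

```python
def atp_per_bit(fixed_cost, cost_per_extra_bit, rate_bits_per_s):
    """Total ATP molecules hydrolysed per second divided by bits per second."""
    total = fixed_cost + cost_per_extra_bit * rate_bits_per_s
    return total / rate_bits_per_s

FIXED = 2.0e7          # ATP/s to hold the dark resting potential (illustrative)
MARGINAL = 1.0e5       # extra ATP/s per additional bit/s (illustrative)
for rate in (100, 500, 1000):
    print(rate, f"{atp_per_bit(FIXED, MARGINAL, rate):.2e}")
# The unit cost drops from 3.0e+05 ATP/bit at 100 bits/s to 1.2e+05 at 1000 bits/s.
```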

  10. Consequences of converting graded to action potentials upon neural information coding and energy efficiency.

    Directory of Open Access Journals (Sweden)

    Biswa Sengupta

    2014-01-01

    Full Text Available Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ∼50% in generator potentials, to ∼3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital; information loss and cost inflation.

  11. Too Few Women, Docs Understand Dangers of Heart Disease

    Science.gov (United States)

    It kills more than all cancers combined, but ... THURSDAY, June 22, 2017 (HealthDay News) -- Heart disease is the leading killer of U.S. women, but ...

  12. LORD: a phenotype-genotype semantically integrated biomedical data tool to support rare disease diagnosis coding in health information systems.

    Science.gov (United States)

    Choquet, Remy; Maaroufi, Meriem; Fonjallaz, Yannick; de Carrara, Albane; Vandenbussche, Pierre-Yves; Dhombres, Ferdinand; Landais, Paul

    Characterizing a rare disease diagnosis for a given patient is often done through experts' networks. It is a complex task that could evolve over time depending on the natural history of the disease and the evolution of the scientific knowledge. Most rare diseases have genetic causes and recent improvements of sequencing techniques contribute to the discovery of many new diseases every year. Diagnosis coding in the rare disease field requires data from multiple knowledge bases to be aggregated in order to offer the clinician a global information space from possible diagnosis to clinical signs (phenotypes) and known genetic mutations (genotype). Nowadays, the major barrier to the coding activity is the lack of consolidation of such information scattered in different thesauri such as Orphanet, OMIM or HPO. The Linking Open Data for Rare Diseases (LORD) web portal we developed stands as the first attempt to fill this gap by offering an integrated view of 8,400 rare diseases linked to more than 14,500 signs and 3,270 genes. The application provides a browsing feature to navigate through the relationships between diseases, signs and genes, and some Application Programming Interfaces to help its integration in health information systems in routine use.

  13. The information coded in the yeast response elements accounts for most of the topological properties of its transcriptional regulation network.

    Directory of Open Access Journals (Sweden)

    Duygu Balcan

    Full Text Available The regulation of gene expression in a cell relies to a major extent on transcription factors, proteins which recognize and bind the DNA at specific binding sites (response elements within promoter regions associated with each gene. We present an information theoretic approach to modeling transcriptional regulatory networks, in terms of a simple "sequence-matching" rule and the statistics of the occurrence of binding sequences of given specificity in random promoter regions. The crucial biological input is the distribution of the amount of information coded in these cognate response elements and the length distribution of the promoter regions. We provide an analysis of the transcriptional regulatory network of yeast Saccharomyces cerevisiae, which we extract from the available databases, with respect to the degree distributions, clustering coefficient, degree correlations, rich-club coefficient and the k-core structure. We find that these topological features are in remarkable agreement with those predicted by our model, on the basis of the amount of information coded in the interaction between the transcription factors and response elements.

  14. The coding and transmission of information by means of road lighting.

    NARCIS (Netherlands)

    Schreuder, D.A.

    1971-01-01

    A survey of car lighting and overhead lighting systems and principles is given. Some theoretical considerations about driving information are discussed. Recommendations on the desirability of lighting for urban and rural roads (with both light and dense traffic) and for motorways are proposed.

  15. From Winner-Takes-All to Winners-Share-All: Exploiting the Information Capacity in Temporal Codes.

    Science.gov (United States)

    Payvand, Melika; Theogarajan, Luke

    2017-12-08

    In this letter, we have implemented and compared two neural coding algorithms in the networks of spiking neurons: Winner-takes-all (WTA) and winners-share-all (WSA). Winners-Share-All exploits the code space provided by the temporal code by training a different combination of [Formula: see text] out of [Formula: see text] neurons to fire together in response to different patterns, while WTA uses a one-hot-coding to respond to distinguished patterns. Using WSA, the maximum value of [Formula: see text] in order to maximize information capacity using [Formula: see text] output neurons was theoretically determined and utilized. A small proof-of-concept classification problem was applied to a spiking neural network using both algorithms to classify 14 letters of English alphabet with an image size of 15 [Formula: see text] 15 pixels. For both schemes, a modified spike-timing-dependent-plasticity (STDP) learning rule has been used to train the spiking neurons in an unsupervised fashion. The performance and the number of neurons required to perform this computation are compared between the two algorithms. We show that by tolerating a small drop in performance accuracy (84% in WSA versus 91% in WTA), we are able to reduce the number of output neurons by more than a factor of two. We show how the reduction in the number of neurons will increase as the number of patterns increases. The reduction in the number of output neurons would then proportionally reduce the number of training parameters, which requires less memory and hence speeds up the computation, and in the case of neuromorphic implementation on silicon, would take up much less area.
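    The capacity argument behind winners-share-all can be sketched as follows; this is our reconstruction of the counting step, not the authors' code, and the example network size is arbitrary.

```python
from math import comb, log2

def best_k(n_outputs):
    """k (number of co-firing winners) that maximizes the number of codewords C(N, k)."""
    return max(range(1, n_outputs + 1), key=lambda k: comb(n_outputs, k))

n = 10
k = best_k(n)
print(k, comb(n, k), round(log2(comb(n, k)), 2))
# 5 252 7.98  -> ~8 bits from 10 neurons, versus log2(10) ~ 3.3 bits for a one-hot (WTA) code
```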

  16. Uropygial gland volatiles may code for olfactory information about sex, individual, and species in Bengalese finches Lonchura striata

    Directory of Open Access Journals (Sweden)

    Jian-Xu ZHANG et al.

    2009-10-01

    Full Text Available Overshadowed by eye-catching vocal and visual signals, chemical communication has long been overlooked in birds. This study aimed at exploring whether the volatile composition of the uropygial gland secretion (UGS) of birds is associated with information about sex, individual and species. Using dichloromethane extraction and gas chromatography-mass spectrometry (GC-MS), we analyzed the UGS volatiles of domesticated Bengalese finches (Lonchura striata, Estrildidae), also known as white-rumped munias. We characterized 16 volatile molecules from the UGS, including eight n-alkanols, five diesters, an ester, an aldehyde and a fatty acid, and quantified them in terms of GC peak area percentages (relative abundances). Among these compounds, hexadecanol and octadecanol were major components in both sexes. The former was richer in males than females and the latter richer in females than males, suggesting that they might be male and female pheromone candidates, respectively. The high inter-individual variation in the relative abundance of the UGS volatiles implied that they might carry information about individuality. The similarity between GC profiles of the UGS and wing feathers from the same individuals indicates that the birds might preen the secretion onto their feathers to transmit chemical cues. Additionally, by comparing with three sympatric passerine species, i.e., zebra finches Taeniopygia guttata, yellow-browed buntings Emberiza chrysophrys and rooks Corvus frugilegus, we found that the composition of C13-C18 alkanols in the UGS might contain information about species. Our study also shows that quantitative differences (degree) of the same UGS volatiles might be the key for the Bengalese finch to code for information about sex and individuality, whereas both the kind and degree of UGS constituents could be utilized to code for information about species [Current Zoology 55 (5), 2009].

  17. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming.

  18. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
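    As a hedged illustration of what global standards-based identification buys downstream software (not the project's actual tooling), the sketch below parses a human-readable GS1 element string into its GTIN, expiry date and lot number using three common application identifiers.

```python
import re

AI_NAMES = {"01": "gtin", "17": "expiry_yymmdd", "10": "lot"}  # common GS1 application identifiers

def parse_gs1(element_string):
    """Parse e.g. '(01)00312345678906(17)260331(10)ABC123' into a dict."""
    fields = {}
    for ai, value in re.findall(r"\((\d{2})\)([^(]+)", element_string):
        fields[AI_NAMES.get(ai, ai)] = value
    return fields

print(parse_gs1("(01)00312345678906(17)260331(10)ABC123"))
# {'gtin': '00312345678906', 'expiry_yymmdd': '260331', 'lot': 'ABC123'}
```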

  19. Expression Quantitative Trait Loci Information Improves Predictive Modeling of Disease Relevance of Non-Coding Genetic Variation.

    Directory of Open Access Journals (Sweden)

    Damien C Croteau-Chonka

    Full Text Available Disease-associated loci identified through genome-wide association studies (GWAS) frequently localize to non-coding sequence. We and others have demonstrated strong enrichment of such single nucleotide polymorphisms (SNPs) for expression quantitative trait loci (eQTLs), supporting an important role for regulatory genetic variation in complex disease pathogenesis. Herein we describe our initial efforts to develop a predictive model of disease-associated variants leveraging eQTL information. We first catalogued cis-acting eQTLs (SNPs within 100 kb of target gene transcripts) by meta-analyzing four studies of three blood-derived tissues (n = 586). At a false discovery rate < 5%, we mapped eQTLs for 6,535 genes; these were enriched for disease-associated genes (P < 10^-4), particularly those related to immune diseases and metabolic traits. Based on eQTL information and other variant annotations (distance from target gene transcript, minor allele frequency, and chromatin state), we created multivariate logistic regression models to predict SNP membership in reported GWAS. The complete model revealed independent contributions of specific annotations as strong predictors, including evidence for an eQTL (odds ratio (OR) = 1.2-2.0, P < 10^-11) and the chromatin states of active promoters, different classes of strong or weak enhancers, or transcriptionally active regions (OR = 1.5-2.3, P < 10^-11). This complete prediction model including eQTL association information ultimately allowed for better discrimination of SNPs with higher probabilities of GWAS membership (6.3-10.0%, compared to 3.5% for a random SNP) than the other two models excluding eQTL information. This eQTL-based prediction model of disease relevance can help systematically prioritize non-coding GWAS SNPs for further functional characterization.
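    The modelling step described above can be sketched schematically; the code below trains a logistic regression on synthetic SNP annotations with made-up effect sizes, purely to show the shape of such a model, and is not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
is_eqtl = rng.binomial(1, 0.2, n)             # eQTL flag
dist_kb = rng.uniform(0, 100, n)              # distance to target transcript (kb)
maf = rng.uniform(0.01, 0.5, n)               # minor allele frequency
active_chromatin = rng.binomial(1, 0.3, n)    # chromatin-state indicator

# Synthetic "GWAS membership" with an enrichment for eQTLs and active chromatin.
logit = -3.0 + 0.7 * is_eqtl + 0.6 * active_chromatin - 0.01 * dist_kb + 1.0 * maf
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([is_eqtl, dist_kb, maf, active_chromatin])
model = LogisticRegression(max_iter=1000).fit(X, y)
print(np.round(np.exp(model.coef_), 2))       # per-annotation odds ratios
```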

  20. DOC-dynamics in a small headwater catchment as driven by redox fluctuations and hydrological flow paths – are DOC exports mediated by iron reduction/oxidation cycles?

    Directory of Open Access Journals (Sweden)

    K.-H. Knorr

    2013-02-01

    Full Text Available Dissolved organic carbon (DOC) exports from many catchments in Europe and North America are steadily increasing. Several studies have sought to explain this observation. As possible causes, a decrease in acid rain or sulfate deposition, concomitant reductions in ionic strength, and increasing temperatures were identified. DOC often originates from riparian wetlands; but here, despite higher DOC concentrations, ionic strength in pore waters usually exceeds that in surface waters. In the catchment under study, DOC concentrations were synchronous with dissolved iron concentrations in pore and stream water. This study aims at testing the hypothesis that DOC exports are mediated by iron reduction/oxidation cycles. Following the observed hydrographs, δ18O of water and DOC fluorescence, the wetlands were identified as the main source of DOC. Antecedent biogeochemical conditions, i.e., water table levels in the wetlands, influenced the discharge patterns of nitrate, iron and DOC during an event. The correlation of DOC with pH was positive in pore waters but negative in surface waters; it was negative for DOC with sulfate in pore waters, but only weak in surface waters. The positive correlation of DOC with iron, however, held for both pore and surface water. The decline of DOC and iron concentrations in the transition from anoxic wetland pore water to oxic stream water suggests flocculation of DOC with oxidising iron, leading to a drop in pH in the stream during high DOC fluxes. The pore water did not per se differ in pH. There is thus a need to consider DOC mobilisation processes in wetlands more thoroughly when interpreting DOC exports from catchments. The coupling of DOC with iron fluxes suggests that increased DOC exports could, at least in part, be caused by increasing activity of iron reduction, possibly due to increases in temperature, increasing wetness of riparian wetlands, or a shift from sulfate-dominated to iron

  1. Understanding Neural Population Coding: Information Theoretic Insights from the Auditory System

    Directory of Open Access Journals (Sweden)

    Arno Onken

    2014-01-01

    Full Text Available In recent years, our research in computational neuroscience has focused on understanding how populations of neurons encode naturalistic stimuli. In particular, we focused on how populations of neurons use the time domain to encode sensory information. In this focused review, we summarize this recent work from our laboratory. We focus in particular on the mathematical methods that we developed for the quantification of how information is encoded by populations of neurons and on how we used these methods to investigate the encoding of complex naturalistic sounds in auditory cortex. We review how these methods revealed a complementary role of low frequency oscillations and millisecond precise spike patterns in encoding complex sounds and in making these representations robust to imprecise knowledge about the timing of the external stimulus. Further, we discuss challenges in extending this work to understand how large populations of neurons encode sensory information. Overall, this previous work provides analytical tools and conceptual understanding necessary to study the principles of how neural populations reflect sensory inputs and achieve a stable representation despite many uncertainties in the environment.

  2. 0743-0750_ESM.doc

    Indian Academy of Sciences (India)

    A web server for predicting proteins involved in the pluripotent network. Sukhen Das Mandal and Sudipto Saha. Supplementary information. Figure 1. A. The graph is a probability density function curve with respect to the SVM score of the positive training set. The red shaded area is the percentage of positive training data points whose SVM ...

  3. supp22check CIF 1.doc

    Indian Academy of Sciences (India)

    5 ALERT type 1 CIF construction/syntax error, inconsistent or missing data. 7 ALERT type 2 Indicator that the structure model may be wrong or deficient. 1 ALERT type 3 Indicator that the structure quality may be low. 1 ALERT type 4 Improvement, methodology, query or suggestion. 3 ALERT type 5 Informative message, ...

  4. Multiplexing stimulus information through rate and temporal codes in primate somatosensory cortex.

    Directory of Open Access Journals (Sweden)

    Michael A Harvey

    Full Text Available Our ability to perceive and discriminate textures relies on the transduction and processing of complex, high-frequency vibrations elicited in the fingertip as it is scanned across a surface. How naturalistic vibrations, and by extension texture, are encoded in the responses of neurons in primary somatosensory cortex (S1) is unknown. Combining single unit recordings in awake macaques and perceptual judgments obtained from human subjects, we show that vibratory amplitude is encoded in the strength of the response evoked in S1 neurons. In contrast, the frequency composition of the vibrations, up to 800 Hz, is not encoded in neuronal firing rates, but rather in the phase-locked responses of a subpopulation of neurons. Moreover, analysis of perceptual judgments suggests that spike timing not only conveys stimulus information but also shapes tactile perception. We conclude that information about the amplitude and frequency of natural vibrations is multiplexed at different time scales in S1, and encoded in the rate and temporal patterning of the response, respectively.
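
    The distinction drawn in this record between a rate code and a phase-locked temporal code can be illustrated with two standard summary statistics, firing rate and vector strength; the sketch below uses toy spike times and is a generic illustration, not the analysis used in the paper.

      # Generic illustration: rate code vs. phase-locked temporal code.
      import numpy as np

      def firing_rate(spike_times, duration_s):
          """Spikes per second over the stimulus window (rate code)."""
          return len(spike_times) / duration_s

      def vector_strength(spike_times, stim_freq_hz):
          """Phase locking to the stimulus cycle: 1 = perfect locking, ~0 = none."""
          phases = 2 * np.pi * stim_freq_hz * np.asarray(spike_times)
          return np.abs(np.mean(np.exp(1j * phases)))

      # Toy example: one spike per cycle of a 200 Hz vibration, jittered by ~0.1 ms.
      rng = np.random.default_rng(1)
      duration = 1.0
      stim_freq = 200.0
      n_spikes = int(duration * stim_freq)
      locked = np.arange(n_spikes) / stim_freq + rng.normal(0.0, 1e-4, n_spikes)
      print(firing_rate(locked, duration))       # ~200 spikes/s
      print(vector_strength(locked, stim_freq))  # close to 1 (strong phase locking)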

  5. Color-coded depth information in volume-rendered magnetic resonance angiography

    Science.gov (United States)

    Smedby, Orjan; Edsborg, Karin; Henriksson, John

    2004-05-01

    Magnetic Resonance Angiography (MRA) and Computed Tomography Angiography (CTA) data are usually presented using Maximum Intensity Projection (MIP) or Volume Rendering Technique (VRT), but these often fail to demonstrate a stenosis if the projection angle is not suitably chosen. In order to make vascular stenoses visible in projection images independent of the choice of viewing angle, a method is proposed to supplement these images with colors representing the local caliber of the vessel. After preprocessing the volume image with a median filter, segmentation is performed by thresholding, and a Euclidean distance transform is applied. The distance to the background from each voxel in the vessel is mapped to a color. These colors can either be rendered directly using MIP or be presented together with opacity information based on the original image using VRT. The method was tested in a synthetic dataset containing a cylindrical vessel with stenoses in varying angles. The results suggest that the visibility of stenoses is enhanced by the color information. In clinical feasibility experiments, the technique was applied to clinical MRA data. The results are encouraging and indicate that the technique can be used with clinical images.
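
    The processing chain described above (median filtering, threshold segmentation, Euclidean distance transform, and mapping of the local distance to a color) can be sketched with standard scipy routines; the threshold, filter size, colormap, and the synthetic test volume below are placeholders, not the values used by the authors.

      # Sketch of the described pipeline on a synthetic volume (parameters are placeholders).
      import numpy as np
      from scipy import ndimage
      from matplotlib import cm

      def caliber_colored_mip(volume, threshold, median_size=3):
          smoothed = ndimage.median_filter(volume, size=median_size)
          vessel_mask = smoothed > threshold
          # Distance from each vessel voxel to the background approximates the local radius.
          dist = ndimage.distance_transform_edt(vessel_mask)
          # Maximum-intensity projection of the distance map along one viewing axis.
          mip = dist.max(axis=0)
          # Map the projected caliber to RGBA colors.
          norm = mip / mip.max() if mip.max() > 0 else mip
          return cm.viridis(norm)

      # Synthetic "vessel": a bright cylinder along the x-axis with a narrowing in the middle.
      vol = np.zeros((64, 64, 64))
      zz, yy = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
      for x in range(64):
          radius = 6 if not 28 <= x <= 36 else 3      # stenosis between x = 28 and 36
          vol[:, :, x] = np.where((zz - 32) ** 2 + (yy - 32) ** 2 <= radius ** 2, 100, 0)
      rgba = caliber_colored_mip(vol, threshold=50)
      print(rgba.shape)   # (64, 64, 4): a color-coded projection image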

  6. Medical data analysis and coding using natural language processing techniques in order to derive structured data information.

    Science.gov (United States)

    Nikiforou, Aggeliki; Ponirou, Paraskevi; Diomidous, Marianna

    2013-01-01

    Medical data are, most of the time, very complex in both form and content. One of the greatest challenges for the IT community in healthcare is to enable the full utilization of these data by information systems. This variety, combined with the fact that data usually derive from diverse systems, is a great obstacle to this task. The result is that data stored in medical information systems usually do not accurately represent reality. In order to eliminate the discrepancy between stored and real data, specialized applications that facilitate and accelerate data import into information systems must be developed. This is the goal of Natural Language Processing, the scientific field that combines computer science and linguistics. As a result, NLP systems use applications for the coding and standardization of information, known as controlled medical vocabularies. The result of these processes is data that can be used by various technologies, such as clinical data warehouses and decision support systems, the functionality of which is fully dependent on the completeness and accuracy of the data on which their analysis is based.

  7. Multi-scale coding of genomic information: From DNA sequence to genome structure and function

    Energy Technology Data Exchange (ETDEWEB)

    Arneodo, Alain, E-mail: alain.arneodo@ens-lyon.f [Universite de Lyon, F-69000 Lyon (France); Laboratoire Joliot-Curie and Laboratoire de Physique, CNRS, Ecole Normale Superieure de Lyon, F-69007 Lyon (France); Vaillant, Cedric, E-mail: cedric.vaillant@ens-lyon.f [Universite de Lyon, F-69000 Lyon (France); Laboratoire Joliot-Curie and Laboratoire de Physique, CNRS, Ecole Normale Superieure de Lyon, F-69007 Lyon (France); Audit, Benjamin, E-mail: benjamin.audit@ens-lyon.f [Universite de Lyon, F-69000 Lyon (France); Laboratoire Joliot-Curie and Laboratoire de Physique, CNRS, Ecole Normale Superieure de Lyon, F-69007 Lyon (France); Argoul, Francoise, E-mail: francoise.argoul@ens-lyon.f [Universite de Lyon, F-69000 Lyon (France); Laboratoire Joliot-Curie and Laboratoire de Physique, CNRS, Ecole Normale Superieure de Lyon, F-69007 Lyon (France); D' Aubenton-Carafa, Yves, E-mail: daubenton@cgm.cnrs-gif.f [Centre de Genetique Moleculaire, CNRS, Allee de la Terrasse, 91198 Gif-sur-Yvette (France); Thermes, Claude, E-mail: claude.thermes@cgm.cnrs-gif.f [Centre de Genetique Moleculaire, CNRS, Allee de la Terrasse, 91198 Gif-sur-Yvette (France)

    2011-02-15

    Understanding how chromatin is spatially and dynamically organized in the nucleus of eukaryotic cells and how this affects genome functions is one of the main challenges of cell biology. Since the different orders of packaging in the hierarchical organization of DNA condition the accessibility of DNA sequence elements to trans-acting factors that control the transcription and replication processes, there is actually a wealth of structural and dynamical information to learn in the primary DNA sequence. In this review, we show that when using concepts, methodologies, numerical and experimental techniques coming from statistical mechanics and nonlinear physics combined with wavelet-based multi-scale signal processing, we are able to decipher the multi-scale sequence encoding of chromatin condensation-decondensation mechanisms that play a fundamental role in regulating many molecular processes involved in nuclear functions.

  8. On the Influence of the Extrinsic Information Scaling Coefficient on the Performance of Single and Double Binary Turbo Codes

    Directory of Open Access Journals (Sweden)

    BALTA, H.

    2013-05-01

    Full Text Available This paper presents a study on the influence of the extrinsic information scaling coefficient value (eic) on the bit and frame error rate (BER/FER) for single and double binary turbo codes (S/DBTC) decoded with maximum a posteriori (MAP) and maximum logarithmic MAP (MaxLogMAP) component algorithms. Firstly, we estimate the distance spectrum of the code with the so-called error impulse method (EIM), and we analyze its dependence as well as the dependence of the asymptotic FER on eic. Secondly, we estimate the actual FER using Monte Carlo simulations with eic as a parameter. The comparison of the FER(eic) curves obtained by the two methods allows us, on the one hand, to assess the quality of the decoding algorithms, and on the other hand, to estimate the very low BER/FER performance of TCs, where the Monte Carlo method is practically unusable. The results presented also provide a practical guide for choosing the optimal value of the scaling factor eic. We note that the performance of the MAP algorithm can also be improved by using eic < 1.

  9. Signalign: An Ontology of DNA as Signal for Comparative Gene Structure Prediction Using Information-Coding-and-Processing Techniques.

    Science.gov (United States)

    Yu, Ning; Guo, Xuan; Gu, Feng; Pan, Yi

    2016-03-01

    Conventional character-analysis-based techniques in genome analysis manifest three main shortcomings: inefficiency, inflexibility, and incompatibility. In our previous research, a general framework, called DNA As X, was proposed for character-analysis-free techniques to overcome these shortcomings, where X is an intermediate representation, such as digit, code, signal, vector, tree, graph network, and so on. In this paper, we further implement an ontology of DNA As Signal, by designing a tool named Signalign for comparative gene structure analysis, in which DNA sequences are converted into signal series, processed by a modified method of dynamic time warping and measured by signal-to-noise ratio (SNR). The ontology of DNA As Signal integrates the principles and concepts of other disciplines, including information coding theory and signal processing, into sequence analysis and processing. Compared with conventional character-analysis-based methods, Signalign not only achieves equivalent or superior performance, but also enriches the tools and the knowledge library of computational biology by extending the domain from character/string to diverse areas. The evaluation results validate the success of the character-analysis-free technique for improved performance in comparative gene structure prediction.
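
    As background to the "DNA as signal" idea, the toy sketch below maps nucleotides to an arbitrary numeric series and compares two sequences with a textbook dynamic time warping distance; the numeric mapping and the distance are placeholders and do not reproduce Signalign's actual conversion, its modified warping method, or its SNR measure.

      # Toy illustration of the DNA-as-signal idea (not Signalign itself).
      import numpy as np

      MAPPING = {"A": 0.0, "C": 1.0, "G": 2.0, "T": 3.0}   # arbitrary placeholder mapping

      def dna_to_signal(seq):
          return np.array([MAPPING[base] for base in seq.upper()])

      def dtw_distance(x, y):
          """Basic dynamic time warping distance between two numeric series."""
          n, m = len(x), len(y)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(x[i - 1] - y[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      sig1 = dna_to_signal("ATGCCGTAAT")
      sig2 = dna_to_signal("ATGCGTAT")     # similar fragment with small indels
      print(dtw_distance(sig1, sig2))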

  10. Phylum-Level Conservation of Regulatory Information in Nematodes despite Extensive Non-coding Sequence Divergence.

    Directory of Open Access Journals (Sweden)

    Kacy L Gordon

    2015-05-01

    Full Text Available Gene regulatory information guides development and shapes the course of evolution. To test conservation of gene regulation within the phylum Nematoda, we compared the functions of putative cis-regulatory sequences of four sets of orthologs (unc-47, unc-25, mec-3 and elt-2) from distantly-related nematode species. These species, Caenorhabditis elegans, its congeneric C. briggsae, and three parasitic species Meloidogyne hapla, Brugia malayi, and Trichinella spiralis, represent four of the five major clades in the phylum Nematoda. Despite the great phylogenetic distances sampled and the extensive sequence divergence of nematode genomes, all but one of the regulatory elements we tested are able to drive at least a subset of the expected gene expression patterns. We show that functionally conserved cis-regulatory elements have no more extended sequence similarity to their C. elegans orthologs than would be expected by chance, but they do harbor motifs that are important for proper expression of the C. elegans genes. These motifs are too short to be distinguished from the background level of sequence similarity, and while identical in sequence they are not conserved in orientation or position. Functional tests reveal that some of these motifs contribute to proper expression. Our results suggest that conserved regulatory circuitry can persist despite considerable turnover within cis elements.

  11. Cross-layer combining of information-guided transmission with network coding relaying for multiuser cognitive radio systems

    KAUST Repository

    Yang, Yuli

    2013-02-01

    For a cognitive radio relaying network, we propose a cross-layer design by combining information-guided transmission at the physical layer and network coding at the network layer. With this design, a common relay is exploited to help the communications between multiple secondary source-destination pairs, which allows for a more efficient use of the radio resources, and moreover, generates less interference to primary licensees in the network. Considering the spectrum-sharing constraints on the relay and secondary sources, the achievable data rate of the proposed cross-layer design is derived and evaluated. Numerical results on average capacity and uniform capacity in the network under study substantiate the efficiency of our proposed design. © 2013 IEEE.

  12. panMetaDocs and DataSync - providing a convenient way to share and publish research data

    Science.gov (United States)

    Ulbricht, D.; Klump, J. F.

    2013-12-01

    any XML-based metadata schema. To reduce manual entries of metadata to a minimum and make use of contextual information in a project setting, metadata fields can be populated with static or dynamic content. Access rights can be defined to control visibility and access to stored objects. Notifications about recently updated datasets are available by RSS and e-mail, and the entire inventory can be harvested via OAI-PMH. panMetaDocs is optimized to be harvested by panFMP [4]. panMetaDocs is able to mint dataset DOIs through DataCite and uses eSciDoc's REST API to transfer eSciDoc objects from a non-public 'pending' status to the published status 'released', which makes data and metadata of the published object available worldwide through the internet. The application scenario presented here shows the adoption of open source applications for data sharing and publication of data. An eSciDoc repository is used as storage for data and metadata. DataSync serves as a file ingester and distributor, whereas panMetaDocs' main function is to annotate the dataset files with metadata to make them ready for publication and sharing with your own team, or with the scientific community.
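
    The OAI-PMH harvesting mentioned above follows the standard protocol flow (a ListRecords request, then follow-up requests carrying the resumptionToken); the sketch below shows that generic flow with a hypothetical endpoint URL and is not code from panMetaDocs or panFMP.

      # Generic OAI-PMH harvesting sketch (endpoint URL is hypothetical; the verbs
      # and parameters are standard OAI-PMH, not specific to panMetaDocs).
      import urllib.parse
      import urllib.request
      import xml.etree.ElementTree as ET

      OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

      def harvest(base_url, metadata_prefix="oai_dc"):
          params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
          while True:
              url = base_url + "?" + urllib.parse.urlencode(params)
              with urllib.request.urlopen(url) as response:
                  tree = ET.fromstring(response.read())
              for record in tree.iter(OAI_NS + "record"):
                  yield record
              token = tree.find(f".//{OAI_NS}resumptionToken")
              if token is None or not (token.text or "").strip():
                  break
              params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

      # Example (hypothetical endpoint):
      # for rec in harvest("https://example.org/oai"):
      #     print(rec.find(f".//{OAI_NS}identifier").text)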

  13. QuickDOC: an interlibrary loan department in a microcomputer.

    Science.gov (United States)

    Klein, P; Hewison, N S

    1991-01-01

    Interlibrary loan (ILL) is a critical service in most libraries and one that may consume much staff time. QuickDOC, a software program written by a medical librarian, has been designed to expedite and organize the process of requesting loans, keeping records, and preparing reports on ILL activity. The software communicates with DOCLINE, the automated ILL and referral system of the National Library of Medicine (NLM), and simplifies the management of all borrowing and lending, regardless of how requests are transmitted. QuickDOC's creator is very responsive to suggestions from users and continues to enhance the capabilities of this excellent software package.

  14. Response of DOC in Acid-Sensitive Maine Lakes to Decreasing Sulfur Deposition (1993 - 2009)

    Science.gov (United States)

    Oelsner, G. P.; Sanclements, M.; McKnight, D. M.; Stoddard, J. L.

    2010-12-01

    In response to the Clean Air Act Amendments of 1990, sulfur deposition has decreased across the northeastern United States. As a result, sulfate concentrations in lakes and streams have also decreased and many surface waters have become less acidic. Over the same time period, there has been a concurrent increase in dissolved organic carbon (DOC) concentrations in many lakes and streams which has been difficult to interpret. To assess the biogeochemical processes driving increasing DOC concentrations we analyzed archived samples from 9 acid-sensitive lakes in Maine collected between 1993 and 2009 using UV-Vis and fluorescence spectroscopy. The fluorescence index (FI) was calculated for all samples. The FI represents the ratio of the emission intensity at 450 nm to 550 nm at an excitation wavelength of 370 nm and provides information regarding the source of dissolved organic matter (DOM). This index has a value of approximately 1.9 for microbially derived fulvic acids and a value of approximately 1.4 for terrestrially (higher-plant) derived fulvic acids. All four lakes with increasing DOC trends had concomitant decreases in the FI. Two of five lakes with no significant DOC trend also demonstrated no trend in FI values over time, while three lakes revealed a decrease in FI values. To confirm that the FI measured in whole water was primarily reflective of fulvic acids (FA), XAD-resin was used to isolate FA from a subset of samples. Analysis of the FA indicates that the FI values for the humic substances are slightly higher, yet well correlated with whole water samples. This suggests that despite prolonged storage in plastic, the FI trends are meaningful. The FI trends suggest a terrestrial source for the increasing DOC and may be driven by increased DOM production from soils experiencing decreased acid loading. Decreases in sulfate deposition can increase soil pH and soil organic matter solubility, as well as decrease the ionic strength of the soil solution, and
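
    The fluorescence index used in this study is a simple ratio, the emission intensity at 450 nm divided by that at 550 nm at 370 nm excitation; a minimal calculation of that ratio from an excitation-emission matrix is sketched below, assuming a hypothetical array layout with labelled wavelength axes.

      # Fluorescence index (FI) as defined in the record: em 450 nm / em 550 nm at ex 370 nm.
      # The EEM layout (rows = excitation, columns = emission) is an assumption.
      import numpy as np

      def fluorescence_index(eem, excitation_nm, emission_nm):
          """eem[i, j] = intensity at excitation_nm[i], emission_nm[j]."""
          ex = np.argmin(np.abs(np.asarray(excitation_nm) - 370))
          em450 = np.argmin(np.abs(np.asarray(emission_nm) - 450))
          em550 = np.argmin(np.abs(np.asarray(emission_nm) - 550))
          return eem[ex, em450] / eem[ex, em550]

      # Synthetic usage; FI near 1.4 suggests terrestrial (higher-plant) derived DOM,
      # FI near 1.9 suggests microbially derived DOM.
      ex_nm = np.arange(240, 451, 10)
      em_nm = np.arange(300, 601, 2)
      eem = np.random.default_rng(2).random((len(ex_nm), len(em_nm))) + 1.0
      print(fluorescence_index(eem, ex_nm, em_nm))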

  15. DocML: A Digital Library of University Data.

    Science.gov (United States)

    Papadakis, Ioannis; Karakoidas, Vassileios; Chrissikopoulos, Vassileios

    2002-01-01

    Describes DocML, a Web-based digital library of university data that is used to build a system capable of preserving and managing student assignments. Topics include requirements for a digital library of university data; metadata and XML; three-tier architecture; user interface; searching; browsing; content delivery; and administrative issues.…

  16. Solar ultraviolet photodegradation of DOC may stimulate freshwater food webs

    NARCIS (Netherlands)

    Lange, de H.J.; Morris, D.P.; Williamson, C.E.

    2003-01-01

    The UV component in solar radiation increased the availability of DOC for bacterial growth, and led to an increase in bacterial and bacterivore abundance in laboratory plankton cultures. UV radiation may thus stimulate ecosystem productivity by increasing dissolved organic carbon lability and

  17. Dissolved organic carbon (DOC) and coliform bacteria in ...

    African Journals Online (AJOL)

    September-November) of the year 1994 to find out if these untreated underground waters contain dissolved organic carbon (DOC) and coliform bacteria. Out of the 40 sites sampled during both the wet and dry seasons, 11 contained coliform ...

  18. ANALYSIS ON THE MARKETING STRATEGY OF DOC BROILER AT PT X OF BALI UNIT (ANALISIS STRATEGI PEMASARAN DOC PEDAGING PADA PT X UNIT BALI)

    Directory of Open Access Journals (Sweden)

    BUDI RAHAYU TANAMA PUTRI

    2012-09-01

    Full Text Available Abstract (translated from Indonesian): The research was conducted at a company engaged in the marketing of broiler DOC (day old chicks), referred to in this paper as PT X. The problem faced by the company is that it has not yet been able to formulate an appropriate marketing strategy because of a lack of information on the trend and volume of DOC demand in Bali. This study aims to analyse the internal and external conditions faced by the company and to formulate an appropriate plan and strategy for the company's marketing. The study was carried out at the PT X Bali unit in Denpasar, with data collected from farmer respondents and animal husbandry experts in Bali using a survey method. The research design was descriptive. The data obtained were processed using Internal Factor Evaluation - External Factor Evaluation (IFE-EFE) analysis and the Internal-External (IE) matrix. The results show that PT X Bali Unit is located in cell IV, which belongs to the first group, the grow-and-build strategy. The recommended strategy formulation is: (1) conduct market research; (2) carry out sales promotion; and (3) re-analyse the partnership system already in place so that it can absorb more DOC. ANALYSIS ON THE MARKETING STRATEGY OF DOC BROILER AT PT X OF BALI UNIT Abstract: PT X of Bali Unit is one of the companies marketing broiler DOC (Day Old Chicks). In a highly competitive market, the company has to apply the right strategy to increase its market share. The aims of this study are to analyze the internal and external factors faced by the company and to set up and recommend an appropriate plan and strategy for the marketing of DOC at PT X in Bali. The study was conducted at PT X of Bali Unit, and data were collected from broiler chicken breeders and animal husbandry scientists in Bali. The internal and external factor analysis and IE Matrix were used in this study. Based on the internal and

  19. Assessing the accuracy of opioid overdose and poisoning codes in diagnostic information from electronic health records, claims data, and death records.

    Science.gov (United States)

    Green, Carla A; Perrin, Nancy A; Janoff, Shannon L; Campbell, Cynthia I; Chilcoat, Howard D; Coplan, Paul M

    2017-05-01

    The purpose of this study is to assess positive predictive value (PPV), relative to medical chart review, of International Classification of Diseases (ICD)-9/10 diagnostic codes for identifying opioid overdoses and poisonings. Data were obtained from Kaiser Permanente Northwest and Northern California. Diagnostic data from electronic health records, submitted claims, and state death records from Oregon, Washington, and California were linked. Individual opioid-related poisoning codes (e.g., 965.xx and X42), and adverse effects of opioids codes (e.g., E935.xx) combined with diagnoses possibly indicative of overdoses (e.g., respiratory depression), were evaluated by comparison with chart audits. Opioid adverse effects codes had low PPV to detect overdoses (13.4%) as assessed in 127 charts and were not pursued. Instead, opioid poisoning codes were assessed in 2100 individuals who had those codes present in electronic health records in the period between the years 2008 and 2012. Of these, 10/2100 had no available information and 241/2100 were excluded potentially as anesthesia-related. Among the 1849 remaining individuals with opioid poisoning codes, 1495 events were accurately identified as opioid overdoses; 69 were miscodes or misidentified, and 285 were opioid adverse effects, not overdoses. Thus, PPV was 81%. Opioid adverse effects or overdoses were accurately identified in 1780 of 1849 events (96.3%). Opioid poisoning codes have a predictive value of 81% to identify opioid overdoses, suggesting ICD opioid poisoning codes can be used to monitor overdose rates and evaluate interventions to reduce overdose. Further research to assess sensitivity, specificity, and negative predictive value is ongoing. Copyright © 2017 John Wiley & Sons, Ltd.
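
    The positive predictive value reported above is straightforward arithmetic on the chart-review counts given in the abstract; the small calculation below reproduces the 81% and 96.3% figures from those counts.

      # PPV arithmetic using the counts reported in the abstract (1849 reviewed events).
      true_overdoses = 1495
      miscodes = 69
      adverse_effects_not_overdose = 285
      total_reviewed = true_overdoses + miscodes + adverse_effects_not_overdose   # 1849

      ppv_overdose = true_overdoses / total_reviewed
      ppv_overdose_or_adverse = (true_overdoses + adverse_effects_not_overdose) / total_reviewed
      print(f"PPV for overdose: {ppv_overdose:.1%}")                         # ~80.9% (reported as 81%)
      print(f"PPV for overdose or adverse effect: {ppv_overdose_or_adverse:.1%}")   # ~96.3%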

  20. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    coding techniques are equally applicable to any voice signal whether or not it carries any intelligible information, as the term speech implies. Other terms that are commonly used are speech compression and voice compression since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or equivalently the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice shall be used interchangeably.

  1. COLWRIT – Collaborative Online Writing in Google Docs

    DEFF Research Database (Denmark)

    Andreasen, Lars Birch; Winther, Frederikke; Hanghøj, Thorkild

    The poster presents preliminary hypotheses and findings of an on-going research project at Aalborg University, Denmark, which explores university students' uses of collaborative writing tools like Google Docs when doing collaborative project work. The research project has a special focus on the various effects on the collaboration process of students' various usage of the commenting functions of online writing tools. The poster received the Best Poster Award at the conference.

  2. Leaching of DOC, DN, and inorganic constituents from scrap tires.

    Science.gov (United States)

    Selbes, Meric; Yilmaz, Ozge; Khan, Abdul A; Karanfil, Tanju

    2015-11-01

    One concern for the recycling and reuse of scrap tires is the leaching of tire constituents (organic and inorganic) with time, and their subsequent potential harmful impacts in the environment. The main objective of this study was to examine the leaching of dissolved organic carbon (DOC), dissolved nitrogen (DN), and selected inorganic constituents from scrap tires. Different sizes of tire chips and crumb rubber were exposed to leaching solutions with pH's ranging from 3.0 to 10.0 for 28 days. The leaching of DOC and DN was found to be higher for smaller size tire chips; however, the leaching of inorganic constituents was independent of the size. In general, basic pH conditions increased the leaching of DOC and DN, whereas acidic pH conditions led to elevated concentrations of metals. Leaching was minimal around neutral pH values for all the monitored parameters. Analysis of the leaching rates showed that components associated with the rubbery portion of the tires (DOC, DN, zinc, calcium, magnesium, etc.) exhibited an initial rapid release followed by a slow release. On the other hand, a constant rate of leaching was observed for iron and manganese, which are attributed to the metal wires present inside the tires. Although the total amounts that leached varied, the observed leaching rates were similar for all tire chip sizes and leaching solutions. Operation under neutral pH conditions, use of larger size tire chips, prewashing of tires, and removal of metal wires prior to application will reduce the impact of tire recycling and reuse. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. An open archive of the Public Administration: "SSPAL.DOC" (Un archivio aperto della Pubblica Amministrazione: "SSPAL.DOC")

    OpenAIRE

    Marchitelli, Andrea; Antonelli, Lucia

    2008-01-01

    (Translated from Italian) After a brief overview of the online management of public administration documents, the article first presents SSPAL.DOC, the institutional repository of the Scuola Superiore della Pubblica Amministrazione Locale, which contains teaching materials, lecture notes, dossiers, studies, research and other documentation produced by the School. It then presents the structure, contents and modes of use of a repository that is part of a national context with a ...

  4. A review of the empirical evidence of the value of structuring and coding of clinical information within electronic health records for direct patient care

    Directory of Open Access Journals (Sweden)

    Dipak Kalra

    2013-05-01

    Full Text Available Background The case has historically been presented that structured and/or coded electronic health records (EHRs) benefit direct patient care, but the evidence base for this is not well documented. Methods We searched for evidence of direct patient care value from the use of structured and/or coded information within EHRs. We interrogated nine international databases from 1990 to 2011. Value was defined using the Institute of Medicine's six areas for improvement for healthcare systems: effectiveness, safety, patient-centredness, timeliness, efficiency and equitability. We included studies satisfying the Cochrane Effective Practice and Organisation of Care (EPOC) group criteria. Results Of 5016 potentially eligible papers, 13 studies satisfied our criteria: 10 focused on effectiveness, with eight demonstrating potential for improved proxy and actual clinical outcomes if a structured and/or coded EHR was combined with alerting or advisory systems in a focused clinical domain. Three studies demonstrated improvement in safety outcomes. No studies were found reporting value in relation to patient-centredness, timeliness, efficiency or equitability. Conclusions We conclude that, to date, there has been patchy effort to investigate empirically the value from structuring and coding EHRs for direct patient care. Future investments in structuring and coding of EHRs should be informed by robust evidence as to the clinical scenarios in which patient care benefits may be realised.

  5. Occurrence and simulation of trihalomethanes in swimming pool water: A simple prediction method based on DOC and mass balance.

    Science.gov (United States)

    Peng, Di; Saravia, Florencia; Abbt-Braun, Gudrun; Horn, Harald

    2016-01-01

    Trihalomethanes (THM) are the most typical disinfection by-products (DBPs) found in public swimming pool water. DBPs are produced when organic and inorganic matter in water reacts with chemical disinfectants. The irregular contribution of substances from pool visitors and long contact time with disinfectant make the forecast of THM in pool water a challenge. In this work occurrence of THM in a public indoor swimming pool was investigated and correlated with the dissolved organic carbon (DOC). Daily sampling of pool water for 26 days showed a positive correlation between DOC and THM with a time delay of about two days, while THM and DOC didn't directly correlate with the number of visitors. Based on the results and mass-balance in the pool water, a simple simulation model for estimating THM concentration in indoor swimming pool water was proposed. Formation of THM from DOC, volatilization into air and elimination by pool water treatment were included in the simulation. Formation ratio of THM gained from laboratory analysis using native pool water and information from field study in an indoor swimming pool reduced the uncertainty of the simulation. The simulation was validated by measurements in the swimming pool for 50 days. The simulated results were in good compliance with measured results. This work provides a useful and simple method for predicting THM concentration and its accumulation trend for long term in indoor swimming pool water. Copyright © 2015 Elsevier Ltd. All rights reserved.
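
    A mass balance of the kind described (formation of THM from DOC, volatilization into air, and elimination by the treatment loop) can be written as a simple first-order model; the sketch below is an illustration with placeholder rate constants and inputs, not the authors' calibrated simulation.

      # Illustrative THM mass balance (all rate constants and inputs are placeholders):
      # d(THM)/dt = k_form*DOC - (k_vol + k_treat)*THM, stepped daily.
      import numpy as np

      def simulate_thm(days, doc_ug_per_l, k_form=0.002, k_vol=0.05, k_treat=0.10, dt=1.0):
          thm = np.zeros(days)
          for t in range(1, days):
              formation = k_form * doc_ug_per_l[t - 1]
              losses = (k_vol + k_treat) * thm[t - 1]
              thm[t] = thm[t - 1] + dt * (formation - losses)
          return thm

      doc = np.full(50, 3000.0)          # hypothetical constant DOC of 3 mg/L (3000 ug/L)
      print(simulate_thm(50, doc)[-1])   # approaches k_form*DOC/(k_vol+k_treat) = 40 ug/L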

  6. Assessing contribution of DOC from sediments to a drinking-water reservoir using optical profiling

    Science.gov (United States)

    Downing, Bryan D.; Bergamaschi, Brian A.; Evans, David G.; Boss, Emmanuel

    2008-01-01

    Understanding the sources of dissolved organic carbon (DOC) in drinking-water reservoirs is an important management issue because DOC may form disinfection by-products, interfere with disinfection, or increase treatment costs. DOC may be derived from a host of sources-algal production of DOC in the reservoir, marginal production of DOC from mucks and vascular plants at the margins, and sediments in the reservoir. The purpose of this study was to assess if release of DOC from reservoir sediments containing ferric chloride coagulant was a significant source of DOC to the reservoir. We examined the source-specific contributions of DOC using a profiling system to measure the in situ distribution of optical properties of absorption and fluorescence at various locations in the reservoir. Vertical optical profiles were coupled with discrete water samples measured in the laboratory for DOC concentration and optical properties: absorption spectra and excitation emission matrix spectra (EEMs). Modeling the in situ optical data permitted estimation of the bulk DOC profile in the reservoir as well as separation into source-specific contributions. Analysis of the source-specific profiles and their associated optical characteristics indicated that the sedimentary source of DOC to the reservoir is significant and that this DOC is labile in the reservoir. We conclude that optical profiling is a useful technique for understanding complex biogeochemical processes in a reservoir.

  7. Google Docs in an Out-of-Class Collaborative Writing Activity

    Science.gov (United States)

    Zhou, Wenyi; Simpson, Elizabeth; Domizi, Denise Pinette

    2012-01-01

    Google Docs, an online word processing application, is a promising tool for collaborative learning. However, many college instructors and students lack knowledge to effectively use Google Docs to enhance teaching and learning. Goals of this study include (1) assessing the effectiveness of using Google Docs in an out-of-class collaborative writing…

  8. Decoding of Cyclic Codes,

    Science.gov (United States)

    (*INFORMATION THEORY, *DECODING), (*DATA TRANSMISSION SYSTEMS, DECODING), STATISTICAL ANALYSIS, STOCHASTIC PROCESSES, CODING, WHITE NOISE, NUMBER THEORY, CORRECTIONS, BINARY ARITHMETIC, SHIFT REGISTERS, CONTROL SYSTEMS, USSR

  9. The Information Coding in the Time Structure of the Object of a Laser Pulse in an Optical Echo Processor

    Directory of Open Access Journals (Sweden)

    L. A. Nefediev

    2012-01-01

    Full Text Available The encoding of information in the time intervals of an echelon of laser pulses of an object pulse in an optical echo processor is considered. Measures of information are introduced to describe the transformation of classical information into quantum information. It is shown that, in describing the transformation of information into quantum information, the most appropriate measure is a measure of quantum information based on algorithmic information theory.

  10. COLWRIT – Collaborative Online Writing in Google Docs

    DEFF Research Database (Denmark)

    Andreasen, Lars Birch; Winther, Frederikke; Hanghøj, Thorkild

    2014-01-01

    Various online collaborative writing tools have emerged giving students new opportunities when co-producing texts. The aim of this work-in-progress paper is to present the preliminary hypotheses and findings of an on-going research project at Aalborg University, Denmark, which explores university students' uses of collaborative writing tools like Google Docs when doing collaborative project work. The research project has a special focus on the various effects on the collaboration process of students' various usage of the commenting functions of online writing tools.

  11. Doc of prophage P1 is inhibited by its antitoxin partner Phd through fold complementation

    DEFF Research Database (Denmark)

    Garcia-Pino, Abel; Christensen-Dalsgaard, Mikkel; Wyns, Lode

    2008-01-01

    Prokaryotic toxin-antitoxin modules are involved in major physiological events set in motion under stress conditions. The toxin Doc (death on curing) from the phd/doc module on phage P1 hosts the C-terminal domain of its antitoxin partner Phd (prevents host death) through fold complementation...... evolutionary origin for the phd/doc operon. Doc induces growth arrest of Escherichia coli cells in a reversible manner, by targeting the protein synthesis machinery. Moreover, Doc activates the endogenous E. coli RelE mRNA interferase but does not require this or any other known chromosomal toxin...

  12. CAR ADVERTISING: Impact of bills DOC 1909/001 & DOC 1910/001 (PUBLICITE AUTOMOBILE: Impact des propositions de loi DOC 1909/001 & DOC 1910/001)

    OpenAIRE

    Ozer, Pierre

    2009-01-01

    (Translated from French) This expert report, produced at the request of the Committee on Public Health, the Environment and Social Renewal of the Belgian Chamber of Representatives and presented on 26 June 2009, analyses car advertising through [1] its indirect impact on the CO2 emissions of the transport sector; [2] its flagrant illegality with regard to European Directive 1999/94/EC concerning "the availability of information on fuel consumption and CO2 emissions ...

  13. The Effect of Formal Policies and Informal Social Learning on Perceptions of Corporate Ethics: Actions Speak Louder than Codes.

    Science.gov (United States)

    Kronzon, Shirit

    2002-01-01

    Discusses unethical business conduct and corporate crime, focusing on workers who bend rules to achieve satisfactory performance and organizational goals. Describes research that investigated how two cues in an organizational setting, one legal (codes of conduct) and one social (company responses to ethical transgressions), affect how individuals…

  14. Informed Consent and Clinical Research Involving Children and Adolescents: Implications of the Revised APA Ethics Code and HIPAA

    Science.gov (United States)

    Fisher, Celia B.

    2004-01-01

    In 2003, 2 new sets of rules and regulations affecting the conduct of clinical research involving children and adolescents went into effect: the revised American Psychological Association's (APA) Ethical Principles of Psychologists and Code of Conduct (APA, 2002; effective June 1, 2003) and the Privacy Rule (45 CFR Part 160 and A and E of Part…

  15. PanMetaDocs - A tool for collecting and managing the long tail of "small science data"

    Science.gov (United States)

    Klump, J.; Ulbricht, D.

    2011-12-01

    In the early days of thinking about cyberinfrastructure the focus was on "big science data". Today, the challenge is no longer to store several terabytes of data, but to manage data objects in a way that facilitates their re-use. Key to re-use by a user as a data consumer is proper documentation of the data. Also, data consumers need discovery metadata to find the data they need and descriptive metadata to be able to use the data they retrieved. Thus, data documentation faces the challenge of describing these objects extensively and completely while keeping the items easily accessible at a sustainable cost. However, data curation and documentation do not rank high in the everyday work of a scientist as a data producer. Data producers are often frustrated by being asked to provide metadata on their data over and over again, information that seemed very obvious from the context of their work. A challenge to data archives is the wide variety of metadata schemata in use, which creates a number of maintenance and design challenges of its own. PanMetaDocs addresses these issues by allowing an uploaded file to be described by more than one metadata object. PanMetaDocs, which was developed from PanMetaWorks, is a PHP-based web application that allows data to be described with any XML-based metadata schema. Its user interface is browser-based and was developed to collect metadata and data in collaborative scientific projects situated at one or more institutions. The metadata fields can be filled with static or dynamic content to reduce the number of fields that require manual entries to a minimum and make use of contextual information in a project setting. In the development of PanMetaDocs the business logic of panMetaWorks is reused, except for the authentication and data management functions of PanMetaWorks, which are delegated to the eSciDoc framework. The eSciDoc repository framework is designed as a service-oriented architecture that can be controlled through a

  16. ES-doc-errata: an issue tracker platform for CMIP6

    Science.gov (United States)

    Ben Nasser, Atef; Levavasseur, Guillaume; Greenslade, Mark; Denvil, Sébastien

    2017-04-01

    In the context of overseeing the quality of data, and as a result of the inherent complexity of projects such as CMIP5/6, it is a mandatory task to keep track of the status of datasets and the version evolution they undergo during their life-cycle. The ES-doc-errata project aims to keep track of the issues affecting specific versions of datasets/files. It enables users to resolve the history tree of each dataset/file, enabling a better choice of the data used in their work based on the data status. The ES-doc-errata project has been designed and built on top of the Parent-IDentifiers handle service that will be deployed in the next iteration of the CMIP project, ensuring maximum usability of the ESGF ecosystem, and is encapsulated in the ES-doc structure. Consuming PIDs from the handle service is guided by a specifically built algorithm that extracts meta-data regarding the issues that may or may not affect the quality of datasets/files and cause newer versions to be published replacing older deprecated versions. This algorithm is able to deduce the nature of the flaws down to file granularity, which is of high value to the end-user. This new platform has been designed keeping in mind usability by end-users specialized in the data publishing process or other scientists requiring feedback on the reliability of data required for their work. To this end, a specific set of rules and a code of conduct have been defined. A validation process ensures the quality of the newly introduced errata meta-data, an authentication safeguard was implemented to prevent tampering with the archived data, and a wide variety of tools were put at users' disposal to interact safely with the platform, including a command-line client and a dedicated front-end.

  17. ToxicDocs (www.ToxicDocs.org): from history buried in stacks of paper to open, searchable archives online.

    Science.gov (United States)

    Rosner, David; Markowitz, Gerald; Chowkwanyun, Merlin

    2018-02-01

    As a result of a legal mechanism called discovery, the authors accumulated millions of internal corporate and trade association documents related to the introduction of new products and chemicals into workplaces and commerce. What did these private entities discuss among themselves and with their experts? The plethora of documents, both a blessing and a curse, opened new sources and interesting questions about corporate and regulatory histories. But they also posed an almost insurmountable challenge to historians. Thus emerged ToxicDocs, possible only with a technological innovation known as "Big Data." That refers to the sheer volume of new digital data and to the computational power to analyze them. Users will be able to identify what firms knew (or did not know) about the dangers of toxic substances in their products-and when. The database opens many areas to inquiry including environmental studies, business history, government regulation, and public policy. ToxicDocs will remain a resource free and open to all, anywhere in the world.

  18. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    Science.gov (United States)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
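
    For orientation, the general idea of byte-level n-gram profiling with a simplified profile and similarity measure can be sketched in a few lines; the profile size, n-gram length and the intersection-based similarity below are illustrative choices and may differ from the exact measure proposed in the paper.

      # Sketch of byte-level n-gram profiling for authorship comparison
      # (profile size, n, and the similarity measure are illustrative choices).
      from collections import Counter

      def ngram_profile(source_code: bytes, n: int = 3, profile_size: int = 1500):
          """Return the most frequent byte-level n-grams of a program."""
          grams = Counter(source_code[i:i + n] for i in range(len(source_code) - n + 1))
          return {g for g, _ in grams.most_common(profile_size)}

      def profile_similarity(profile_a, profile_b):
          """Simplified similarity: size of the intersection of the two profiles."""
          return len(profile_a & profile_b)

      def likely_author(unknown: bytes, candidates: dict):
          """candidates maps author name -> concatenated known source code (bytes)."""
          target = ngram_profile(unknown)
          return max(candidates,
                     key=lambda a: profile_similarity(target, ngram_profile(candidates[a])))

      authors = {"alice": b"for (int i = 0; i < n; ++i) { sum += a[i]; }",
                 "bob": b"while(x>0){x--;total=total+x;}"}
      print(likely_author(b"for (int j = 0; j < m; ++j) { acc += b[j]; }", authors))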

  19. Degradation potentials of dissolved organic carbon (DOC) from thawed permafrost peat

    Science.gov (United States)

    Panneer Selvam, Balathandayuthabani; Lapierre, Jean-François; Guillemette, Francois; Voigt, Carolina; Lamprecht, Richard E.; Biasi, Christina; Christensen, Torben R.; Martikainen, Pertti J.; Berggren, Martin

    2017-04-01

    Global warming can substantially affect the export of dissolved organic carbon (DOC) from peat-permafrost to aquatic systems. The direct degradability of such peat-derived DOC, however, is poorly constrained because previous permafrost thaw studies have mainly addressed mineral soil catchments or DOC pools that have already been processed in surface waters. We incubated peat cores from a palsa mire to compare an active layer and an experimentally thawed permafrost layer with regard to DOC composition and degradation potentials of pore water DOC. Our results show that DOC from the thawed permafrost layer had high initial degradation potentials compared with DOC from the active layer. In fact, the DOC that showed the highest bio- and photo-degradability, respectively, originated in the thawed permafrost layer. Our study sheds new light on the DOC composition of peat-permafrost directly upon thaw and suggests that past estimates of carbon-dioxide emissions from thawed peat permafrost may be biased as they have overlooked the initial mineralization potential of the exported DOC.

  20. Dynamics, chemical properties and bioavailability of DOC in an early successional catchment

    Directory of Open Access Journals (Sweden)

    U. Risse-Buhl

    2013-07-01

    Full Text Available The dynamics of dissolved organic carbon (DOC have been intensively studied in mature ecosystems, but little is known about DOC dynamics and the significance of DOC as a substrate for microbial activity in early-successional catchments. We determined the concentration, chemical composition, source, radiocarbon age, and bioavailability of DOC along the hydrological flow path from soil solution to a downstream pond in a recently constructed catchment (Chicken Creek Catchment, Germany. Soil solution, upwelling ground water, stream water, subsurface water in an alluvial fan, and pond water all had high DOC concentrations (averages: 6.0–11.6 mg DOC L–1, despite small carbon stocks in both vegetation and soil of the catchment. Solid-state CPMAS 13C NMR of DOC in upwelling ground water revealed a higher proportion of aromatic compounds (32% and a lower proportion of carbohydrates (33% than in pond water (18% and 45%, respectively. The average 14C age of DOC in upwelling ground water was 2600 to 2900 yr, while organic matter of the Quaternary substrate of the catchment had a 14C age of 3000 to 16 000 yr. Both the 14C age data and 13C NMR spectra suggest that DOC partly derived from organic matter of the Quaternary substrate (about 40 to 90% of the C in the DOC, indicating that both recent and old C of the DOC can support microbial activity during early ecosystem succession. However, in a 70 day incubation experiment, only about 11% of the total DOC was found to be bioavailable. This proportion was irrespective of the water type. Origin of the microbial communities within the catchment (enriched from soil, stream sediment or pond water also had only a marginal effect on overall DOC utilization.

  1. What Technology Skills Do Developers Need? A Text Analysis of Job Listings in Library and Information Science (LIS) from Jobs.code4lib.org

    Directory of Open Access Journals (Sweden)

    Monica Maceli

    2015-09-01

    Full Text Available Technology plays an indisputably vital role in library and information science (LIS) work; this rapidly moving landscape can create challenges for practitioners and educators seeking to keep pace with such change. In pursuit of building our understanding of currently sought technology competencies in developer-oriented positions within LIS, this paper reports the results of a text analysis of a large collection of job listings culled from the Code4lib jobs website. The Code4lib organization began over a decade ago as a popular mailing list covering the intersection of technology and library work; its current offerings include a website that collects and organizes LIS-related technology job listings. The results of the text analysis of this dataset suggest the currently vital technology skills and concepts that existing and aspiring practitioners may target in their continuing education as developers.
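
    A text analysis of job listings of this kind typically reduces to tokenizing each listing and counting occurrences of technology terms; the sketch below shows that generic approach with placeholder listings and a hypothetical term list, not the paper's actual vocabulary or corpus.

      # Generic term-frequency sketch; the listings and term list are placeholders.
      import re
      from collections import Counter

      TECH_TERMS = ["python", "javascript", "sql", "xml", "ruby", "java", "php",
                    "marc", "metadata", "apache", "linux", "drupal", "solr"]

      def term_frequencies(listings):
          counts = Counter()
          for text in listings:
              tokens = set(re.findall(r"[a-z+#]+", text.lower()))
              counts.update(term for term in TECH_TERMS if term in tokens)
          return counts

      listings = [
          "Digital services librarian: Python, SQL and Solr experience required.",
          "Systems developer: Java, XML, MARC records, Linux administration.",
          "Web developer: PHP, JavaScript, Drupal theming, metadata workflows.",
      ]
      print(term_frequencies(listings).most_common())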

  2. Modelling impacts of atmospheric deposition and temperature on long-term DOC trends

    OpenAIRE

    Sawicka, K; Rowe, E. C.; Evans, C.D.; Monteith, D. T.; Vanguelova, E. I.; Wade, A. J.; Clark, J. M.

    2017-01-01

    It is increasingly recognised that widespread and substantial increases in dissolved organic carbon (DOC) concentrations in remote surface and soil waters in recent decades are linked to declining acid deposition. Effects of rising pH and declining ionic strength on DOC solubility have been proposed as potential dominant mechanisms. However, since DOC in these systems is derived mainly from recently-fixed carbon, and since organic matter decomposition rates are considered sensitive to tempe...

  3. Inferring DOC export mechanisms from high-frequency, instream UV-VIS concentration measurements

    Science.gov (United States)

    Oosterwoud, Marieke; Musolff, Andreas; Keller, Toralf; Fleckenstein, Jan

    2015-04-01

    The flux of soil-derived dissolved organic carbon (DOC) is a significant term in terrestrial carbon budgets and, as a result, a dominant link between terrestrial and aquatic ecosystems. Concentrations of dissolved organic carbon in streams and rivers have been increasing in many parts of the world. Providers of drinking water from surface water reservoirs are increasingly facing problems as elevated DOC concentrations cause higher removal costs and can lead to toxic by-products during chlorination. Mitigating these problems requires a mechanistic understanding of the controls and dynamics of DOC export from catchments. High frequency measurements using UV-vis absorbance as a proxy for DOC concentrations allow for improved evaluation of DOC concentration-discharge relationships in catchments. In addition, several UV-vis absorbance proxies (both single and multiple wavelength) can be used as an indicator of DOC quality. These relationships allow quantification of net DOC export, and may additionally provide new insights into the mechanisms that control DOC export dynamics. We aimed to evaluate the response and interaction of DOC concentrations and quality between a riparian zone soil and stream under different hydrological conditions. UV-vis sensors were installed in both the riparian soil and stream of two headwater catchments, the Hassel and Rappbode, in the Harz Mountains in Germany. The two headwater catchments are approximately equal in size; however, they differ in their land-use. The Hassel catchment is dominated by agricultural land-use, whereas the Rappbode catchment is mainly forested. The DOC concentration-discharge relationships show intricate hysteretic behavior, which differs between locations and shifts in time. The rich data-set will allow for a characterization of space and time patterns of DOC export as well as changes in its quality, providing valuable new insights into the hydrologic mechanisms that govern the delivery of DOC to streams.

  4. Linking the terrestrial and aquatic system across scales: The role of connectivity, landscape organization and catchment size for the dynamics of DOC

    Science.gov (United States)

    Laudon, Hjalmar

    2014-05-01

    While the production and export of DOC - dissolved organic carbon - from the terrestrial landscape has been extensively studied during the past decades, mechanistic understanding of the processes that control stream water quality at the soil/water interface, across different spatial scales, is still in its infancy. To improve the process description of DOC regulation, I use data and understanding from three decades of research that has been conducted within the interdisciplinary, multi-scale Krycklan Catchment Study (KCS) in northern Sweden (www.slu.se/Krycklan). KCS consists of 17 intensively long-term monitored catchments ranging over three orders of magnitude in size, from 3 ha to over 6780 ha, to elucidate the dominant hydrobiogeophysical processes regulating the concentration and export of nutrients, metals and organic pollutants. By combining the use of detailed catchment information with natural isotopes and the dynamics of stream biogeochemistry we can directly link variability in hydrological flow pathways, catchment characteristics and scale with the spatial and temporal dynamics of DOC. Our results suggest that the contrasting spatial variability in the flow pathways among the different landscape types has a first order control on the DOC. As a result, large variations in the dynamics of DOC and its quality are observed that vary with changes in hydrological connectivity, landscape organization and catchment size.

  5. Doc2b synchronizes secretion from chromaffin cells by stimulating fast and inhibiting sustained release

    DEFF Research Database (Denmark)

    da Silva Pinheiro, Paulo César; de Wit, Heidi; Walter, Alexander M

    2013-01-01

    Synaptotagmin-1 and -7 constitute the main calcium sensors mediating SNARE-dependent exocytosis in mouse chromaffin cells, but the role of a closely related calcium-binding protein, Doc2b, remains enigmatic. We investigated its role in chromaffin cells using Doc2b knock-out mice and high temporal...... resolution measurements of exocytosis. We found that the calcium dependence of vesicle priming and release triggering remained unchanged, ruling out an obligatory role for Doc2b in those processes. However, in the absence of Doc2b, release was shifted from the readily releasable pool to the subsequent...

  6. Cyclone Codes

    OpenAIRE

    Schindelhauer, Christian; Jakoby, Andreas; Köhler, Sven

    2016-01-01

    We introduce Cyclone codes, which are rateless erasure-resilient codes. They combine Pair codes with Luby Transform (LT) codes by computing a code symbol from a random set of data symbols using bitwise XOR and cyclic shift operations. The number of data symbols is chosen according to the Robust Soliton distribution. XOR and cyclic shift operations establish a unitary commutative ring if data symbols have a length of $p-1$ bits, for some prime number $p$. We consider the graph given by code sym...
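
    The encoding step described here (a degree drawn from the Robust Soliton distribution, then a random set of data symbols combined with cyclic shifts and bitwise XOR) can be illustrated as follows; the distribution parameters, the symbol width and the random shift choices are placeholders, and this is an illustration rather than the authors' exact construction.

      # Illustrative encoder: degree from a robust soliton distribution, then XOR of
      # cyclically shifted data symbols. Parameters c, delta and shifts are placeholders.
      import math
      import random

      def robust_soliton(k, c=0.1, delta=0.5):
          R = c * math.log(k / delta) * math.sqrt(k)
          rho = [0.0] + [1.0 / k] + [1.0 / (i * (i - 1)) for i in range(2, k + 1)]
          tau = [0.0] * (k + 1)
          threshold = int(round(k / R))
          for i in range(1, min(threshold, k + 1)):
              tau[i] = R / (i * k)
          if 1 <= threshold <= k:
              tau[threshold] = R * math.log(R / delta) / k
          beta = sum(rho) + sum(tau)
          return [(rho[i] + tau[i]) / beta for i in range(k + 1)]

      def cyclic_shift(symbol, shift, width):
          """Rotate a width-bit symbol left by `shift` bits."""
          shift %= width
          mask = (1 << width) - 1
          return ((symbol << shift) | (symbol >> (width - shift))) & mask

      def encode_symbol(data, width, dist, rng):
          degree = rng.choices(range(len(dist)), weights=dist, k=1)[0]
          chosen = rng.sample(range(len(data)), degree)
          out = 0
          for idx in chosen:
              out ^= cyclic_shift(data[idx], rng.randrange(width), width)
          return out, chosen

      p = 13                      # symbols are p - 1 = 12 bits wide
      width = p - 1
      rng = random.Random(42)
      data = [rng.getrandbits(width) for _ in range(20)]
      dist = robust_soliton(len(data))
      code_symbol, neighbors = encode_symbol(data, width, dist, rng)
      print(bin(code_symbol), neighbors)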

  7. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to introducing the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
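
    For readers unfamiliar with unique decipherability (UD), which this record generalizes, the classical Sardinas-Patterson test for UD can be sketched compactly; the implementation below is standard background material and is not taken from the paper.

      # Background sketch: Sardinas-Patterson test for unique decipherability.
      def dangling_suffixes(a_set, b_set):
          """Suffixes left over when a word of a_set is a proper prefix of a word of b_set."""
          out = set()
          for a in a_set:
              for b in b_set:
                  if a != b and b.startswith(a):
                      out.add(b[len(a):])
          return out

      def is_uniquely_decipherable(code):
          code = set(code)
          current = dangling_suffixes(code, code)
          seen = set()
          # The code is UD iff no round of dangling suffixes ever contains a codeword.
          while current and not (current & code):
              seen.add(frozenset(current))
              current = dangling_suffixes(code, current) | dangling_suffixes(current, code)
              if frozenset(current) in seen:
                  return True
          return not (current & code)

      print(is_uniquely_decipherable({"0", "01", "11"}))    # True  (a UD code)
      print(is_uniquely_decipherable({"0", "01", "10"}))    # False ("010" has two parsings)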

  8. Sources for increased DOC-concentrations in the groundwater downstream of the landfill Hohne (DEA); Ursachen erhoehter DOC-Konzentrationen im Grundwasserabstrom am Beispiel der Deponie Hohne (DEA)

    Energy Technology Data Exchange (ETDEWEB)

    Bahlmann, E.; Seifert, R. [Hamburg Univ. (Germany). Inst. fuer Geologie; Eschenbach, A.; Kleinschmidt, V. [Hamburg Univ. (Germany). Inst. fuer Bodenkunde

    2017-08-15

    Construction waste together with drilling mud and oil-contaminated soil had been deposited in the landfill Hohne from 1971. Four groundwater monitoring sites had been installed: one monitoring site upstream and three sites downstream of the landfill in three different directions. Downstream of the landfill increased concentrations of chloride, sulphate, sodium and DOC (dissolved organic carbon) had been measured over a period of years. Particularly the source of the DOC has remained unclear. Assumptions were (i) leaking of contaminants from the landfill and degradation under the landfill by microbes or plants or (ii) leaching of DOC from the soil under the landfill caused by a change in the redox potential. The determination of the DOC source was the major subject of this study.

  9. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Dokumentation og evaluering af Coding Class (Documentation and Evaluation of Coding Class).

  10. American Red Cross Digital Operations Center (DigiDOC): an essential emergency management tool for the digital age.

    Science.gov (United States)

    Markenson, David; Howe, Laura

    2014-10-01

    Social media is becoming the first source of information and also the first way to communicate messages. Because social media users will take action based on the information they are seeing, it is important that organizations like the Red Cross be active in the social space. We describe the American Red Cross's concept for a Digital Operations Center (DigiDOC) that we believe should become an essential part of all emergency operations centers and a key piece of all agencies that operate in disasters. The American Red Cross approach is a practical and logical approach that other agencies can use as a model.

  11. Correction: Re: Acknowledgment: A Profile of Coding Staff in Sydney Metropolitan Public Hospitals Health Information Management, Vol 32(2).

    Science.gov (United States)

    McIntosh, Jean; Dimitropoulos, Vera; Bramley, Michelle

    2004-05-01

    The authors would like to thank Adam Bennett, who collected the raw data used in this study for his thesis submitted for the degree of Bachelor of Applied Science (Health Information Management) (Honours) at The University of Sydney.

  12. C-MORE Professional Development Training Program for Graduate Students and Post-Docs

    Science.gov (United States)

    Bruno, B. C.; DeLeo, F.; Bottjer, D.; Jungbluth, S.; Burkhardt, B.; Hawco, N.; Boiteau, R.

    2012-12-01

    The Center for Microbial Oceanography: Research and Education (C-MORE) is a National Science Foundation-sponsored Science and Technology Center. C-MORE comprises six partner institutions: University of Hawaii (headquarters), Massachusetts Institute of Technology, Woods Hole Oceanographic Institution, Oregon State University, University of California at Santa Cruz and Monterey Bay Aquarium Research Institute. C-MORE's Professional Development Training Program is aimed at equipping graduate students and post-docs at all six institutions with the skills and experiences needed to maximize their potential and succeed in their professional careers. This program is administered through the C-MORE Education Office and was developed in close collaboration with graduate students, post-docs, and faculty. This program has formal but flexible requirements. There is only one required module (Outreach). The seven optional modules include: Science Communication, Leadership, Mentoring, Teaching, Research Exchange, Diversity and Proposal Writing. Masters students choose three optional modules; Ph.D. students and post-docs choose five. Most modules consist of a training component, followed by a practical component. All participants will are expected to complete program evaluations. Below are some examples of program offerings: Science Communication Module In partnership with the Communication Partnership for Science and the Sea, C-MORE organized three Science Communication workshops at the University of Hawaii, Monterey Bay Aquarium Research Institute and Massachusetts Institute of Technology. These workshops train participants to distill their research into language that is free of jargon and accessible to a general audience. After the training, participants are asked to produce a communication product based on their research, such as a magazine article, press release, podcast or a blog. Diversity Module To date, C-MORE has organized three teleconferences on diversity, attended by

  13. Doc'CISMEF: a search tool based on "encapsulated" MeSH thesaurus.

    Science.gov (United States)

    Darmoni, S J; Thirion, B; Leroy, J P; Douyère, M; Lacoste, B; Godard, C; Rigolle, I; Brisou, M; Videau, S; Goupy, E; Piot, J; Quéré, M; Ouazir, S; Abdulrab, H

    2001-01-01

    In the year 2000, the Internet became a major source of health information for the health professional and the Netizen. The objective of Doc'CISMeF (D'C) was to create a powerful generic search tool based on a structured information model which 'encapsulates' the MeSH thesaurus to index and retrieve quality health resources on the Internet. To index resources, D'C uses four sections in its information model: 'meta-term', keyword, subheading, and resource type. Two search options are available: simple and advanced. The simple search requires the end-user to input a single term or expression. If this term belongs to the D'C information structure model, it will be exploded. If not, a full-text search is performed. In the advanced search, complex searches are possible combining Boolean operators with meta-terms, keywords, subheadings and resource types. D'C uses two standard tools for organising information: the MeSH thesaurus and the Dublin Core metadata format. Resources included in D'C are described according to the following elements: title, author or creator, subject and keywords, description, publisher, date, resource type, format, identifier, and language.
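    The simple-search behaviour described above (explode a recognised thesaurus term, otherwise fall back to full-text search) can be illustrated with a small dispatch sketch; the data structures and helper names below are hypothetical stand-ins, not part of Doc'CISMeF.

```python
# Hypothetical, minimal illustration of the "simple search" dispatch described above.
MESH_NARROWER = {                       # toy fragment of a MeSH-like hierarchy (assumed data)
    "heart diseases": ["myocardial infarction", "arrhythmia"],
    "myocardial infarction": [],
    "arrhythmia": [],
}

RESOURCES = [                           # toy indexed resources: (title, keywords, text)
    ("MI guideline", {"myocardial infarction"}, "management of acute MI"),
    ("Cardiology portal", {"heart diseases"}, "links on heart diseases"),
    ("Nutrition leaflet", set(), "dietary advice mentioning heart diseases"),
]

def explode(term):
    """Return the term plus all narrower terms (transitively)."""
    out, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in out:
            out.add(t)
            stack.extend(MESH_NARROWER.get(t, []))
    return out

def simple_search(query):
    q = query.lower()
    if q in MESH_NARROWER:              # recognised term -> exploded keyword search
        terms = explode(q)
        return [title for title, kw, _ in RESOURCES if kw & terms]
    return [title for title, _, text in RESOURCES if q in text.lower()]  # full-text fallback

print(simple_search("Heart diseases"))  # exploded: matches both cardiology resources
print(simple_search("dietary"))         # full-text fallback
```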

  14. code {poems}

    Directory of Open Access Journals (Sweden)

    Ishac Bertran

    2012-08-01

    Full Text Available "Exploring the potential of code to communicate at the level of poetry," the code­ {poems} project solicited submissions from code­writers in response to the notion of a poem, written in a software language which is semantically valid. These selections reveal the inner workings, constitutive elements, and styles of both a particular software and its authors.

  15. OntoADR a semantic resource describing adverse drug reactions to support searching, coding, and information retrieval.

    Science.gov (United States)

    Souvignet, Julien; Declerck, Gunnar; Asfari, Hadyl; Jaulent, Marie-Christine; Bousquet, Cédric

    2016-10-01

    Efficient searching and coding in databases that use terminological resources requires that they support efficient data retrieval. The Medical Dictionary for Regulatory Activities (MedDRA) is a reference terminology for several countries and organizations to code adverse drug reactions (ADRs) for pharmacovigilance. Ontologies that are available in the medical domain provide several advantages such as reasoning to improve data retrieval. The field of pharmacovigilance does not yet benefit from a fully operational ontology to formally represent the MedDRA terms. Our objective was to build a semantic resource based on formal description logic to improve MedDRA term retrieval and aid the generation of on-demand custom groupings by appropriately and efficiently selecting terms: OntoADR. The method consists of the following steps: (1) mapping between MedDRA terms and SNOMED-CT, (2) generation of semantic definitions using semi-automatic methods, (3) storage of the resource and (4) manual curation by pharmacovigilance experts. We built a semantic resource for ADRs enabling a new type of semantics-based term search. OntoADR adds new search capabilities relative to previous approaches, overcoming the usual limitations of computation using lightweight description logic, such as the intractability of unions or negation queries, bringing it closer to user needs. Our automated approach for defining MedDRA terms enabled the association of at least one defining relationship with 67% of preferred terms. The curation work performed on our sample showed an error level of 14% for this automated approach. We tested OntoADR in practice, which allowed us to build custom groupings for several medical topics of interest. The methods we describe in this article could be adapted and extended to other terminologies which do not benefit from a formal semantic representation, thus enabling better data retrieval performance. Our custom groupings of MedDRA terms were used while performing signal
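    To make the idea of on-demand custom groupings concrete, the toy sketch below selects MedDRA-style terms whose (hypothetical) semantic definitions include a given SNOMED CT-style finding site; the term list, attribute names and values are invented for illustration and are not taken from OntoADR.

```python
# Toy illustration of building a custom grouping from semantic definitions (invented data).
SEMANTIC_DEFINITIONS = {
    # MedDRA-style preferred term -> set of (attribute, SNOMED-CT-style value) pairs
    "Hepatitis acute":        {("finding site", "Liver structure"), ("morphology", "Inflammation")},
    "Hepatic failure":        {("finding site", "Liver structure")},
    "Renal failure acute":    {("finding site", "Kidney structure")},
    "Myocardial infarction":  {("finding site", "Heart structure"), ("morphology", "Infarct")},
}

def custom_grouping(attribute, value):
    """Return all terms whose semantic definition asserts `attribute = value`."""
    return sorted(term for term, defn in SEMANTIC_DEFINITIONS.items()
                  if (attribute, value) in defn)

# e.g. an on-demand grouping of hepatic disorders:
print(custom_grouping("finding site", "Liver structure"))
# -> ['Hepatic failure', 'Hepatitis acute']
```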

  16. Intelligent information loss: the coding of facial identity, head pose, and non-face information in the macaque face patch system.

    Science.gov (United States)

    Meyers, Ethan M; Borzello, Mia; Freiwald, Winrich A; Tsao, Doris

    2015-05-06

    Faces are a behaviorally important class of visual stimuli for primates. Recent work in macaque monkeys has identified six discrete face areas where most neurons have higher firing rates to images of faces compared with other objects (Tsao et al., 2006). While neurons in these areas appear to have different tuning (Freiwald and Tsao, 2010; Issa and DiCarlo, 2012), exactly what types of information and, consequently, which visual behaviors neural populations within each face area can support, is unknown. Here we use population decoding to better characterize three of these face patches (ML/MF, AL, and AM). We show that neural activity in all patches contains information that discriminates between the broad categories of face and nonface objects, individual faces, and nonface stimuli. Information is present in both high and lower firing rate regimes. However, there were significant differences between the patches, with the most anterior patch showing relatively weaker representation of nonface stimuli. Additionally, we find that pose-invariant face identity information increases as one moves to more anterior patches, while information about the orientation of the head decreases. Finally, we show that all the information we can extract from the population is present in patterns of activity across neurons, and there is relatively little information in the total activity of the population. These findings give new insight into the representations constructed by the face patch system and how they are successively transformed. Copyright © 2015 the authors 0270-6474/15/357069-13$15.00/0.
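    Population decoding of the kind described here is commonly carried out with a cross-validated linear classifier over trial-by-neuron firing-rate matrices; the sketch below runs such an analysis on simulated data (the simulated rates and the choice of classifier are illustrative assumptions, not the authors' pipeline).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulate a population of 50 neurons over 200 trials with 4 stimulus identities.
n_trials, n_neurons, n_stimuli = 200, 50, 4
labels = rng.integers(0, n_stimuli, size=n_trials)
tuning = rng.normal(0, 1, size=(n_stimuli, n_neurons))                 # rate pattern per stimulus
rates = tuning[labels] + rng.normal(0, 2, size=(n_trials, n_neurons))  # noisy single-trial rates

# Cross-validated decoding accuracy (chance = 1/4).
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, rates, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = {1 / n_stimuli:.2f})")
```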

  17. Modelling impacts of atmospheric deposition and temperature on long-term DOC trends

    NARCIS (Netherlands)

    Sawicka, Kasia; Rowe, E.C.; Evans, C.D.; Monteith, D.T.; Vanguelova, E.I.; Wade, A.J.; Clark, J.M.

    2017-01-01

    It is increasingly recognised that widespread and substantial increases in Dissolved organic carbon (DOC) concentrations in remote surface, and soil, waters in recent decades are linked to declining acid deposition. Effects of rising pH and declining ionic strength on DOC solubility have been

  18. Implementing Gmail Docs and Blogs for Enhancing Motivation towards Writing in English

    Science.gov (United States)

    Gomez Zapata, Julian Esteban

    2010-01-01

    This action research paper dealt with how to increase motivation towards writing in English through blogs and Gmail docs in a private school in Medellín, Colombia. It was necessary to explore the concepts of "social interaction," "motivation" and "reasons for writing" to understand how blogs and Gmail docs favored…

  19. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  20. Development of an Information Delivery Manual for Early Stage BIM-based Energy Performance Assessment and Code Compliance as a Part of DGNB Pre-Certification

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Romanska, Iva; Stamenov, Martin

    2017-01-01

    for all parties involved. However, the persistent lack of early collaboration and process standardization prevent reaching the full potential of BIM-based performance evaluation. By following buildingSMART’s methodology for development of Information Delivery Manual/Model View Definition, this paper...... presents a framework for BIM-based energy performance assessment and code compliance, as required by the Danish Building Regulations and the DGNB rating system. Standardization of the information exchange would increase efficiency and reduce manual data input, duplication of work and errors due......The evolvement of integrated practices utilizing Building Performance Simulations has made it possible to address the growing needs of the building design. Furthermore, including a sustainability rating system in the early stages ensures a superior environmental performance and a common goal...

  1. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Directory of Open Access Journals (Sweden)

    Thomas eJahans-Price

    2014-03-01

    We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn, and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with integration of sensory evidence used to inform decision-making.

  2. The Languages of Neurons: An Analysis of Coding Mechanisms by Which Neurons Communicate, Learn and Store Information

    Directory of Open Access Journals (Sweden)

    Morris H. Baslow

    2009-11-01

    In this paper evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections. While information processing in the brain is highly complex, each neuron uses a simple mechanism for transmitting information. This is in the form of temporal electrophysiological action potentials or spikes (S) operating on a millisecond timescale that, along with pauses (P) between spikes, constitute a two-letter “alphabet” that generates meaningful frequency-encoded signals or neuronal S/P “words” in a primary language. However, when a word from an afferent neuron enters the dendritic-synaptic-dendritic field between two neurons, it is translated into a new frequency-encoded word with the same meaning, but in a different spike-pause language, that is delivered to and understood by the efferent neuron. It is suggested that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function in that it allows for variations in meaning to occur. Thus, structural or biochemical changes in dendrites or synapses can produce novel words in the second language that have changed meanings, allowing for a specific signaling experience, either external or internal, to modify the meaning of an original word (learning), and to store the learned information of that experience (memory) in the form of an altered dendritic-synaptic-dendritic field.

  3. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.
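    As a minimal illustration of the blurb's point that a simple code can be both applied and broken, here is a Caesar-shift encoder and a brute-force "cracker" (an elementary textbook example, not drawn from the book itself).

```python
import string

ALPHABET = string.ascii_lowercase

def caesar(text, shift):
    """Shift each letter by `shift` positions (a classical, easily broken cipher)."""
    table = str.maketrans(ALPHABET, ALPHABET[shift:] + ALPHABET[:shift])
    return text.lower().translate(table)

ciphertext = caesar("attack at dawn", 7)
print(ciphertext)                       # 'haahjr ha khdu'

# Breaking it is trivial: try all 26 shifts and inspect the candidates.
for s in range(26):
    print(s, caesar(ciphertext, -s))
```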

  4. Characterizing adsorptive properties and DOC concentrations in soils of Northern European Russian tundra and taiga.

    Science.gov (United States)

    Oosterwoud, Marieke; Temminghoff, Erwin; van der Zee, Sjoerd

    2010-05-01

    Subarctic river basins have an enormous potential to mobilize and transport terrestrial OC to the Arctic Ocean, because 23-48% of the worlds soils organic carbon (SOC) is stored in the high latitude region. Currently the Arctic drainage basin (~24 x 106 km2) processes about 11% of the global dissolved organic carbon (DOC), which is exported to the ocean. About 10-25% of annual C input to the organic surface layer with litter is leached from the organic surface layers. As climate changes, the amount and chemical composition of DOC exported from these basins are expected to change. Adsorption of DOC on mineral phases is the key geochemical process for the release and removal of DOC from this potentially soluble carbon pool. Most DOC leached from organic horizons is adsorbed and retained in the subsoils. The adsorption depends much on the content of sesquioxides and amount of carbon previously accumulated in soils. Besides adsorption, polyvalent metal ions in solution, such as Al and Ca, can cause precipitation of DOC. Along with the decrease of DOC concentrations on its passage through mineral soil, there are major biochemical alterations of DOC composition. Hydrophobic compounds (humic and fulvic acids) of high molecular weight that are rich in acidic functional groups and aromatic compounds adsorb most strongly. Hydrophilic compounds can contribute to DOC adsorption but are also easily desorbed because of the weaker bonding strength. The aim of this study was to characterize the DOC concentrations and their chemical composition as well as the DOC adsorptive properties of soils found in a tundra and taiga catchment of Northern Russia. We sampled soil and soil solution from two catchments in the Komi Republic of European Northern Russia: a tundra (67N/62E) and a taiga (62N/50E). The soil samples were analysed for total organic carbon (Ct) and the content of sequioxides. By extracting soil samples with water we got an impression of the potentially extractable organic

  5. Modelling impacts of temperature, and acidifying and eutrophying deposition on DOC trends

    Science.gov (United States)

    Sawicka, Kasia; Rowe, Ed; Evans, Chris; Monteith, Don; Vanguelova, Elena; Wade, Andrew; Clark, Joanna

    2017-04-01

    Surface water dissolved organic carbon (DOC) concentrations in large parts of the northern hemisphere have risen over the past three decades, raising concern about enhanced contributions of carbon to the atmosphere and seas and oceans. The effect of declining acid deposition has been identified as a key control on DOC trends in soil and surface waters, since pH and ionic strength affect sorption and desorption of DOC. However, since DOC is derived mainly from recently-fixed carbon, and organic matter decomposition rates are considered sensitive to temperature, uncertainty persists regarding the extent to the relative importance of different drivers that affect these upward trends. We ran the dynamic model MADOC (Model of Acidity and Soil Organic Carbon) for a range of UK soils (podzols, gleysols and peatland), for which the time-series were available, to consider the likely relative importance of decreased deposition of sulphate and chloride, accumulation of reactive N, and higher temperatures, on DOC production in different soils. Modelled patterns of DOC change generally agreed favourably with measurements collated over 10-20 years, but differed markedly between sites. While the acidifying effect of sulphur deposition appeared to be the predominant control on the observed soil water DOC trends in all the soils considered other than a blanket peat, the model suggested that over the long term, the effects of nitrogen deposition on N-limited soils may have been sufficient to elevate the DOC recovery trajectory significantly. The second most influential cause of rising DOC in the model simulations was N deposition in ecosystems that are N-limited and respond with stimulated plant growth. Although non-marine chloride deposition made some contribution to acidification and recovery, it was not amongst the main drivers of DOC change. Warming had almost no effect on modelled historic DOC trends, but may prove to be a significant driver of DOC in future via its influence

  6. Sharing code.

    Science.gov (United States)

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  7. Divergence coding for convolutional codes

    Directory of Open Access Journals (Sweden)

    Valery Zolotarev

    2017-01-01

    In this paper we propose a new coding/decoding scheme based on the divergence principle. A new divergent multithreshold decoder (MTD) for convolutional self-orthogonal codes contains two threshold elements. The second threshold element decodes the code with a code distance one greater than that of the first threshold element. The error-correcting capability of the new MTD modification is higher than that of the traditional MTD. Simulation results show that the divergent schemes bring the region of effective operation approximately 0.5 dB closer to channel capacity. Note that if a sufficiently effective Viterbi decoder is used instead of the first threshold element, the divergence principle can achieve even more. Index Terms — error-correcting coding, convolutional code, decoder, multithreshold decoder, Viterbi algorithm.

  8. The effect of drought on dissolved organic carbon (DOC release from peatland soil and vegetation sources

    Directory of Open Access Journals (Sweden)

    J. P. Ritson

    2017-06-01

    Drought conditions are expected to increase in frequency and severity as the climate changes, representing a threat to carbon sequestered in peat soils. Downstream water treatment works are also at risk of regulatory compliance failures and higher treatment costs due to the increase in riverine dissolved organic carbon (DOC) often observed after droughts. More frequent droughts may also shift dominant vegetation in peatlands from Sphagnum moss to more drought-tolerant species. This paper examines the impact of drought on the production and treatability of DOC from four vegetation litters (Calluna vulgaris, Juncus effusus, Molinia caerulea and Sphagnum spp.) and a peat soil. We found that mild droughts caused a 39.6 % increase in DOC production from peat and that peat DOC that had been exposed to oxygen was harder to remove by conventional water treatment processes (coagulation/flocculation). Drought had no effect on the amount of DOC production from vegetation litters; however large variation was observed between typical peatland species (Sphagnum and Calluna) and drought-tolerant grassland species (Juncus and Molinia), with the latter producing more DOC per unit weight. This would therefore suggest the increase in riverine DOC often observed post-drought is due entirely to soil microbial processes and DOC solubility rather than litter layer effects. Long-term shifts in species diversity may, therefore, be the most important impact of drought on litter layer DOC flux, whereas pulses related to drought may be observed in peat soils and are likely to become more common in the future. These results provide evidence in support of catchment management which increases the resilience of peat soils to drought, such as ditch blocking to raise water tables.

  9. The effect of drought on dissolved organic carbon (DOC) release from peatland soil and vegetation sources

    Science.gov (United States)

    Ritson, Jonathan P.; Brazier, Richard E.; Graham, Nigel J. D.; Freeman, Chris; Templeton, Michael R.; Clark, Joanna M.

    2017-06-01

    Drought conditions are expected to increase in frequency and severity as the climate changes, representing a threat to carbon sequestered in peat soils. Downstream water treatment works are also at risk of regulatory compliance failures and higher treatment costs due to the increase in riverine dissolved organic carbon (DOC) often observed after droughts. More frequent droughts may also shift dominant vegetation in peatlands from Sphagnum moss to more drought-tolerant species. This paper examines the impact of drought on the production and treatability of DOC from four vegetation litters (Calluna vulgaris, Juncus effusus, Molinia caerulea and Sphagnum spp.) and a peat soil. We found that mild droughts caused a 39.6 % increase in DOC production from peat and that peat DOC that had been exposed to oxygen was harder to remove by conventional water treatment processes (coagulation/flocculation). Drought had no effect on the amount of DOC production from vegetation litters; however large variation was observed between typical peatland species (Sphagnum and Calluna) and drought-tolerant grassland species (Juncus and Molinia), with the latter producing more DOC per unit weight. This would therefore suggest the increase in riverine DOC often observed post-drought is due entirely to soil microbial processes and DOC solubility rather than litter layer effects. Long-term shifts in species diversity may, therefore, be the most important impact of drought on litter layer DOC flux, whereas pulses related to drought may be observed in peat soils and are likely to become more common in the future. These results provide evidence in support of catchment management which increases the resilience of peat soils to drought, such as ditch blocking to raise water tables.

  10. Modelling impacts of atmospheric deposition and temperature on long-term DOC trends.

    Science.gov (United States)

    Sawicka, K; Rowe, E C; Evans, C D; Monteith, D T; E I Vanguelova; Wade, A J; J M Clark

    2017-02-01

    It is increasingly recognised that widespread and substantial increases in Dissolved organic carbon (DOC) concentrations in remote surface, and soil, waters in recent decades are linked to declining acid deposition. Effects of rising pH and declining ionic strength on DOC solubility have been proposed as potential dominant mechanisms. However, since DOC in these systems is derived mainly from recently-fixed carbon, and since organic matter decomposition rates are considered sensitive to temperature, uncertainty persists over the extent to which other drivers that could influence DOC production. Such potential drivers include fertilisation by nitrogen (N) and global warming. We therefore ran the dynamic soil chemistry model MADOC for a range of UK soils, for which time series data are available, to consider the likely relative importance of decreased deposition of sulphate and chloride, accumulation of reactive N, and higher temperatures, on soil DOC production in different soils. Modelled patterns of DOC change generally agreed favourably with measurements collated over 10-20years, but differed markedly between sites. While the acidifying effect of sulphur deposition appeared to be the predominant control on the observed soil water DOC trends in all the soils considered other than a blanket peat, the model suggested that over the long term, the effects of nitrogen deposition on N-limited soils may have been sufficient to raise the "acid recovery DOC baseline" significantly. In contrast, reductions in non-marine chloride deposition and effects of long term warming appeared to have been relatively unimportant. The suggestion that future DOC concentrations might exceed preindustrial levels as a consequence of nitrogen pollution has important implications for drinking water catchment management and the setting and pursuit of appropriate restoration targets, but findings still require validation from reliable centennial-scale proxy records, such as those being developed

  11. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Order, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  12. Indices for Testing Neural Codes

    OpenAIRE

    Jonathan D. Victor; Nirenberg, Sheila

    2008-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is ...
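    The comparison described here, information carried by a hypothesized code versus information carried by the behavioural response, can be sketched with a toy discrete example using plug-in mutual information estimates; the simulated data and the use of sklearn's mutual_info_score are illustrative choices, not the authors' estimator.

```python
import numpy as np
from sklearn.metrics import mutual_info_score  # plug-in MI for discrete variables (in nats)

rng = np.random.default_rng(1)

# Toy experiment: 4 stimuli, 1000 trials.
stimulus = rng.integers(0, 4, size=1000)

# A hypothesized neural code: a noisy discrete readout of the stimulus.
code = np.where(rng.random(1000) < 0.8, stimulus, rng.integers(0, 4, size=1000))

# The behavioural response: derived from the code, with additional noise.
behaviour = np.where(rng.random(1000) < 0.9, code, rng.integers(0, 4, size=1000))

i_stim_code = mutual_info_score(stimulus, code)
i_stim_behaviour = mutual_info_score(stimulus, behaviour)

# If the code accounts for the behaviour, I(stimulus; code) should be at least
# as large as I(stimulus; behaviour) (data processing inequality).
print(f"I(stimulus; code)      = {i_stim_code:.3f} nats")
print(f"I(stimulus; behaviour) = {i_stim_behaviour:.3f} nats")
```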

  13. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...... development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech......; alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market’s emptying out of possibilities for free...

  14. Organic amendments' dissolved organic carbon influences bioavailability of agricultural soil DOC

    Science.gov (United States)

    Straathof, Angela L.; Chincarini, Riccardo; Hoffland, Ellis; Comans, Rob N. J.

    2013-04-01

    Agricultural soils benefit from additions of organic amendments because they improve soil structure, are a source of plant nutrients, and increase concentrations of soil organic carbon (SOC). The latter fuels microbial processes important for plant growth, including nutrient mineralization and the suppression of plant diseases. However, these amendment additions range in quality and quantity of C and little is known about how their properties interact with native soil C and affect turnover. The dissolved pool of SOC (DOC) may be the most important C source for these processes as it is more biologically available and thus relatively easily turned over by the soil microbial biomass. Using a rapid-batch DOC fractionation procedure, we studied the composition of different organic amendments' DOC pools and measured how their additions change the quantity and turnover of soil DOC. Fractions isolated and quantified with this procedure include humic and fulvic acids, hydrophobic neutral and hydrophilic compounds. We hypothesized that these range from biologically recalcitrant to readily available, respectively. Amendments analysed included composts of different source materials and maturation stages collected from two different compost facilities in the Netherlands. Both total DOC concentrations and proportions of the aforementioned fractions ranged highly between composts. Composts cured for >10 days had a lower proportion of hydrophilic C compounds, suggesting that these are the most bioavailable and released as CO2 via microbial activity during maturation. To measure the effects of compost DOC on soil DOC, we extracted the former and added it to a sandy soil in an incubation experiment. The amendment increased soil total DOC, CO2 production from the soil, and the pools of humic and fulvic acids as a proportion of total DOC. Turnover of C from the incubated soil was measured by substrate-induced CO2 production (an indicator of microbial activity) from a 96-well

  15. The Dimensional Obsessive-Compulsive Scale: Development and Validation of a Short Form (DOCS-SF)

    OpenAIRE

    Eilertsen, Thomas; Hansen, Bjarne; Kvale, Gerd; Jonathan S. Abramowitz; Holm, Silje E. H.; Solem, Stian

    2017-01-01

    Accurately and reliably measuring the presence and severity of Obsessive-Compulsive Disorder (OCD) symptoms is essential for both routine clinical work and research. The current study investigated psychometric properties of the dimensional obsessive-compulsive scale-short form (DOCS-SF). DOCS-SF was developed and validated in Norwegian. DOCS-SF contains a checklist with four symptom categories and five severity items scored on a zero to eight scale yielding a total score of 0–40. Data were co...

  16. The Dimensional Obsessive-Compulsive Scale: Development and Validation of a Short Form (DOCS-SF)

    OpenAIRE

    Thomas Eilertsen; Bjarne Hansen; Gerd Kvale; Jonathan S. Abramowitz; Holm, Silje E. H.; Stian Solem

    2017-01-01

    Accurately and reliably measuring the presence and severity of Obsessive-Compulsive Disorder (OCD) symptoms is essential for both routine clinical work and research. The current study investigated psychometric properties of the dimensional obsessive-compulsive scale-short form (DOCS-SF). DOCS-SF was developed and validated in Norwegian. DOCS-SF contains a checklist with four symptom categories and five severity items scored on a zero to eight scale yielding a total score of 0–40. Data were co...

  17. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  18. Continuous speech recognition with sparse coding

    CSIR Research Space (South Africa)

    Smit, WJ

    2009-04-01

    Sparse coding is an efficient way of coding information. In a sparse code most of the code elements are zero; very few are active. Sparse codes are intended to correspond to the spike trains with which biological neurons communicate. In this article...
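    A minimal way to make the "few active elements" idea concrete is ISTA (iterative soft-thresholding) for the LASSO sparse-coding problem; the dictionary, signal and step size below are toy assumptions, and this is generic background rather than the speech-recognition system described in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dictionary D (20-dim signals, 50 atoms) and a signal built from 3 active atoms.
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)
true_code = np.zeros(50)
true_code[[3, 17, 42]] = [1.5, -2.0, 1.0]
x = D @ true_code + 0.01 * rng.normal(size=20)

def ista(D, x, lam=0.1, n_iter=500):
    """Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return a

code = ista(D, x)
print("active elements:", np.flatnonzero(np.abs(code) > 1e-3))  # sparse: only a few nonzero
```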

  19. Time-Varying Space-Only Codes for Coded MIMO

    CERN Document Server

    Duyck, Dieter; Takawira, Fambirai; Boutros, Joseph J; Moeneclaey, Marc

    2012-01-01

    Multiple antenna (MIMO) devices are widely used to increase reliability and information bit rate. Optimal error rate performance (full diversity and large coding gain), for unknown channel state information at the transmitter and for maximal rate, can be achieved by approximately universal space-time codes, but comes at a price of large detection complexity, infeasible for most practical systems. We propose a new coded modulation paradigm: error-correction outer code with space-only but time-varying precoder (as inner code). We refer to the latter as Ergodic Mutual Information (EMI) code. The EMI code achieves the maximal multiplexing gain and full diversity is proved in terms of the outage probability. Contrary to most of the literature, our work is not based on the elegant but difficult classical algebraic MIMO theory. Instead, the relation between MIMO and parallel channels is exploited. The theoretical proof of full diversity is corroborated by means of numerical simulations for many MIMO scenarios, in te...

  20. Google Docs as a Tool for Collaborative Writing in the Middle School Classroom

    National Research Council Canada - National Science Library

    Yanan Fan; Megan P Woodrich

    2017-01-01

    .... To be exact, the paper discusses whether student participation in anonymous collaborative writing via Google Docs can lead to more successful products in a linguistically diverse eighth-grade English...

  1. Addiction Drug Underused by Primary Care Docs in U.S.

    Science.gov (United States)

    Addiction Drug Underused by Primary Care Docs in U.S. Buprenorphine is prescribed to get ... opioid use disorder that's approved for prescription by primary care physicians, allowing treatment in the privacy of a ...

  2. Variable relationships of DOC with oxygen in the northwestern Indian Ocean and their ecological implications

    Digital Repository Service at National Institute of Oceanography (India)

    Rajendran, A.; DileepKumar, M.; Ramaiah, N.; Ittekkot, V.; Desai, B.N.

    The relationships between DOC and AOU in the northwestern Indian Ocean show regional variations reflecting the different biological characteristics dominating the respective zones, resulting in variable percentages of DOM respiration through nitrate...

  3. Emergency medicine residents' beliefs about contributing to a Google DocsTM presentation: a survey protocol

    Directory of Open Access Journals (Sweden)

    Patrick Archambault

    2011-07-01

    Conclusion: To our knowledge, this study will be the first to use a theory-based framework to identify healthcare trainees' salient beliefs concerning their decision whether to contribute to an online collaborative writing project using Google Docs™.

  4. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  5. Coding labour

    National Research Council Canada - National Science Library

    McCosker, Anthony; Milne, Esther

    2014-01-01

    ... software. Code encompasses the laws that regulate human affairs and the operation of capital, behavioural mores and accepted ways of acting, but it also defines the building blocks of life as DNA...

  6. Quantifying tropical peatland dissolved organic carbon (DOC) using UV-visible spectroscopy.

    Science.gov (United States)

    Cook, Sarah; Peacock, Mike; Evans, Chris D; Page, Susan E; Whelan, Mick J; Gauci, Vincent; Kho, Lip Khoon

    2017-05-15

    UV-visible spectroscopy has been shown to be a useful technique for determining dissolved organic carbon (DOC) concentrations. However, at present we are unaware of any studies in the literature that have investigated the suitability of this approach for DOC in water samples from tropical peatlands, although some work has been performed in other tropical environments. We used water samples from two oil palm estates in Sarawak, Malaysia to: i) investigate the suitability of both single and two-wavelength proxies for tropical DOC determination; ii) develop a calibration dataset and set of parameters to calculate DOC concentrations indirectly; iii) provide tropical researchers with guidance on the best spectrophotometric approaches to use in future analyses of DOC. Both single and two-wavelength model approaches performed well with no one model significantly outperforming the other. The predictive ability of the models suggests that UV-visible spectroscopy is both a viable and low cost method for rapidly analyzing DOC in water samples immediately post-collection, which can be important when working at remote field sites with access to only basic laboratory facilities. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
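    The single- and two-wavelength proxy approach amounts to fitting simple regression models of DOC concentration on absorbance; the sketch below fits both on synthetic calibration data (the wavelengths 254 nm and 350 nm and all numbers are illustrative assumptions, not the study's calibration).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic calibration set: absorbance at two wavelengths and measured DOC (mg/L).
n = 40
abs254 = rng.uniform(0.05, 1.2, n)
abs350 = 0.4 * abs254 + rng.normal(0, 0.02, n)
doc = 25 * abs254 - 10 * abs350 + rng.normal(0, 0.5, n)

# Single-wavelength proxy: DOC ~ A254
single = LinearRegression().fit(abs254.reshape(-1, 1), doc)

# Two-wavelength proxy: DOC ~ A254 + A350
two = LinearRegression().fit(np.column_stack([abs254, abs350]), doc)

print(f"single-wavelength R^2: {single.score(abs254.reshape(-1, 1), doc):.3f}")
print(f"two-wavelength    R^2: {two.score(np.column_stack([abs254, abs350]), doc):.3f}")

# Once calibrated, new samples only need a spectrophotometer reading:
print("predicted DOC:", two.predict([[0.6, 0.25]]))
```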

  7. 'Dancing on a thin line': evaluation of an infant feeding information team to implement the WHO code of marketing of breast-milk substitutes.

    Science.gov (United States)

    Dykes, Fiona; Richardson-Foster, Helen; Crossland, Nicola; Thomson, Gill

    2012-12-01

    The aim was to conduct an in-depth evaluation of the Infant Feeding Information Team (IFIT), set up to implement the WHO Code of Marketing of Breast-milk Substitutes in North West England. The evaluation included consultations with inter-disciplinary professionals to explore their perceptions of the IFIT and related contextual issues. The design was a qualitative, descriptive study involving seven focus groups (n=34) and semi-structured, in-depth interviews (face to face or via telephone; n=68) with a total of 102 participants. Thematic networks analysis was conducted to generate global, organising and basic themes. The setting comprised two maternity/primary health-care facilities located in the North-West of England. Six global themes were generated; this paper focuses upon one of these themes: 'Dancing on a thin line'. This reflects the difficulties health-care staff face in negotiating political, professional and socio-cultural influences on infant feeding practices and how they struggle to implement best available evidence, guidance and practice when they experience incomplete, conflicting and competing messages around infant feeding. IFIT offers an innovative means to sustain contact with the formula industry without giving it unprecedented access to health facilities or personnel. Focused training opportunities should be provided to enable health-care staff to appreciate the constituent limitations of artificial milks and provide consistent, sensitive and comprehensive infant feeding information. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Beam-dynamics codes used at DARHT

    Energy Technology Data Exchange (ETDEWEB)

    Ekdahl, Jr., Carl August [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production, the Tricomp Trak orbit tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration, the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; and for coasting-beam transport to target, the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  9. Developing palaeolimnological records of organic content (DOC and POC) using the UK Acid Water Monitoring Network sites

    Science.gov (United States)

    Russell, Fiona; Chiverrell, Richard; Boyle, John

    2016-04-01

    observed trends in DOC of surface waters. Analysis of these cores and various calibration materials (e.g. peat) suggests plant tissue undergoes pyrolysis at lower temperatures, and though humic substances can be generated in the lake, this thermal phase may be a proxy record for catchment-derived DOC. NIR and FTIR spectrometry data further characterise this organic phase and identify spectral structures that also correlate with monitored DOC. Together the pyrolysis, NIR, FTIR and XRF geochemistry (e.g. Fe/Mn, Si/Al ratios) data also provide information on lake productivity, biogenic silica and mass accumulation rates. To explore the longer timescale, equivalent proxy records have been trialled at Llyn Cwm Mynach and show possible phases of elevated DOC fluxes from catchment soils during the Holocene.

  10. Regulation of stream water dissolved organic carbon (DOC concentrations during snowmelt; the role of discharge, winter climate and memory effects

    Directory of Open Access Journals (Sweden)

    A. Ågren

    2010-09-01

    Using a 15 year stream record from a northern boreal catchment, we demonstrate that the inter-annual variation in dissolved organic carbon (DOC) concentrations during snowmelt was related to discharge, winter climate and previous DOC export. A short and intense snowmelt gave higher stream water DOC concentrations, as did long winters, while a high previous DOC export during the antecedent summer and autumn resulted in lower concentrations during the following spring. By removing the effect of discharge we could detect that the length of winter affected the modeled soil water DOC concentrations during the following snowmelt period, which in turn affected the concentrations in the stream. Winter climate explained more of the stream water DOC variations than previous DOC export during the antecedent summer and autumn.

  11. JCSC_129_02_0193_0202_SI.doc

    Indian Academy of Sciences (India)

    SUPPLEMENTARY INFORMATION. Study of behaviour of Ni(III) macrocyclic complexes in acidic aqueous medium through kinetic measurement involving hydrogen peroxide oxidation and DFT calculations. ANURADHA SANKARAN,a,b E J PADMA MALARc,* and VENKATAPURAM RAMANUJAM VIJAYARAGHAVANa,*.

  12. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate...... (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  13. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  14. A systematic examination of the relationships between CDOM and DOC in inland waters in China

    Science.gov (United States)

    Song, Kaishan; Zhao, Ying; Wen, Zhidan; Fang, Chong; Shang, Yingxin

    2017-10-01

    Chromophoric dissolved organic matter (CDOM) plays a vital role in the biogeochemical cycle in aquatic ecosystems. The relationship between CDOM and dissolved organic carbon (DOC) has been investigated, and this significant relationship lays the foundation for the estimation of DOC using remotely sensed imagery data. The current study examined samples from freshwater lakes, saline lakes, rivers and streams, urban water bodies, and ice-covered lakes in China for tracking the variation of the relationships between DOC and CDOM. The regression model slopes for DOC vs. aCDOM(275) ranged from extremely low 0.33 (highly saline lakes) to 1.03 (urban waters) and 3.01 (river waters). The low values were observed in saline lake waters and waters from semi-arid or arid regions, where strong photobleaching is expected due to less cloud cover, longer water residence time, and daylight hours. In contrast, high values were found in waters developed in wetlands or forest in Northeast China, where more organic matter was transported from catchment to waters. The study also demonstrated that closer relationships between CDOM and DOC were revealed when aCDOM(275) values were sorted by the ratio of aCDOM(250)/aCDOM(365), which is a measure of CDOM absorption with respect to its composition, and the coefficient of determination of the regression models ranged from 0.79 to 0.98 for different groups of waters. Our results indicate the relationships between CDOM and DOC are variable for different inland waters; thus, models for DOC estimation through linking with CDOM absorption need to be tailored according to water types.
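    The finding that DOC–aCDOM(275) regressions tighten when samples are first binned by the aCDOM(250)/aCDOM(365) ratio can be illustrated with a short grouping-then-regression sketch; the synthetic data and the bin edges used here are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic samples: absorption at 275 nm, the a250/a365 ratio, and DOC, with the
# DOC-aCDOM slope depending on CDOM composition (expressed through the ratio).
n = 300
a275 = rng.uniform(1, 30, n)
ratio = rng.uniform(4, 12, n)                  # aCDOM(250)/aCDOM(365)
slope_true = 0.3 + 0.08 * ratio                # composition-dependent slope (toy model)
doc = slope_true * a275 + rng.normal(0, 0.8, n)

# Pooled regression versus regressions within ratio bins.
pooled = stats.linregress(a275, doc)
print(f"pooled: slope={pooled.slope:.2f}, r^2={pooled.rvalue**2:.2f}")

bins = [4, 6, 8, 10, 12]                       # assumed bin edges
for lo, hi in zip(bins[:-1], bins[1:]):
    m = (ratio >= lo) & (ratio < hi)
    fit = stats.linregress(a275[m], doc[m])
    print(f"ratio {lo}-{hi}: slope={fit.slope:.2f}, r^2={fit.rvalue**2:.2f}")
```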

  15. A systematic examination of the relationships between CDOM and DOC in inland waters in China

    Directory of Open Access Journals (Sweden)

    K. Song

    2017-10-01

    Chromophoric dissolved organic matter (CDOM) plays a vital role in the biogeochemical cycle in aquatic ecosystems. The relationship between CDOM and dissolved organic carbon (DOC) has been investigated, and this significant relationship lays the foundation for the estimation of DOC using remotely sensed imagery data. The current study examined samples from freshwater lakes, saline lakes, rivers and streams, urban water bodies, and ice-covered lakes in China for tracking the variation of the relationships between DOC and CDOM. The regression model slopes for DOC vs. aCDOM(275) ranged from extremely low 0.33 (highly saline lakes) to 1.03 (urban waters) and 3.01 (river waters). The low values were observed in saline lake waters and waters from semi-arid or arid regions, where strong photobleaching is expected due to less cloud cover, longer water residence time, and daylight hours. In contrast, high values were found in waters developed in wetlands or forest in Northeast China, where more organic matter was transported from catchment to waters. The study also demonstrated that closer relationships between CDOM and DOC were revealed when aCDOM(275) values were sorted by the ratio of aCDOM(250)/aCDOM(365), which is a measure of CDOM absorption with respect to its composition, and the coefficient of determination of the regression models ranged from 0.79 to 0.98 for different groups of waters. Our results indicate the relationships between CDOM and DOC are variable for different inland waters; thus, models for DOC estimation through linking with CDOM absorption need to be tailored according to water types.

  16. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  17. Accuracy of clinical data entry when using a computerized decision support system: a case study with OncoDoc2.

    Science.gov (United States)

    Séroussi, Brigitte; Blaszka-Jaulerry, Brigitte; Zelek, Laurent; Lefranc, Jean-Pierre; Conforti, Rosa; Spano, Jean-Philippe; Rousseau, Alexandra; Bouaud, Jacques

    2012-01-01

    Some studies suggest that the implementation of health information technology (HIT) introduces unpredicted and unintended consequences including e-iatrogenesis. OncoDoc2 is a guideline-based clinical decision support system (CDSS) applied to the management of breast cancer. The system is used by answering closed-ended questions in order to document patient data while navigating through the knowledge base until the best patient-specific recommended treatments are obtained. OncoDoc2 has been used by three hospitals in real clinical settings and for genuine patients. We analysed 394 navigations, recorded over a 10-month period, which correspond to 6,025 data entries. The data entry error rate is 4.2%, spread over 52% of incorrect navigations (N-). However, the overall compliance rate of clinical decisions with guidelines significantly increased from 72.8% (without CDSS) to 87.3% (with CDSS). Although this increase is lowered because of N- navigations (compliance rates are respectively 95% and 80% for N+ and N- navigations), the benefits of HIT outweighed its disadvantages in our study.

  18. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    Science.gov (United States)

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…
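
    As a concrete, hedged illustration of the technology described, a QR code pointing at a task card can be generated in a few lines with the third-party Python package qrcode; the package choice and the URL below are assumptions, not part of the article.

      # Illustrative only: generate a QR code image that links to an exercise page.
      # Requires the third-party "qrcode" package (pip install qrcode[pil]);
      # the URL is a made-up placeholder.
      import qrcode

      img = qrcode.make("https://example.org/pe/station-3-instructions")
      img.save("station3_qr.png")  # print the image and post it at the exercise station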

  19. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  20. ECAJS 2009 VOL 14 No 1 FINAL EDIT doc

    African Journals Online (AJOL)

    user

    Information collected included age, gender, cause and severity of injury, the time interval between injury Glasgow Coma ... Only 12% of patients with traumatic brain injury (TBI) had CT scan.. A total of .... brain injury and spinal trauma accounted for 49.3% and 23.7% of all neurosurgical admissions respectively and 67.67% ...

  1. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS) at the Institut for Skole og Læring, Professionshøjskolen Metropol, and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy at Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed the Coding Class project and carried out its evaluation and documentation from November 2016 to May 2017...

  2. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah and P Vijay Kumar. Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: http://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  3. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes - The Sipser–Spielman Construction. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. Resonance – Journal of Science Education, Volume 10, Issue 1, General Article.

  4. Influenza Prevention: Information for Travelers

    Science.gov (United States)


  5. Investigating DOC export dynamics using high-frequency instream concentration measurements

    Science.gov (United States)

    Oosterwoud, Marieke; Keller, Toralf; Musolff, Andreas; Frei, Sven; Park, Ji-Hyung; Fleckenstein, Jan H.

    2014-05-01

    Being able to monitor DOC concentrations using in-situ high frequency measurements makes it possible to better understand concentration-discharge behavior under different hydrological conditions. We developed a UV-Vis probe setup adapted for use under field conditions. The quasi-mobile probe setup allows a more flexible probe deployment. New or existing monitoring sites can easily be equipped for quasi-continuous monitoring, or measurements can be performed at changing locations, without the need for additional infrastructure. We were able to gather high frequency data on DOC dynamics for one year in two streams in the Harz mountains in Germany. Obtaining accurate DOC concentrations from the UV-Vis probes proved to require frequent maintenance and probe calibration. The advantage of the setup over standard monitoring protocols becomes evident when comparing net exports over a year. In addition to improved mass balance calculations, the high-frequency measurements can reveal intricate hysteretic relationships between discharge and concentrations that can provide valuable insights into the hydrologic dynamics and mechanisms that govern the delivery of DOC to the receiving waters. Measurements with similar probes from two additional catchments in Southern Germany and South Korea will be used to illustrate different discharge-concentration relationships and what can be learned from them about the hydrologic mechanisms that control the dynamics of DOC export.
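
    One common way to quantify the hysteresis mentioned above is to compare concentrations on the rising and falling limbs of an event at matched discharge levels. The sketch below is a generic illustration with invented numbers, not the authors' analysis.

      # Toy hysteresis sketch: compare DOC on the rising vs. falling limb of a
      # storm event at matched discharge levels. Data are invented.
      import numpy as np

      q = np.array([1.0, 2.0, 4.0, 6.0, 5.0, 3.0, 1.5])      # discharge (m3/s)
      doc = np.array([4.0, 5.5, 7.0, 7.5, 6.0, 5.0, 4.2])    # DOC (mg/L)
      peak = int(np.argmax(q))

      # Interpolate both limbs onto a common discharge grid
      grid = np.linspace(q.min(), q.max(), 20)
      rise = np.interp(grid, q[:peak + 1], doc[:peak + 1])
      fall = np.interp(grid, q[peak:][::-1], doc[peak:][::-1])

      hysteresis_index = float(np.mean(rise - fall))
      print(f"mean rising-minus-falling DOC: {hysteresis_index:.2f} mg/L "
            "(positive = clockwise hysteresis)")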

  6. Spatial and Seasonal Variation of Dissolved Organic Carbon (DOC) Concentrations in Irish Streams: Importance of Soil and Topography Characteristics

    Science.gov (United States)

    Liu, Wen; Xu, Xianli; McGoff, Nicola M.; Eaton, James M.; Leahy, Paul; Foley, Nelius; Kiely, Gerard

    2014-05-01

    Dissolved organic carbon (DOC) concentrations have increased in many sites in Europe and North America in recent decades. High DOC concentrations can damage the structure and functions of aquatic ecosystems by influencing water chemistry. This study investigated the spatial and seasonal variation of DOC concentrations in Irish streams across 55 sites on seven occasions over 1 year (2006/2007). The DOC concentrations ranged from 0.9 to 25.9 mg/L with a mean value of 6.8 mg/L and a median value of 5.7 mg/L and varied significantly over the course of the year. DOC concentrations in late winter (February: 5.2 ± 3.0 mg/L across 55 sites) and early spring (April: 4.5 ± 3.5 mg/L) were significantly lower than in autumn (October: mean 8.3 ± 5.6 mg/L) and early winter (December: 8.3 ± 5.1 mg/L). DOC production sources (e.g., litterfall) or the accumulation of DOC over dry periods might be the driving factors behind the seasonal change in Irish stream DOC concentrations. Analysis of data using stepwise multiple linear regression techniques identified the topographic index (TI, an indication of saturation-excess runoff potential) and soil conditions (organic carbon content and soil drainage characteristics) as key factors controlling DOC spatial variation in different seasons. The TI and soil carbon content (e.g., soil organic carbon; peat occurrence) are positively related to DOC concentrations, while well-drained soils are negatively related to DOC concentrations. Knowledge of the spatial and seasonal variation of DOC concentrations in streams and their drivers is essential for optimal riverine water resources management.
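
    The kind of multiple linear regression described (DOC explained by topographic index, soil organic carbon and drainage) can be sketched as follows; all predictor values are invented and the stepwise selection step is not reproduced.

      # Illustrative multiple linear regression of stream DOC on catchment
      # predictors (topographic index, soil organic C, well-drained fraction).
      # Numbers are fabricated; the original study used stepwise selection.
      import numpy as np

      # columns: topographic index, soil organic C (%), well-drained fraction (0-1)
      X = np.array([
          [6.5,  4.0, 0.8],
          [8.0, 12.0, 0.4],
          [9.2, 30.0, 0.1],
          [7.1,  8.0, 0.6],
          [8.8, 22.0, 0.2],
      ])
      y = np.array([3.2, 7.5, 14.1, 5.0, 11.3])   # DOC (mg/L)

      A = np.column_stack([np.ones(len(X)), X])   # add intercept column
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("intercept, TI, soil C, drainage:", np.round(coef, 3))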

  7. Automatisierte Artikelbestellverwaltung: Doctor-Doc – ein bibliothekarisches Verwaltungswerkzeug / Automation in Interlibrary Loan: Doctor-Doc – a tool for librarians

    Directory of Open Access Journals (Sweden)

    Fischer, Markus

    2010-05-01

    Full Text Available Interlibrary loan has always been an important service to supplement a library's own holdings. To organize and standardize the ordering of journal articles for 6 hospitals, we created an online tool for the Solothurner Spitäler AG. The resulting application is available to libraries free of charge at http://www.doctor-doc.com/. The application is maintained and will be further developed by an association founded specifically for this purpose. Doctor-Doc is not a supplier of articles, but rather a platform for organizing orders with existing suppliers such as Subito, the British Library or any other supplying libraries. Doctor-Doc is OpenURL compliant and is able to resolve identifiers such as PMIDs. In combination with an existing account from the German EZB, libraries can use the application as a link resolver. The application has become an essential tool for efficiently managing interlibrary loan for the Solothurner Spitäler AG. The tool is also used by many libraries in Germany and Switzerland.
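
    Since Doctor-Doc is OpenURL compliant and resolves PMIDs, a request of the general OpenURL (Z39.88-2004) form below can be constructed; the resolver base URL and the PMID are placeholders, and the exact parameters accepted by Doctor-Doc may differ.

      # Illustrative OpenURL (Z39.88-2004) link carrying a PubMed identifier.
      # The resolver base URL and PMID are placeholders, not Doctor-Doc specifics.
      from urllib.parse import urlencode

      def openurl_from_pmid(resolver_base: str, pmid: str) -> str:
          params = {
              "url_ver": "Z39.88-2004",
              "rft_id": f"info:pmid/{pmid}",
          }
          return f"{resolver_base}?{urlencode(params)}"

      print(openurl_from_pmid("https://resolver.example.org/openurl", "15361102"))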

  8. On the weight distribution of convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H

    2005-01-01

    Detailed information about the weight distribution of a convolutional code is given by the adjacency matrix of the state diagram associated with a minimal realization of the code. We will show that this matrix is an invariant of the code. Moreover, it will be proven that codes with the same

  9. Polar Codes

    Science.gov (United States)

    2014-12-01

    added by the decoder is K/ρ+Td. By the last assumption, Td and Te are both ≤ K/ρ, so the total latency added is between 2K/ρ and 4K/ρ. For example...better resolution near the decision point. Reference [12] showed that in decoding a (1024, 512) polar code, using 6-bit LLRs resulted in performance

  10. Natural diet of coral-excavating sponges consists mainly of dissolved organic carbon (DOC.

    Directory of Open Access Journals (Sweden)

    Benjamin Mueller

    Full Text Available Coral-excavating sponges are the most important bioeroders on Caribbean reefs and increase in abundance throughout the region. This increase is commonly attributed to a concomitant increase in food availability due to eutrophication and pollution. We therefore investigated the uptake of organic matter by the two coral-excavating sponges Siphonodictyon sp. and Cliona delitrix and tested whether they are capable of consuming dissolved organic carbon (DOC) as part of their diet. A device for simultaneous sampling of water inhaled and exhaled by the sponges was used to directly measure the removal of DOC and bacteria in situ. During a single passage through their filtration system 14% and 13% respectively of the total organic carbon (TOC) in the inhaled water was removed by the sponges. 82% (Siphonodictyon sp.; mean ± SD; 13 ± 17 μmol L(-1)) and 76% (C. delitrix; 10 ± 12 μmol L(-1)) of the carbon removed was taken up in form of DOC, whereas the remainder was taken up in the form of particulate organic carbon (POC; bacteria and phytoplankton) despite high bacteria retention efficiency (72 ± 15% and 87 ± 10%). Siphonodictyon sp. and C. delitrix removed DOC at a rate of 461 ± 773 and 354 ± 562 μmol C h(-1) respectively. Bacteria removal was 1.8 ± 0.9 × 10(10) and 1.7 ± 0.6 × 10(10) cells h(-1), which equals a carbon uptake of 46.0 ± 21.2 and 42.5 ± 14.0 μmol C h(-1) respectively. Therefore, DOC represents 83 and 81% of the TOC taken up by Siphonodictyon sp. and C. delitrix per hour. These findings suggest that similar to various reef sponges coral-excavating sponges also mainly rely on DOC to meet their carbon demand. We hypothesize that excavating sponges may also benefit from an increasing production of more labile algal-derived DOC (as compared to coral-derived DOC) on reefs as a result of the ongoing coral-algal phase shift.

  11. Natural diet of coral-excavating sponges consists mainly of dissolved organic carbon (DOC).

    Science.gov (United States)

    Mueller, Benjamin; de Goeij, Jasper M; Vermeij, Mark J A; Mulders, Yannick; van der Ent, Esther; Ribes, Marta; van Duyl, Fleur C

    2014-01-01

    Coral-excavating sponges are the most important bioeroders on Caribbean reefs and increase in abundance throughout the region. This increase is commonly attributed to a concomitant increase in food availability due to eutrophication and pollution. We therefore investigated the uptake of organic matter by the two coral-excavating sponges Siphonodictyon sp. and Cliona delitrix and tested whether they are capable of consuming dissolved organic carbon (DOC) as part of their diet. A device for simultaneous sampling of water inhaled and exhaled by the sponges was used to directly measure the removal of DOC and bacteria in situ. During a single passage through their filtration system 14% and 13% respectively of the total organic carbon (TOC) in the inhaled water was removed by the sponges. 82% (Siphonodictyon sp.; mean ± SD; 13 ± 17 μmol L(-1)) and 76% (C. delitrix; 10 ± 12 μmol L(-1)) of the carbon removed was taken up in form of DOC, whereas the remainder was taken up in the form of particulate organic carbon (POC; bacteria and phytoplankton) despite high bacteria retention efficiency (72 ± 15% and 87 ± 10%). Siphonodictyon sp. and C. delitrix removed DOC at a rate of 461 ± 773 and 354 ± 562 μmol C h(-1) respectively. Bacteria removal was 1.8 ± 0.9 × 10(10) and 1.7 ± 0.6 × 10(10) cells h(-1), which equals a carbon uptake of 46.0 ± 21.2 and 42.5 ± 14.0 μmol C h(-1) respectively. Therefore, DOC represents 83 and 81% of the TOC taken up by Siphonodictyon sp. and C. delitrix per hour. These findings suggest that similar to various reef sponges coral-excavating sponges also mainly rely on DOC to meet their carbon demand. We hypothesize that excavating sponges may also benefit from an increasing production of more labile algal-derived DOC (as compared to coral-derived DOC) on reefs as a result of the ongoing coral-algal phase shift.

  12. Use of geographic information systems technology to track critical health code violations in retail facilities available to populations of different socioeconomic status and demographics.

    Science.gov (United States)

    Darcey, Valerie L; Quinlan, Jennifer J

    2011-09-01

    Research shows that community socioeconomic status (SES) predicts, based on food service types available, whether a population has access to healthy food. It is not known, however, if a relationship exists between SES and risk for foodborne illness (FBI) at the community level. Geographic information systems (GIS) give researchers the ability to pinpoint health indicators to specific geographic locations and detect resulting environmental gradients. It has been used extensively to characterize the food environment, with respect to access to healthy foods. This research investigated the utility of GIS in determining whether community SES and/or demographics relate to access to safe food, as measured by food service critical health code violations (CHV) as a proxy for risk for FBI. Health inspection records documenting CHV for 10,859 food service facilities collected between 2005 and 2008 in Philadelphia, PA, were accessed. Using an overlay analysis through GIS, CHV were plotted over census tracts of the corresponding area. Census tracts (n = 368) were categorized into quintiles, based on poverty level. Overall, food service facilities in higher poverty areas had a greater number of facilities (with at least one CHV) and had more frequent inspections than facilities in lower poverty areas. The facilities in lower poverty areas, however, had a higher average number of CHV per inspection. Analysis of CHV rates in census tracts with high concentrations of minority populations found Hispanic facilities had more CHV than other demographics, and Hispanic and African American facilities had fewer days between inspections. This research demonstrates the potential for utilization of GIS mapping for tracking risks for FBI. Conversely, it sheds light on the subjective nature of health inspections, and indicates that underlying factors might be affecting inspection frequency and identification of CHV, such that CHV might not be a true proxy for risk for FBI.
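
    The overlay analysis described (assigning inspected facilities to census tracts and grouping tracts into poverty quintiles) can be sketched with the third-party geopandas and pandas libraries; the file names and column names below are assumptions, not the study's data.

      # Sketch of the overlay analysis: assign each inspected facility to a census
      # tract, then summarise critical violations by tract poverty quintile.
      # File paths and column names are assumptions; requires a recent geopandas.
      import geopandas as gpd
      import pandas as pd

      tracts = gpd.read_file("census_tracts.shp")        # assumed 'poverty_rate' column
      facilities = gpd.read_file("facilities.shp")       # points, assumed 'chv_count' column

      tracts["poverty_quintile"] = pd.qcut(tracts["poverty_rate"], 5,
                                           labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

      joined = gpd.sjoin(facilities.to_crs(tracts.crs), tracts,
                         how="inner", predicate="within")

      summary = joined.groupby("poverty_quintile", observed=True)["chv_count"].agg(["mean", "count"])
      print(summary)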

  13. Investigations of freezing and cold storage for the analysis of peatland dissolved organic carbon (DOC) and absorbance properties.

    Science.gov (United States)

    Peacock, Mike; Freeman, Chris; Gauci, Vincent; Lebron, Inma; Evans, Chris D

    2015-07-01

    Although measured rates of biological degradation of DOC are typically low under dark conditions, it is assumed that water samples must be analysed soon after collection to provide an accurate measure of DOC concentration and UV-visible absorbance. To examine the impact of storage on DOC quality and quantity, we took water samples from an ombrotrophic peatland, and stored them in the dark at 4 °C for 138-1082 days. A median of 29% of DOC was lost during storage, but losses of absorbance at 254 nm were less. DOC loss followed a first-order exponential decay function, and was dependent on storage time. DOC half-life was calculated as 1253 days. Specific absorbance at 254 nm suggested that samples containing more aromatic DOC were more resistant to degradation, although time functioned as the primary control. Samples from two fens showed that loss of absorbance was greater at 400 nm rather than 254 nm, after 192 days storage, suggesting that non-aromatic DOC is preferentially degraded. These results suggest that samples can be stored for several months before losses of DOC become detectable, and that it is possible to back-calculate initial DOC concentrations in long-term stored samples based on known decay rates. Freeze/thaw experiments using samples from a range of peatlands suggested that DOC concentration was mostly unaffected by the process, but DOC increased 37% in one sample. Freezing had unpredictable and sometimes strong effects on absorbance, SUVA and E ratios, therefore freezing is not recommended as a method of preservation for these analyses.
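
    The back-calculation suggested by the authors follows directly from first-order decay with the reported half-life of roughly 1253 days; the sample values in the sketch below are invented.

      # Back-calculate an initial DOC concentration from a stored-sample
      # measurement, assuming first-order decay with a 1253-day half-life
      # (half-life from the abstract; the measured value and storage time are invented).
      import math

      half_life_days = 1253.0
      k = math.log(2) / half_life_days          # first-order rate constant (1/day)

      measured_doc = 21.0                        # mg/L measured after storage
      storage_days = 400
      initial_doc = measured_doc * math.exp(k * storage_days)
      print(f"estimated DOC at collection: {initial_doc:.1f} mg/L "
            f"({100 * (1 - math.exp(-k * storage_days)):.0f}% lost in storage)")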

  14. State Title I Migrant Participation Information, 1999-2000. Doc # 2003-9

    Science.gov (United States)

    Daft, Julie

    2004-01-01

    States use Migrant Education Program (MEP) funds to ensure that migrant children are provided with appropriate services that address the special needs caused by the effects of continual educational disruption. MEP services are usually delivered by schools, districts and/or other public or private organizations and can be instructional (reading,…

  15. Testing seasonal and long-term controls of streamwater DOC using empirical and process-based models.

    Science.gov (United States)

    Futter, Martyn N; de Wit, Heleen A

    2008-12-15

    Concentrations of dissolved organic carbon (DOC) in surface waters are increasing across Europe and parts of North America. Several mechanisms have been proposed to explain these increases including reductions in acid deposition, change in frequency of winter storms and changes in temperature and precipitation patterns. We used two modelling approaches to identify the mechanisms responsible for changing surface water DOC concentrations. Empirical regression analysis and INCA-C, a process-based model of stream-water DOC, were used to simulate long-term (1986-2003) patterns in stream water DOC concentrations in a small boreal stream. Both modelling approaches successfully simulated seasonal and inter-annual patterns in DOC concentration. In both models, seasonal patterns of DOC concentration were controlled by hydrology and inter-annual patterns were explained by climatic variation. There was a non-linear relationship between warmer summer temperatures and INCA-C predicted DOC. Only the empirical model was able to satisfactorily simulate the observed long-term increase in DOC. The observed long-term trends in DOC are likely to be driven by in-soil processes controlled by SO4(2-) and Cl(-) deposition, and to a lesser extent by temperature-controlled processes. Given the projected changes in climate and deposition, future modelling and experimental research should focus on the possible effects of soil temperature and moisture on organic carbon production, sorption and desorption rates, and chemical controls on organic matter solubility.

  16. Convolutional-Code-Specific CRC Code Design

    OpenAIRE

    Lou, Chung-Yu; Daneshrad, Babak; Wesel, Richard D.

    2015-01-01

    Cyclic redundancy check (CRC) codes check if a codeword is correctly received. This paper presents an algorithm to design CRC codes that are optimized for the code-specific error behavior of a specified feedforward convolutional code. The algorithm utilizes two distinct approaches to computing undetected error probability of a CRC code used with a specific convolutional code. The first approach enumerates the error patterns of the convolutional code and tests if each of them is detectable. Th...

  17. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  18. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  19. Concatenated codes with convolutional inner codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Zyablov, Viktor

    1988-01-01

    The minimum distance of concatenated codes with Reed-Solomon outer codes and convolutional inner codes is studied. For suitable combinations of parameters the minimum distance can be lower-bounded by the product of the minimum distances of the inner and outer codes. For a randomized ensemble of concatenated codes a lower bound of the Gilbert-Varshamov type is proved.

  20. The Icelandic version of the dimensional obsessive compulsive scale (DOCS) and its relationship with obsessive beliefs

    NARCIS (Netherlands)

    Ólafsson, R.P.; Arngrímsson, J.B.; Árnason, P.; Kolbeinsson, Þ; Emmelkamp, P.M.G.; Kristjánsson, A.; Ólason, D.Þ.

    2013-01-01

    The Dimensional Obsessive Compulsive Scale (DOCS) is a self-report instrument to assess severity of OC symptoms along four thematically distinct symptom dimensions. This may carry benefits; both in assessment and for studying the link between OC related beliefs and symptoms. The validity and factor

  1. Using Project-Based Learning and Google Docs to Support Diversity

    Science.gov (United States)

    Leh, Amy

    2014-01-01

    A graduate course, ETEC543 ("Technology and Learning I"), was revised to better serve increasing new student population, international students, in an academic program. Project-based learning, Google Docs, and instructional strategies fostering diversity and critical thinking were incorporated into the course redesign. Observations,…

  2. Four-year-BS-position-paper.pdf | misc docs | academy | Indian ...

    Indian Academy of Sciences (India)


  3. Exploring the Roles of Google.doc and Peer e-Tutors in English Writing

    Science.gov (United States)

    Lin, Wen-Chuan; Yang, Shu Ching

    2013-01-01

    This study explored college students' experiences with and perceptions of integrating both the Google.doc and peer e-tutors into an English writing course. This socio-cultural study employed online collaborative learning mechanisms with an attempt to develop students' English writing skills and motivation over the course of one year. Participants…

  4. TOC/DOC: "It Has Changed the Way I Do Science".

    Science.gov (United States)

    Douglas, Kimberly; Roth, Dana L.

    1997-01-01

    Describes a user-based automated service developed at the California Institute of Technology that combines access to journal article databases with an in-house document delivery system. TOC/DOC (Tables of Contents/Document Delivery) has undergone a conceptual change from a catalog of locally-held journal articles to a broader, more retrospective…

  5. Production and decomposition of new DOC by marine plankton communities: carbohydrates, refractory components and nutrient limitation

    DEFF Research Database (Denmark)

    Kragh, T.; Søndergaard, Morten

    2009-01-01

    The accumulation and biodegradation of dissolved organic carbon (DOC) and dissolved and particulate combined neutral sugars (DCNS, PCNS) were followed during a period of 22 days in experimental marine phytoplankton incubations. Five different growth regimes were established in 11 m(3) coastal...

  6. CTM4DOC : Electronic structure analysis from X-ray spectroscopy

    NARCIS (Netherlands)

    Delgado-Jaime, Mario Ulises; Zhang, Kaili; Vura-Weis, Josh; De Groot, Frank M F

    2016-01-01

    Two electronic structure descriptions, one based on orbitals and the other based on term symbols, have been implemented in a new Matlab-based program, CTM4DOC. The program includes a graphical user interface that allows the user to explore the dependence of details of electronic structure in

  7. The Dimensional Obsessive-Compulsive Scale: Development and Validation of a Short Form (DOCS-SF)

    Science.gov (United States)

    Eilertsen, Thomas; Hansen, Bjarne; Kvale, Gerd; Abramowitz, Jonathan S.; Holm, Silje E. H.; Solem, Stian

    2017-01-01

    Accurately and reliably measuring the presence and severity of Obsessive-Compulsive Disorder (OCD) symptoms is essential for both routine clinical work and research. The current study investigated psychometric properties of the dimensional obsessive-compulsive scale-short form (DOCS-SF). DOCS-SF was developed and validated in Norwegian. DOCS-SF contains a checklist with four symptom categories and five severity items scored on a zero to eight scale yielding a total score of 0–40. Data were collected from adults with a current diagnosis of OCD (n = 204) and a community comparison group (n = 211). The results provided evidence of internal consistency and convergent validity, although evidence for discriminant validity was mixed. Evidence was also found for diagnostic sensitivity and specificity, and treatment sensitivity. The analyses suggested a cut-off score of 16. In summary, the data obtained proved similar to studies published on the original dimensional obsessive-compulsive scale. There is strong evidence for the reliability and validity of the DOCS-SF for assessing OCD symptoms in individuals with this condition and in non-clinical individuals. PMID:28928693
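
    Scoring follows directly from the description above (five severity items, each 0-8, summed to a 0-40 total, with a suggested cut-off of 16); a trivial sketch is given below, with the item wording and any clinical interpretation outside its scope.

      # Minimal DOCS-SF style scoring: five severity items, each 0-8, summed to a
      # 0-40 total; the abstract suggests a cut-off score of 16.
      def docs_sf_total(item_scores):
          if len(item_scores) != 5 or any(not 0 <= s <= 8 for s in item_scores):
              raise ValueError("expect five item scores in the range 0-8")
          return sum(item_scores)

      scores = [4, 6, 3, 5, 2]          # example ratings (invented)
      total = docs_sf_total(scores)
      print(total, "above cut-off" if total >= 16 else "below cut-off")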

  8. The Dimensional Obsessive-Compulsive Scale: Development and Validation of a Short Form (DOCS-SF

    Directory of Open Access Journals (Sweden)

    Thomas Eilertsen

    2017-09-01

    Full Text Available Accurately and reliably measuring the presence and severity of Obsessive-Compulsive Disorder (OCD) symptoms is essential for both routine clinical work and research. The current study investigated psychometric properties of the dimensional obsessive-compulsive scale-short form (DOCS-SF). DOCS-SF was developed and validated in Norwegian. DOCS-SF contains a checklist with four symptom categories and five severity items scored on a zero to eight scale yielding a total score of 0–40. Data were collected from adults with a current diagnosis of OCD (n = 204) and a community comparison group (n = 211). The results provided evidence of internal consistency and convergent validity, although evidence for discriminant validity was mixed. Evidence was also found for diagnostic sensitivity and specificity, and treatment sensitivity. The analyses suggested a cut-off score of 16. In summary, the data obtained proved similar to studies published on the original dimensional obsessive-compulsive scale. There is strong evidence for the reliability and validity of the DOCS-SF for assessing OCD symptoms in individuals with this condition and in non-clinical individuals.

  9. The centrality of meta-programming in the ES-DOC eco-system

    Science.gov (United States)

    Greenslade, Mark

    2017-04-01

    The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is improvements in the documentation experience and broadening the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward engineers artefacts as diverse as: class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets …etc.
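
    To make the meta-programming idea concrete, the sketch below forward-engineers a Python class from a toy dictionary meta-model; it is a deliberately simplified illustration, not the actual ES-DOC tool chain or its CIM meta-model, and all names in it are invented.

      # Toy forward-engineering step: turn a tiny dictionary meta-model into Python
      # source for a class. ES-DOC's real meta-models and generators are far richer;
      # this only illustrates decoupling the model description from the generated code.
      META_MODEL = {
          "name": "Simulation",
          "properties": [("experiment_id", "str"), ("model_name", "str"),
                         ("ensemble_member", "str")],
      }

      def generate_class(meta):
          lines = [f"class {meta['name']}:"]
          args = ", ".join(f"{n}: {t}" for n, t in meta["properties"])
          lines.append(f"    def __init__(self, {args}):")
          for name, _ in meta["properties"]:
              lines.append(f"        self.{name} = {name}")
          return "\n".join(lines)

      source = generate_class(META_MODEL)
      print(source)                                  # the generated artefact
      namespace = {}
      exec(source, namespace)                        # make the generated class usable
      sim = namespace["Simulation"]("historical", "ExampleESM", "r1i1p1f1")
      print(vars(sim))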

  10. Patterns in DOC Concentration and Composition in Tundra Watersheds in the Kolyma River Basin

    Science.gov (United States)

    Behnke, M. I.; Schade, J. D.; Fiske, G. J.; Whittinghill, K. A.; Zimov, N.

    2014-12-01

    Much of the world's soil carbon is frozen in permafrost in the Arctic. As the climate warms and permafrost thaws, this carbon will again be actively cycled. Whether it is exported to the ocean or released as greenhouse gases to the atmosphere depends on the form of carbon compounds and conditions encountered during transport, and will determine the strength of permafrost thaw as a feedback on climate change. To better understand the fate of this carbon, we determined how and where in the landscape dissolved organic carbon (DOC) breaks down as water transports it from tundra to ocean. We compared DOC concentration and composition along flowpaths within watersheds and at the mouths of watersheds differing in drainage area. We incubated filtered water samples in light and dark, including filter-sterilized samples, to assess the interactions between light and microbial processing as mechanisms of DOC loss. Composition was assessed using optical measurements associated with the structure of organic compounds. DOC concentration declined along flowpaths within watersheds, with most loss occurring in aquatic environments high in the landscape. We also found a negative correlation between watershed size and DOC concentration. These results suggest that much of the processing of organic carbon occurs in small streams. In addition, the relationship with drainage area suggests that residence time in streams has a large impact on transformation of terrestrial carbon during transport. We found no substantial differences in optical characteristics of DOC, indicating that breakdown processes were not selective, and that light caused much of the breakdown. This conclusion is supported by the incubation experiment, which showed greater breakdown by light, and evidence that light stimulated higher rates of microbial processing. These results highlight the importance of inland aquatic ecosystems as processors of organic matter, and suggest that organic carbon from permafrost thaw is

  11. PACER -- A fast running computer code for the calculation of short-term containment/confinement loads following coolant boundary failure. Volume 2: User information

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J. [Argonne National Lab., IL (United States). Reactor Engineering Div.

    1997-06-01

    A fast running and simple computer code has been developed to calculate pressure loadings inside light water reactor containments/confinements under loss-of-coolant accident conditions. PACER was originally developed to calculate containment/confinement pressure and temperature time histories for loss-of-coolant accidents in Soviet-designed VVER reactors and is relevant to the activities of the US International Nuclear Safety Center. The code employs a multicompartment representation of the containment volume and is focused upon application to early time containment phenomena during and immediately following blowdown. PACER has been developed for FORTRAN 77 and earlier versions of FORTRAN. The code has been successfully compiled and executed on SUN SPARC and Hewlett-Packard HP-735 workstations provided that appropriate compiler options are specified. The code incorporates both capabilities built around a hardwired default generic VVER-440 Model V230 design as well as fairly general user-defined input. However, array dimensions are hardwired and must be changed by modifying the source code if the number of compartments/cells differs from the default number of nine. Detailed input instructions are provided as well as a description of outputs. Input files and selected output are presented for two sample problems run on both HP-735 and SUN SPARC workstations.

  12. Residuejams and their effect on Infiltration, Runoff and Dissolved Organic Carbon (DOC) in Furrow Irrigation Systems

    Science.gov (United States)

    Mailapalli, D. R.; Wallender, W. W.; Horwath, W.; Ma, S.; Lazicki, P.

    2007-12-01

    Crop residue, which consists of small debris of different sizes, is an important resource in agricultural ecosystems. It plays a vital role in conservation tillage as a best management practice (BMP) for reducing runoff, sediment, nutrient and pesticide transport from irrigated fields. In furrow irrigation, the predominant irrigation method in the world, as irrigation or winter runoff water moves along a furrow, it lifts the unanchored residue and transports it across the field. The complex interaction of multiple residue pieces (debris) with one another, the soil matrix, and the fluid causes jams to form along the furrow. The residuejams can be thought of as logjams in fluvial rivers or channels, which help increase channel roughness to reduce flow velocities and shear stress along eroding banks. The logjams also create a hydraulic shadow, a low-velocity zone for some distance upstream that allows sediment to settle out and stabilize. Similarly, the residuejams help in the formation of catchments, which promote increased infiltration and settlement of sediments along the furrow. The infiltration and the residue interaction with the soil water influence the runoff, sediment, nutrient and dissolved organic carbon (DOC) export. The reduction of DOC export is critical to enhancing drinking water resources by reducing reactive sources of DOC that form carcinogenic by-products in the disinfection process. Hence, investigation of the geomorphology of the residuejams is essential to understand their impact on infiltration, runoff and DOC concentration. This study focuses on the formation of residuejams and their effect on the infiltration, runoff and DOC concentration from 122 m long furrow plots with cover crop (CC), no-till (NT) and standard tillage (ST). These treatments (CC, NT and ST) were replicated three times using a randomized complete block design and the plots initially had 10, 32 and 42% of residue cover (sunflower residue on ST and NT; sunflower and wheat residue on CC plot

  13. Error coding simulations

    Science.gov (United States)

    Noble, Viveca K.

    1993-11-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
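
    As a hedged illustration of the error-detection side, a bitwise CRC-16 with the CCITT generator polynomial 0x1021 can be computed as below; register conventions such as the initial value vary between standards, so the exact CCSDS configuration should be checked before reuse.

      # Bitwise CRC-16 with generator polynomial x^16 + x^12 + x^5 + 1 (0x1021).
      # Initial value 0xFFFF is one common convention; verify against the CCSDS
      # recommendation before using this in a real link.
      def crc16(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
          crc = init
          for byte in data:
              crc ^= byte << 8
              for _ in range(8):
                  if crc & 0x8000:
                      crc = ((crc << 1) ^ poly) & 0xFFFF
                  else:
                      crc = (crc << 1) & 0xFFFF
          return crc

      frame = b"telemetry frame payload"
      print(hex(crc16(frame)))   # sender appends these 16 bits; receiver recomputes to detect errors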

  14. Direct DOC and nitrate determination in water using dual pathlength and second derivative UV spectrophotometry.

    Science.gov (United States)

    Causse, Jean; Thomas, Olivier; Jung, Aude-Valérie; Thomas, Marie-Florence

    2017-01-01

    UV spectrophotometry is widely used for water and wastewater quality monitoring. The measurement or estimation of specific and aggregate parameters such as nitrate and dissolved organic carbon (DOC) is possible by exploiting UV spectra, with calibrations based on two or more wavelengths. However, while nitrate determination from UV absorbance is well established, major optical interferences linked to the presence of suspended solids, colloids or dissolved organic matter limit the relevance of UV measurement for DOC assessment. A new method based on UV spectrophotometric measurement of raw samples (without filtration), coupling a dual pathlength for spectra acquisition with second-derivative exploitation of the signal, is proposed in this work. The determination of nitrate concentration is carried out from the second derivative of the absorbance at 226 nm, corresponding to the inflection point of the nitrate signal decrease. A short optical pathlength can be used considering the strong absorption of the nitrate ion around 210 nm. For DOC concentration determination the second derivative of the absorbance at 295 nm is proposed after nitrate correction. As organic matter absorbs only slightly in the 270-330 nm window, a long optical pathlength must be selected in order to increase the sensitivity. The method was tested on several hundred samples from small rivers of two agricultural watersheds located in Brittany, France, taken during dry and wet periods. The comparison between the proposed method and the standardised procedures for nitrate and DOC measurement showed a good fit for both parameters over ranges of 2-100 mg/L NO3 and 1-30 mg/L DOC. Copyright © 2016 Elsevier Ltd. All rights reserved.
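
    The second-derivative step can be sketched numerically as below: differentiate the absorbance spectrum twice with respect to wavelength and read the values at 226 nm (nitrate) and 295 nm (DOC after nitrate correction). The synthetic spectrum is invented; in practice each second-derivative value would be converted to mg/L through a linear calibration against standards.

      # Second-derivative UV sketch: differentiate an absorbance spectrum twice
      # with respect to wavelength and read the values at 226 nm and 295 nm.
      # The spectrum is synthetic; concentrations require a separate calibration.
      import numpy as np

      wavelengths = np.arange(200.0, 351.0, 1.0)                    # nm
      absorbance = (np.exp(-((wavelengths - 210.0) / 12.0) ** 2)    # nitrate-like peak
                    + 0.3 * np.exp(-(wavelengths - 280.0) / 60.0))  # organic background

      d2 = np.gradient(np.gradient(absorbance, wavelengths), wavelengths)

      d2_226 = d2[np.argmin(np.abs(wavelengths - 226.0))]   # nitrate signal
      d2_295 = d2[np.argmin(np.abs(wavelengths - 295.0))]   # DOC signal (after NO3 correction)
      print(f"second derivative at 226 nm: {d2_226:.3e}, at 295 nm: {d2_295:.3e}")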

  15. Extensive degradation of terrestrial POC and DOC over the Eastern Arctic Siberian Shelf

    Science.gov (United States)

    Alling, V.; Sanchez-García, L.; Pugach, S.; Porcelli, D.; Humborg, C.; Mörth, C.-M.; van Dongen, B.; Dudarev, O.; Semiletov, I.; Gustafsson, Ö.

    2012-04-01

    The Eastern part of the Siberian Arctic is predicted to experience the highest increase in temperature on Earth as climate changes, and now observations indicate that the region is warming even faster than predicted. It has been suggested that these changes will lead to increased export of terrestrial organic carbon (particulate OC and dissolved OC). However, the fate of terrestrial OC in the Arctic Ocean is debated, and data from the eastern part of the Siberian Shelves are limited. During the International Siberian Shelf Study 2008 (ISSS-08), a 50-day research expedition onboard the Russian vessel Yakob Smirnitskiy, 260 samples were measured for POC and DOC concentrations and optical parameters in the Laptev and the East Siberian Seas. The results demonstrate that extensive removal of terrestrial derived carbon occurs over these shelves. For DOC, this was most pronounced in areas where the residence time of the freshwater exceeded one year, while the removal of POC was rapid in the low salinity zones. However, the POC shows several sources and degradation patterns along the Eastern Siberian coastline, and the degradation rate was much higher than previous estimates. Our findings suggest that a large proportion of riverine DOC is removed in the surface waters of the Eastern Siberian Arctic Shelves and that increased river discharge of DOC might cause a stronger positive feedback to global warming than expected. They also suggest that even though the riverine concentration of POC is only 10-15% of the DOC concentration, up to half of the inorganic carbon produced from degradation of terrestrial OC is due to degradation of POC.

  16. 48 CFR 1352.239-72 - Security requirements for information technology resources.

    Science.gov (United States)

    2010-10-01

    ... information (data). Information technology services include, but are not limited to, the management, operation... comply with the requirements contained in the DOC Information Technology Management Handbook (see DOC..., including project management information (at a minimum the tasks, resources, and milestones) for the...

  17. Cold storage as a method for the long-term preservation of tropical dissolved organic carbon (DOC

    Directory of Open Access Journals (Sweden)

    S. Cook

    2016-11-01

    Full Text Available Fluvial fluxes of dissolved organic carbon (DOC) may represent an important loss for terrestrial carbon stores in the tropics. However, there is currently limited guidance on the preservation of tropical water samples for DOC analysis. Commonly employed preservation techniques such as freezing or acidification can limit degradation but may also alter sample properties, complicating DOC analysis. We examined the effects of cold storage at 4 °C on DOC concentration and quality in water samples collected from a tropical peat catchment. Samples were stored in the dark at 4 °C for periods of 6–12 weeks. Freeze/thaw experiments were also made. Mean DOC concentrations in samples stored for six weeks at 4 °C were 6.1 % greater than in samples stored at ambient room temperature (33 °C) over the same period. Changes in DOC concentrations, in two sample sets, during cold storage were 2.25 ± 2.9 mg L-1 (8 %) to 2.69 ± 1.4 mg L-1 (11 %) over a 12-week period. Freeze/thaw resulted in alterations in the optical properties of samples, and this in turn altered the calculated DOC concentrations by an average of 10.9 %. We conclude that cold storage at 4 °C is an acceptable preservation method for tropical DOC water samples, for moderate time periods, and is preferable to freezing or storage at ambient temperatures.

  18. panMetaDocs, eSciDoc, and DOIDB—An Infrastructure for the Curation and Publication of File-Based Datasets for GFZ Data Services

    Directory of Open Access Journals (Sweden)

    Damian Ulbricht

    2016-03-01

    Full Text Available The GFZ German Research Centre for Geosciences is the national laboratory for Geosciences in Germany. As part of the Helmholtz Association, providing and maintaining large-scale scientific infrastructures are an essential part of GFZ activities. This includes the generation of significant volumes and numbers of research data, which subsequently become source materials for data publications. The development and maintenance of data systems is a key component of GFZ Data Services to support state-of-the-art research. A challenge lies not only in the diversity of scientific subjects and communities, but also in different types and manifestations of how data are managed by research groups and individual scientists. The data repository of GFZ Data Services provides a flexible IT infrastructure for data storage and publication, including minting of digital object identifiers (DOI). It was built as a modular system of several independent software components linked together through Application Programming Interfaces (APIs) provided by the eSciDoc framework. The principal software applications are panMetaDocs for data management and DOIDB for logging and moderating data publication activities. Wherever possible, existing software solutions were integrated or adapted. A summary of our experience in operating this service is given. Data are described through comprehensive landing pages and supplementary documents, like journal articles or data reports, thus augmenting the scientific usability of the service.

  19. Network Coding Fundamentals and Applications

    CERN Document Server

    Medard, Muriel

    2011-01-01

    Network coding is a field of information and coding theory and is a method of attaining maximum information flow in a network. This book is an ideal introduction for the communications and network engineer, working in research and development, who needs an intuitive introduction to network coding and to the increased performance and reliability it offers in many applications.
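
    The canonical intuition behind network coding is the butterfly network, in which a bottleneck node forwards the XOR of two packets so that both sinks can recover both messages; a minimal sketch (topology and packets invented):

      # Butterfly-network intuition: the bottleneck node sends pkt_a XOR pkt_b;
      # each sink combines the coded packet with the uncoded packet it receives directly.
      pkt_a = b"HELLO"             # packet from source branch 1
      pkt_b = b"WORLD"             # packet from source branch 2 (same length)

      coded = bytes(x ^ y for x, y in zip(pkt_a, pkt_b))   # sent over the bottleneck link

      # sink 1 receives pkt_a directly plus the coded packet; XOR recovers pkt_b
      sink1_b = bytes(x ^ y for x, y in zip(pkt_a, coded))
      # sink 2 receives pkt_b directly plus the coded packet; XOR recovers pkt_a
      sink2_a = bytes(x ^ y for x, y in zip(pkt_b, coded))
      assert sink1_b == pkt_b and sink2_a == pkt_a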

  20. 1109.doc

    Indian Academy of Sciences (India)

    Yield 288 mg; 85%), M. P. 136-138 °C, [α]D -21.8 (c 1.0, CHCl3); 1H NMR (300 MHz, CDCl3): 7.38-7.56 (m, 2H), 6.63-6.98 (m, 2H), 4.49 (t, J H-H, H-P = 9.8 Hz, 1H), 4.01 (m, 4H), 3.59 (dd, J H-P = 21.8 Hz, J H-H = 7.7 Hz, 1H), 1.13 (t, J H-H = 6.8 ...

  1. 513.doc

    Indian Academy of Sciences (India)

    1H and 13C-NMR spectrum (CDCl3/DMSO-d6) was recorded with Gemini-200 and Bruker-Avance-300 instruments; chemical shifts δ in ppm relative to SiMe4 as an internal standard, couplings in Hz. HRMS (ESI) data were recorded on a QSTAR XL High resolution mass spectrometer; in m/z (rel. %). GC was recorded on ...

  2. Doc Immelman

    African Journals Online (AJOL)

    Owner

    In "…rowers" Immelman subtly comments on a personal experience with the garage by retelling a fable-like origin story of garages. It is, however, not only Immelman's personal grievances that find expression in his columns. In "Klagtes" ("Complaints"), readers' petty grievances, which poured in after a ...

  3. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio and/or digital signal processing. It provides a clear connection between the whys, hows and whats, thus enabling a clear view of the necessity, purpose and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: what do we want to achieve, and especially why is this goal important? Resource/Information: what information is available and how can it be useful? Resource/Platform: what kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory and acoustic properties and the transmission capacity of devices used. The book goes on to address Solutions: which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  4. Dynamics of dissolved organic carbon (DOC) through stormwater basins designed for groundwater recharge in urban area: Assessment of retention efficiency.

    Science.gov (United States)

    Mermillod-Blondin, Florian; Simon, Laurent; Maazouzi, Chafik; Foulquier, Arnaud; Delolme, Cécile; Marmonier, Pierre

    2015-09-15

    Managed aquifer recharge (MAR) has been developed in many countries to limit the risk of urban flooding and compensate for reduced groundwater recharge in urban areas. The environmental performance of MAR systems like infiltration basins depends on the efficiency of the soil and vadose zone in retaining stormwater-derived contaminants. However, this performance needs to be carefully evaluated for stormwater-derived dissolved organic matter (DOM), which can affect groundwater quality. Therefore, this study examined the performance of MAR systems in processing DOM during its transfer from infiltration basins to an urban aquifer. DOM characteristics (fluorescent spectroscopic properties, biodegradable and refractory fractions of dissolved organic carbon -DOC-, consumption by micro-organisms during incubation in slow filtration sediment columns) were measured in stormwater during its transfer through three infiltration basins during a stormwater event. DOC concentrations sharply decreased from the surface to the aquifer for the three MAR sites. This pattern was largely due to the retention of biodegradable DOC, which was more than 75% for the three MAR sites, whereas the retention of refractory DOC was more variable and generally lower (from 18% to 61% depending on MAR site). Slow filtration column experiments also showed that DOC retention during stormwater infiltration through soil and vadose zone was mainly due to aerobic microbial consumption of the biodegradable fraction of DOC. In parallel, measurements of DOM characteristics from groundwaters influenced or not by MAR demonstrated that stormwater infiltration increased DOC quantity without affecting its quality (% of biodegradable DOC and relative aromatic carbon content, estimated by SUVA254). The present study demonstrated that processes occurring in the soil and vadose zone of MAR sites were efficient enough to limit DOC fluxes to the aquifer. Nevertheless, the enrichments of DOC concentrations measured in groundwater below

  5. Impact of catchment geophysical characteristics and climate on the regional variability of dissolved organic carbon (DOC) in surface water.

    Science.gov (United States)

    Cool, Geneviève; Lebel, Alexandre; Sadiq, Rehan; Rodriguez, Manuel J

    2014-08-15

    Dissolved organic carbon (DOC) is a recognized indicator of natural organic matter (NOM) in surface waters. The aim of this paper is twofold: to evaluate the impact of geophysical characteristics, climate and ecological zones on DOC concentrations in surface waters and to develop a statistical model to estimate the regional variability of these concentrations. In this study, multilevel statistical analysis was used to achieve three specific objectives: (1) evaluate the influence of climate and geophysical characteristics on DOC concentrations in surface waters; (2) compare the influence of geophysical characteristics and ecological zones on DOC concentrations in surface waters; and (3) develop a model to estimate the most accurate DOC concentrations in surface waters. The case study involved 115 catchments from surface waters in the Province of Quebec, Canada. Results showed that mean temperatures recorded 60 days prior to sampling, total precipitation 10 days prior to sampling and percentages of wetlands, coniferous forests and mixed forests have a significant positive influence on DOC concentrations in surface waters. The catchment mean slope had a significant negative influence on DOC concentrations in surface waters. Water type (lake or river) and deciduous forest variables were not significant. The ecological zones had a significant influence on DOC concentrations. However, geophysical characteristics (wetlands, forests and slope) estimated DOC concentrations more accurately. A model describing the variability of DOC concentrations was developed and can be used, in future research, for estimating DBPs in drinking water as well as for evaluating the impact of climate change on the quality of surface waters and drinking water. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Reactive modelling of 1,2-DCA and DOC near the shoreline

    Science.gov (United States)

    Colombani, N.; Pantano, A.; Mastrocicco, M.; Petitta, M.

    2014-11-01

    1,2-Dichloroethane (1,2-DCA) was found to be the most abundant compound among chlorinated hydrocarbons detected in a petrochemical plant in southern Italy. This site is located near the coastline, and it is set above an unconfined coastal aquifer, where seawater intrusion is present. The presence of organic and inorganic contaminants at this site has required the implementation of remediation strategies, consisting of pumping wells (hydraulic barrier) and a horizontal flow barrier. The purpose of this work was to assess the influence of saltwater intrusion on the degradation rate of 1,2-DCA. This was done on a three-dimensional domain representing a limited portion of a well-characterized field site, accounting for density-dependent flow and reactive transport modelling of 1,2-DCA and Dissolved Organic Carbon (DOC). The modelling procedure was performed employing SEAWAT-4.0 and PHT3D to reproduce the complex three-dimensional flow and transport domain. In order to determine the fate of 1,2-DCA, detailed field investigations provided intensive depth profile information. Different, kinetically controlled degradation rates were simulated to explain the observed, selective degradation of pollutants in groundwater. Calibration of the model was accomplished by comparison with the two different sets of measurements obtained from the MLS devices and from pumping wells. With the calibrated model, it was possible to distinguish between dispersive non-reactive processes and bacterially mediated reactions. In the non-reactive model, 1,2-DCA sorption was simulated using a linear sorption coefficient determined from field data, and 1,2-DCA degradation was simulated using a first-order decay coefficient with literature data as an initial guess. Finally, in the reactive transport model, where a two-step, partial-equilibrium approach was implemented, the effects of neglecting the cation exchange capacity, omitting density-dependent flow, and refining the vertical
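
    A back-of-the-envelope sketch of how a linear sorption coefficient and a first-order decay rate combine into a retarded, decaying plume concentration is given below; all parameter values are invented, and this is not the SEAWAT/PHT3D model itself.

      # Retardation from linear sorption plus first-order decay of 1,2-DCA along a
      # flow path. All parameter values are invented for illustration; the study
      # itself used SEAWAT-4.0/PHT3D for density-dependent reactive transport.
      import math

      kd = 0.05          # linear sorption coefficient (L/kg), assumed
      rho_b = 1.7        # bulk density (kg/L), assumed
      theta = 0.35       # porosity, assumed
      retardation = 1 + rho_b * kd / theta       # R = 1 + rho_b * Kd / theta

      decay_rate = 0.005                         # 1/day, assumed first-order rate
      groundwater_velocity = 0.2                 # m/day, assumed
      distance = 50.0                            # m to the shoreline, assumed

      travel_time = distance / (groundwater_velocity / retardation)
      c0 = 120.0                                 # ug/L at the source, assumed
      c_at_shore = c0 * math.exp(-decay_rate * travel_time)
      print(f"R = {retardation:.2f}, travel time = {travel_time:.0f} d, "
            f"C at shoreline = {c_at_shore:.1f} ug/L")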

  7. Building Codes

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Rasmussen, Birgit

    1996-01-01

    A state-of-the-art survey concerning acoustic conditions in dwellings has been carried out in 1994. A review of existing investigations related to subjective and/or objective evaluation of dwellings was done, and several countries were contacted to get up-to-date information about the legal acous...

  8. Building Codes

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Rasmussen, Birgit

    1996-01-01

    A state-of-the-art survey concerning acoustic conditions in dwellings has been carried out in 1994. A review of existing investigations related to subjective and/or objective evaluation of dwellings was done, and several countries were contacted to get up-to-date information about the legal...

  9. Web 2.0: Google docs no processo de ensino e aprendizagem

    OpenAIRE

    Miranda, Luísa; Morais, Carlos; Alves, Paulo; Dias, Paulo

    2008-01-01

    In this reflection, some characteristics of Web 2.0 are discussed and an experience of using Google Docs by a sample of higher-education students, in a formal teaching and learning context, is presented. Regarding the use of Google Docs, the teacher's opinions about the work developed by the students are highlighted, as well as the students' perceptions identified from their answers to a questionnaire built for that purpose. In the teacher's opinion, Google D...

  10. Microsoft Word - Doc2.doc

    African Journals Online (AJOL)

    MBY

    congresos/congre?006/cornape. Reviewed July 2007. 12. Mena Miranda WR.,. Pérez Cruz JA, Salvato Dueñas A y Levy O N. Morbilidad y mortalidad por sindrome hemolitico urémico Rev Cubana Pediatr W.70 n.1 Ciudad de la Habana ene.

  11. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  12. Fido, a novel AMPylation domain common to fic, doc, and AvrB.

    Directory of Open Access Journals (Sweden)

    Lisa N Kinch

    2009-06-01

    Full Text Available The Vibrio parahaemolyticus type III secreted effector VopS contains a fic domain that covalently modifies Rho GTPase threonine with AMP to inhibit downstream signaling events in host cells. The VopS fic domain includes a conserved sequence motif (HPFx[D/E]GN[G/K]R) that contributes to AMPylation. Fic domains are found in a variety of species, including bacteria, a few archaea, and metazoan eukaryotes. We show that the AMPylation activity extends to a eukaryotic fic domain in Drosophila melanogaster CG9523, and use sequence and structure based computational methods to identify related domains in doc toxins and the type III effector AvrB. The conserved sequence motif that contributes to AMPylation unites fic with doc. Although AvrB lacks this motif, its structure reveals a similar topology to the fic and doc folds. AvrB binds to a peptide fragment of its host virulence target in a similar manner as fic binds peptide substrate. AvrB also orients a phosphate group from a bound ADP ligand near the peptide-binding site and in a similar position as a bound fic phosphate. The demonstrated eukaryotic fic domain AMPylation activity suggests that the VopS effector has exploited a novel host posttranslational modification. Fic domain-related structures give insight into the AMPylation active site and the VopS fic domain interaction with its host GTPase target. These results suggest that fic, doc, and AvrB stem from a common ancestor that has evolved to AMPylate protein substrates.

  13. TogoDoc server/client system: smart recommendation and efficient management of life science literature.

    Science.gov (United States)

    Iwasaki, Wataru; Yamamoto, Yasunori; Takagi, Toshihisa

    2010-12-13

    In this paper, we describe a server/client literature management system specialized for the life science domain, the TogoDoc system (Togo, pronounced Toe-Go, is a romanization of a Japanese word for integration). The server and the client program cooperate closely over the Internet to provide life scientists with an effective literature recommendation service and efficient literature management. The content-based and personalized literature recommendation helps researchers to isolate interesting papers from the "tsunami" of literature, in which, on average, more than one biomedical paper is added to MEDLINE every minute. Because researchers these days need to cover updates of much wider topics to generate hypotheses using massive datasets obtained from public databases or omics experiments, the importance of having an effective literature recommendation service is rising. The automatic recommendation is based on the content of personal literature libraries of electronic PDF papers. The client program automatically analyzes these files, which are sometimes deeply buried in storage disks of researchers' personal computers. Just saving PDF papers to the designated folders makes the client program automatically analyze and retrieve metadata, rename file names, synchronize the data to the server, and receive the recommendation lists of newly published papers, thus accomplishing effortless literature management. In addition, the tag suggestion and associative search functions are provided for easy classification of and access to past papers (researchers who read many papers sometimes only vaguely remember or completely forget what they read in the past). The TogoDoc system is available for both Windows and Mac OS X and is free. The TogoDoc Client software is available at http://tdc.cb.k.u-tokyo.ac.jp/, and the TogoDoc server is available at https://docman.dbcls.jp/pubmed_recom.

  14. TogoDoc server/client system: smart recommendation and efficient management of life science literature.

    Directory of Open Access Journals (Sweden)

    Wataru Iwasaki

    Full Text Available In this paper, we describe a server/client literature management system specialized for the life science domain, the TogoDoc system (Togo, pronounced Toe-Go, is a romanization of a Japanese word for integration. The server and the client program cooperate closely over the Internet to provide life scientists with an effective literature recommendation service and efficient literature management. The content-based and personalized literature recommendation helps researchers to isolate interesting papers from the "tsunami" of literature, in which, on average, more than one biomedical paper is added to MEDLINE every minute. Because researchers these days need to cover updates of much wider topics to generate hypotheses using massive datasets obtained from public databases or omics experiments, the importance of having an effective literature recommendation service is rising. The automatic recommendation is based on the content of personal literature libraries of electronic PDF papers. The client program automatically analyzes these files, which are sometimes deeply buried in storage disks of researchers' personal computers. Just saving PDF papers to the designated folders makes the client program automatically analyze and retrieve metadata, rename file names, synchronize the data to the server, and receive the recommendation lists of newly published papers, thus accomplishing effortless literature management. In addition, the tag suggestion and associative search functions are provided for easy classification of and access to past papers (researchers who read many papers sometimes only vaguely remember or completely forget what they read in the past. The TogoDoc system is available for both Windows and Mac OS X and is free. The TogoDoc Client software is available at http://tdc.cb.k.u-tokyo.ac.jp/, and the TogoDoc server is available at https://docman.dbcls.jp/pubmed_recom.

  15. Careers and Networking: Professional Development for Graduate Students and Post-docs

    Science.gov (United States)

    Jungbluth, S.; Boiteau, R.; Bottjer, D.; De Leo, F. C.; Hawko, N.; Ilikchyan, I.; Bruno, B. C.

    2013-12-01

    Established in 2006 by the National Science Foundation, the Center for Microbial Oceanography: Research and Education (C-MORE) is a multi-institutional Science and Technology Center based at the University of Hawai i. One of C-MORE's missions is to provide graduate students and post-docs with state-of-the-art training, which primarily occurs through laboratory- and field-based research. Additionally, C-MORE offers a Professional Development Training Program (PDTP) to help students and post-docs develop a range of "soft" skills such as science communication, leadership, proposal writing, teaching and mentoring (Bruno et al, 2013). The PDTP not only provides professional development training to graduate students and post-docs, but also encourages these young scientists to take leadership of their training. The Professional Development Organizing Committee (PDOC), composed of students and post-docs across the various C-MORE institutions, works closely with the Education Office to implement the eight core PDTP modules as well as 'on-demand' workshops. In February 2013, we organized a workshop to promote networking and foster scientific collaborations among C-MORE graduate students and post-doctoral researchers at the seven partner institutions: the University of Hawaii, Massachusetts Institute of Technology, Woods Hole Oceanographic Institution, Oregon State University, University of California Santa Cruz, Monterey Bay Aquarium Research Institute and Columbia University. The workshop was held in New Orleans in conjunction with the 2013 ASLO/ Ocean Sciences national meeting. In this paper, we will describe the student-led planning process, the workshop itself, and evaluation results. We will also present examples of some of the collaborations that resulted from this workshop.

  16. Structure and Function in Fluvial Biofilms. Implications in River DOC Dynamics and Nuisance Metabolite Production

    OpenAIRE

    Vilalta Baliellas, Elisabet

    2004-01-01

    The role of natural biofilms in affecting river water quality has been the main theme of this study. Firstly, the study examined the capacity of biofilms to retain and/or produce DOC. Secondly, it addressed the production of the geosmin metabolite by benthic cyanobacterial mats. In both aspects, the structure and function of the biofilms proved relevant for evaluating the capacity of biofilms to ameliorate water quality. The importan...

  17. Assessing Environmental Drivers of DOC Fluxes in the Shark River Estuary: Modeling the Effects of Climate, Hydrology and Water Management

    Science.gov (United States)

    Regier, P.; Briceno, H.; Jaffe, R.

    2016-02-01

    Urban and agricultural development of the South Florida peninsula has disrupted freshwater flow in the Everglades, a hydrologically connected ecosystem stretching from central Florida to the Gulf of Mexico. Current system-scale restoration efforts aim to restore natural hydrologic regimes to reestablish pre-drainage ecosystem functioning through increased water availability, quality and timing. However, it is uncertain how hydrologic restoration combined with climate change will affect the downstream section of the system, including the mangrove estuaries of Everglades National Park. Aquatic transport of carbon, primarily as dissolved organic carbon (DOC), plays a critical role in biogeochemical cycling and food-web dynamics, and will be affected both by water management policies and climate change. To better understand DOC dynamics in these estuaries and how hydrology, climate and water management may affect them, 14 years of monthly data collected in the Shark River estuary were used to build a DOC flux model. Multi-variate methods were applied to long-term data-sets for hydrology, water quality and climate to untangle the interconnected environmental drivers that control DOC export at intra and inter-annual scales. DOC fluxes were determined to be primarily controlled by hydrology but also by seasonality and long-term climate patterns. Next, a 4-component model (salinity, inflow, rainfall, Atlantic Multidecadal Oscillation) capable of predicting DOC fluxes (R2=0.78, p<0.0001, n=161) was established. Finally, potential climate change scenarios for the Everglades were applied to this model to assess DOC flux variations in response to climate and restoration variables. Although global predictions anticipate that DOC export will generally increase in the future, the majority of scenario runs indicated that DOC export from the Everglades is expected to decrease due to changes in rainfall, evapotranspiration, inflows and sea-level rise.
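
    The four-component flux model described above (salinity, inflow, rainfall, AMO) is, in essence, a multiple regression fitted to monthly observations. The sketch below shows only the bare mechanics with ordinary least squares on synthetic data; the variable values and coefficients are placeholders and do not reproduce the published model.

```python
# Minimal multiple-regression sketch for monthly DOC flux (synthetic data only).
import numpy as np

rng = np.random.default_rng(1)
n = 161  # number of monthly observations reported in the abstract
salinity = rng.uniform(0, 35, n)
inflow = rng.gamma(2.0, 50.0, n)
rainfall = rng.gamma(2.0, 40.0, n)
amo = rng.normal(0.0, 0.2, n)        # Atlantic Multidecadal Oscillation index

# Toy response; the true model in the paper is not reproduced here.
doc_flux = (100 - 1.5 * salinity + 0.8 * inflow + 0.3 * rainfall + 20 * amo
            + rng.normal(0, 10, n))

X = np.column_stack([np.ones(n), salinity, inflow, rainfall, amo])
coef, *_ = np.linalg.lstsq(X, doc_flux, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((doc_flux - pred) ** 2) / np.sum((doc_flux - doc_flux.mean()) ** 2)
print("coefficients:", np.round(coef, 3), " R^2:", round(r2, 3))
```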

  18. Source code retrieval using conceptual similarity

    NARCIS (Netherlands)

    Mishne, G.A.; de Rijke, M.

    2004-01-01

    We propose a method for retrieving segments of source code from a large repository. The method is based on conceptual modeling of the code, combining information extracted from the structure of the code and standard informationdistance measures. Our results show an improvement over traditional

  19. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  20. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

    Full Text Available The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  1. Synthetic histone code.

    Science.gov (United States)

    Fischle, Wolfgang; Mootz, Henning D; Schwarzer, Dirk

    2015-10-01

    Chromatin is the universal template of genetic information in all eukaryotic cells. This complex of DNA and histone proteins not only packages and organizes genomes but also regulates gene expression. A multitude of posttranslational histone modifications and their combinations are thought to constitute a code for directing distinct structural and functional states of chromatin. Methods of protein chemistry, including protein semisynthesis, amber suppression technology, and cysteine bioconjugation, have enabled the generation of so-called designer chromatin containing histones in defined and homogeneous modification states. Several of these approaches have matured from proof-of-concept studies into efficient tools and technologies for studying the biochemistry of chromatin regulation and for interrogating the histone code. We summarize pioneering experiments and recent developments in this exciting field of chemical biology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Content layer progressive coding of digital maps

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Ole Riis

    2000-01-01

    A new lossless context based method is presented for content progressive coding of limited bits/pixel images, such as maps, company logos, etc., common on the WWW. Progressive encoding is achieved by separating the image into content layers based on other predefined information. Information from already coded layers is used when coding subsequent layers. This approach is combined with efficient template based context bi-level coding, context collapsing methods for multi-level images and arithmetic coding. Relative pixel patterns are used to collapse contexts. The number of contexts is analyzed. The new methods outperform existing coding schemes coding digital maps and in addition provide progressive coding. Compared to the state-of-the-art PWC coder, the compressed size is reduced to 60-70% on our layered test images.

  3. Error coding simulations in C

    Science.gov (United States)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
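
    The innermost code in the concatenated scheme above is a CCSDS-recommended (n, n-16) cyclic redundancy check. A minimal sketch of a 16-bit CRC of that family is shown below, assuming the common CRC-16/CCITT parameters (polynomial 0x1021, initial value 0xFFFF); the exact parameter set used by the simulator is not stated in the record, so treat these values as an assumption.

```python
# Minimal CRC-16 sketch (MSB-first, polynomial 0x1021, init 0xFFFF).
# These parameters are an assumption; the record does not spell them out.
def crc16(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

frame = b"telemetry transfer frame payload"
parity = crc16(frame)
print(f"CRC-16 parity: 0x{parity:04X}")

# Appending the 16 parity bits forms the (n, n-16) codeword; rerunning the CRC
# over the whole codeword yields zero when no errors are present.
assert crc16(frame + parity.to_bytes(2, "big")) == 0
```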

  4. Civil Code, 11 December 1987.

    Science.gov (United States)

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised in 13 June 1978); Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 5) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 6) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 7) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  5. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. We determine the length, dimension, and the minimum distance of any affine Grassmann code. Moreover, we show...

  6. Impact of changing DOC concentrations on the potential distribution of acid sensitive biota in a boreal stream network

    Directory of Open Access Journals (Sweden)

    H. Laudon

    2008-03-01

    Full Text Available DOC concentrations have increased in many surface waters in Europe and North America over the past few decades. As DOC exudes a strong influence on pH this DOC increase could have detrimental effects on acid sensitive biota in many streams and lakes. To investigate the potential implications of changes in the DOC concentration on stream water biota, we have used a mesoscale boreal stream network in northern Sweden as a case study. The network was sampled for stream water chemistry at 60 locations during both winter base flow and spring flood periods, representing the extremes experienced annually in these streams both in terms of discharge and acidity. The effect of changing DOC on pH was modeled for all sampling locations using an organic acid model, with input DOC concentrations for different scenarios adjusted by between −30% and +50% from measured present concentrations. The resulting effect on pH was then used to quantify the proportion of stream length in the catchment with pH below the acid thresholds of pH 5.5 and pH 5.0. The results suggest that a change in stream water DOC during base flow would have only a limited effect on pH and hence on the stream length with pH below the acid thresholds. During the spring flood on the other hand a change in DOC would strongly influence pH and the stream length with pH below the acid thresholds. For example an increase in DOC concentration of 30% at all sites would increase the proportion of stream length with pH below 5.5 from 37% to 65%, and the proportion of stream length with pH below 5.0 would increase from 18% to 27%. The results suggest that in high DOC waters, even a marginal change in the DOC concentration could impact acid sensitive biota in a large portion of the aquatic landscape.

  7. A 125 year long record of DOC flux from a major temperate catchment: land-use vs. climate control?

    Science.gov (United States)

    Clay, G.; Worrall, F.; Howden, N. K.; Burt, T. P.

    2010-12-01

    Our understanding of the controls upon carbon biogeochemistry has always been limited by the lack of long-term observational data collected alongside long-term monitoring of possible environmental drivers. For the River Thames catchment in the UK (9998 km2), records of DOM have been kept since 1868 and of DOM flux since 1882. In addition to riverflow being monitored in the catchment, there has also been monitoring of climate, land use and population back to at least 1868. The Thames catchment is a mixed agricultural-urban catchment dominated by mineral soils, where groundwater plays a significant part in the catchment's flow system. During the period of the record the catchment has undergone urbanisation and climate warming, but has also undergone large-scale land-use change associated with World War II and agricultural intensification in the 1960s. The importance of these combined pressures is explored through a range of time series techniques and the results show: i) That DOC flux in the catchment is now at historic low levels, with the maximum flux being 35 ktonnes C/yr (3.5 tonnes/km2/yr) in 1915 and the lowest flux being 2 ktonnes C/yr (0.2 tonnes/km2/yr) in 1997. ii) The trend in the DOC flux is explained by changes in flow, which appear associated both with groundwater storage in the catchment and with changes in land use. iii) The significant decline in the DOC flux appears to be due to the catchment's transition from being dominated by pasture to an arable land use. iv) The decline of DOC flux with temperature would suggest that the DOC mineralisation reaction has a higher Q10 than DOC production. v) Declining DOC flux from mineral soil catchments would offset increases in DOC flux from organic soils, but would also represent a shift in carbon losses from the fluvial pathway to direct release to the atmosphere.

  8. Linking High Frequency Variations in Stream Water DOC to Ages of Water Sources in Peat-Dominated Montane Watersheds

    Science.gov (United States)

    Tunaley, C.; Tetzlaff, D.; Lessels, J. S.; Soulsby, C.

    2015-12-01

    We combined time series of inferred DOC (from optical sensors) and stable isotopes in streams and watershed source areas to assess the link between water age and C fluxes. We monitored temporal dynamics of FDOM for 2 yrs at nested scales (0.9, 3.0 and 30km2) in a montane Scottish watershed. FDOM was strongly correlated (r2 ~ 0.8) with DOC allowing inference of 15 min timeseries. Marked seasonality was observed, with highest DOC concentrations (~25 mg l-1) in summer events and lower concentrations (~5mg l-1) in winter. During events, anticlockwise hysteresis was observed; consistent with expansion of the riparian saturation zone, increasing hydrological connectivity across peat soils and mobilizing DOC. Lag times for peak discharge and DOC were 1-12 hrs depending on event characteristics and antecedent conditions. Isotope time series from precipitation, streams and catchment source waters (overland flow and hillslope drainage) were also generated. These allowed us to model the non-stationary characteristics of their ages. Stream water age ranges from 3 months at high flows when overland flow dominates runoff to 4 yrs under baseflow. Overland flow age was a dominant influence on DOC transport. Highest concentrations occurred in small summer events with relatively young (management strategies.

  9. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is propose...... including an outer BCH code correcting a few bit errors.......The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  10. Exploring the potential of DOC fluorescence as proxy for groundwater contamination by pesticides

    Science.gov (United States)

    Farlin, Julien; Gallé, Tom; Bayerle, Michael; Pittois, Denis; Huck, Viola

    2017-04-01

    Of the different water quality surrogates the fluorescence of dissolved organic content (FDOC) appears particularly promising due to its sensitivity and specificity. A complete spectrum of FDOC can be obtained using bench top instruments scanning a spectral space going from short wavelength UV to visible blue, yielding a so-called an excitation-emission matrix (EEM). The raw EEM can be either used directly for correlation analysis with the variable of interest, or first decomposed into underlying elements corresponding to different groups of organic compounds displaying similar properties using multiway techniques such as Parallel factor analysis (PARAFAC). Fluorescence spectroscopy has up to now only rarely been applied specifically to groundwater environments. The objective of the project was to explore systematically the possibilities offered by FDOC and PARAFAC for the assessment of groundwater contamination by pesticides, taking into account the transit time from the pesticide source to the groundwater outlet. Three sites corresponding to different transit times were sampled: -one spring regularly contaminated by surface water from a nearby stream (sub-daily to daily response to fast-flow generating storm events) -one spring displaying a weekly to monthly response to interflow -sampling along a flowline consisting of a series of springs and an observation well situated upgradient with mean transit times difference of several years Preliminary results show that a three component PARAFAC model is sufficient to decompose the raw EEMs, which is less than the seven or eight component models often encountered in surface water studies. For the first site, one component in the protein-like region 275(excitation)/310 (emission) nm measured in the stream samples was filtrered completely by the aquifer and did not appear in the spring samples. The other two components followed roughly the trend of the DOC and pesticide breakthrough. For the second site, soil sampling of

  11. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
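
    For context, the sketch below implements a plain LT encoding step with the ideal soliton degree distribution; the feedback-aware degree distributions proposed in the record above are not reproduced, and the symbol sizes and seeds are arbitrary.

```python
# Plain LT encoding sketch with the ideal soliton degree distribution.
# The modified, feedback-aware distributions from the paper are NOT reproduced here.
import random

def ideal_soliton(k):
    # rho(1) = 1/k, rho(d) = 1/(d*(d-1)) for d = 2..k
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode_symbol(source_symbols, rng):
    k = len(source_symbols)
    degrees = list(range(1, k + 1))
    d = rng.choices(degrees, weights=ideal_soliton(k), k=1)[0]
    neighbours = rng.sample(range(k), d)
    value = 0
    for i in neighbours:
        value ^= source_symbols[i]          # XOR of the chosen source symbols
    return neighbours, value                # the receiver needs both

rng = random.Random(42)
source = [rng.randrange(256) for _ in range(16)]   # 16 one-byte source symbols
encoded = [lt_encode_symbol(source, rng) for _ in range(24)]
print(encoded[:3])
```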

  12. Generalized concatenated quantum codes

    Science.gov (United States)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng, Bei

    2009-05-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  13. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  14. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  15. Coding for Correlated Sources with Unknown Parameters.

    Science.gov (United States)

    1980-05-01

    D. Davisson , "Universal source coding," IEEE Transactions on Information Theory, vol. IT-19, pp. 783-795, 1973. 13. D. Neuhoff, R. Gray, and L... Davisson , "Fixed rate universal source coding with a fidelity criterion," IEEE Transactions on Information Theory, vol. IT-21, pp. 511-523, 1975. 14. D

  16. Content Layer progressive Coding of Digital Maps

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Ole Riis

    2002-01-01

    A new lossless context based method is presented for content progressive coding of limited bits/pixel images, such as maps, company logos, etc., common on the World Wide Web. Progressive encoding is achieved by encoding the image in content layers based on color level or other predefined information. Information from already coded layers is used when coding subsequent layers. This approach is combined with efficient template based context bilevel coding, context collapsing methods for multilevel images and arithmetic coding. Relative pixel patterns are used to collapse contexts. Expressions for calculating the resulting number of contexts are given. The new methods outperform existing schemes coding digital maps and in addition provide progressive coding. Compared to the state-of-the-art PWC coder, the compressed size is reduced to 50-70% on our layered map test images.

  17. Temporal-spatial variation of DOC concentration, UV absorbance and the flux estimation in the Lower Dagu River, China

    Science.gov (United States)

    Xi, Min; Kong, Fanlong; Li, Yue; Kong, Fanting

    2017-12-01

    Dissolved organic carbon (DOC) is an important component for both carbon cycle and energy balance. The concentration, UV absorbance, and export flux of DOC in the natural environment dominate many important transport processes. To better understand the temporal and spatial variation of DOC, 7 sites along the Lower Dagu River were chosen to conduct a comprehensive measurement from March 2013 to February 2014. Specifically, water samples were collected from the Lower Dagu River between the 26th and 29th of every month during the experimental period. The DOC concentration (CDOC) and UV absorbance were analyzed using a total organic carbon analyzer and the ultraviolet-visible absorption spectrum, and the DOC export flux was estimated with a simple empirical model. The results showed that the CDOC of the Lower Dagu River varied from 1.32 to 12.56 mg/L, consistent with global rivers. The CDOC and UV absorbance showed significant spatial variation in the Dagu River during the experimental period because of the upstream natural processes and human activities in the watershed. The spatial variation is mainly due to dam or reservoir constructions, riverside ecological environment changes, and non-point source or wastewater discharge. The seasonal variation of CDOC was mainly related to the source of water DOC, river runoff, and temperature, and the UV absorbance and humification degree of DOC had no obvious differences among months (P<0.05). UV absorbance was applied to test the CDOC in the Lower Dagu River using wavelengths of 254 and 280 nm. The results revealed that the annual DOC export flux varied from 1.6 to 3.76 × 10^5 g C/km2/yr in a complete hydrological year, significantly lower than the global average. It is worth mentioning that the DOC export flux was mainly concentrated in summer (~90% of all-year flux in July and August), since the runoff in the Dagu River took place frequently in summer. These observations implied environment change could bring the temporal-spatial variation of DOC and the
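
    The UV-absorbance-based estimation of CDOC mentioned above amounts to a simple linear calibration against measured DOC. A minimal sketch is given below with synthetic absorbance data at 254 nm; the calibration coefficients are placeholders, not the values fitted for the Dagu River.

```python
# Minimal sketch: linear calibration of DOC concentration against UV absorbance at 254 nm.
# Synthetic data only; coefficients are not those fitted for the Dagu River.
import numpy as np

rng = np.random.default_rng(7)
a254 = rng.uniform(0.05, 0.60, 40)                 # absorbance at 254 nm (1/cm)
doc = 1.0 + 18.0 * a254 + rng.normal(0, 0.4, 40)   # "measured" DOC (mg/L)

slope, intercept = np.polyfit(a254, doc, 1)
doc_hat = slope * a254 + intercept
r2 = 1 - np.sum((doc - doc_hat) ** 2) / np.sum((doc - doc.mean()) ** 2)
print(f"DOC ~ {slope:.2f} * A254 + {intercept:.2f}   (R^2 = {r2:.2f})")
```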

  18. Temporal-spatial variation of DOC concentration, UV absorbance and the flux estimation in the Lower Dagu River, China

    Science.gov (United States)

    Xi, Min; Kong, Fanlong; Li, Yue; Kong, Fanting

    2017-04-01

    Dissolved organic carbon (DOC) is an important component for both carbon cycle and energy balance. The concentration, UV absorbance, and export flux of DOC in the natural environment dominate many important transport processes. To better understand the temporal and spatial variation of DOC, 7 sites along the Lower Dagu River were chosen to conduct a comprehensive measurement from March 2013 to February 2014. Specifically, water samples were collected from the Lower Dagu River between the 26th and 29th of every month during the experimental period. The DOC concentration (CDOC) and UV absorbance were analyzed using a total organic carbon analyzer and the ultraviolet-visible absorption spectrum, and the DOC export flux was estimated with a simple empirical model. The results showed that the CDOC of the Lower Dagu River varied from 1.32 to 12.56 mg/L, consistent with global rivers. The CDOC and UV absorbance showed significant spatial variation in the Dagu River during the experimental period because of the upstream natural processes and human activities in the watershed. The spatial variation is mainly due to dam or reservoir constructions, riverside ecological environment changes, and non-point source or wastewater discharge. The seasonal variation of CDOC was mainly related to the source of water DOC, river runoff, and temperature, and the UV absorbance and humification degree of DOC had no obvious differences among months (P<0.05). UV absorbance was applied to test the CDOC in the Lower Dagu River using wavelengths of 254 and 280 nm. The results revealed that the annual DOC export flux varied from 1.6 to 3.76 × 10^5 g C/km2/yr in a complete hydrological year, significantly lower than the global average. It is worth mentioning that the DOC export flux was mainly concentrated in summer (~90% of all-year flux in July and August), since the runoff in the Dagu River took place frequently in summer. These observations implied environment change could bring the temporal

  19. Influence of porewater velocity and ionic strength on DOC concentrations in and losses from peat-sand mixtures

    Science.gov (United States)

    Pfaffner, Nora; Tiemeyer, Bärbel; Fiedler, Sabine

    2015-04-01

    Organic soils play an important role in the global carbon cycle as they can act as a source or a sink for greenhouse gas emissions. The new IPCC Wetlands Supplement accounts for the first time for CO2 emissions from the decomposition of dissolved organic carbon (DOC). While there is a wealth of studies on "true" peat soils, knowledge on DOC losses from organic soils heavily disturbed by e.g. mixing with sand is fragmentary. Moreover, there are only a few studies on the influence of soil hydrological properties on DOC transport. This study investigates physico-chemical controls on the concentration and losses of DOC from a peat-sand mixture in a saturated column experiment with undisturbed columns. The soil originates from the study site "Grosses Moor" (Northern Germany) which is a former bog where peat layers remaining after peat mining were mixed with the underlying mineral soil. We studied the influence of the flow regime and the ionic strength of the irrigation solution on DOC concentrations and losses. Three different pumping rates and two different ionic strengths determined by different concentrations of a sodium chloride-calcium chloride mixture in the irrigation solution were applied. Transport properties of the soil were obtained by analyzing breakthrough curves (BTCs) of a conservative tracer (potassium bromide). For interpretation of the BTCs, the transport model STANMOD which is based on the two-region (mobile/immobile) non-equilibrium concept was fitted to the data. The shape of the BTCs and the STANMOD results showed that three of the four columns had a dual porosity structure, which affects the porewater velocity and the contact area. After a large initial peak, DOC concentrations equilibrated to nearly constant values. Increased porewater velocities decreased the concentration of DOC, but increased the losses. A new equilibrium concentration was reached after nearly all changes of the porewater velocity. At maximum pumping rates as determined from
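
    Breakthrough curves like the bromide BTCs described above are commonly summarized by fitting an advection-dispersion model. The sketch below fits the classical one-region (equilibrium) Ogata-Banks solution with scipy; it is a simpler stand-in for the two-region mobile/immobile model used in STANMOD, and all numbers are synthetic.

```python
# Fit a 1-D equilibrium advection-dispersion breakthrough curve (Ogata-Banks solution).
# This is a simplified stand-in for STANMOD's two-region model; data are synthetic.
import numpy as np
from scipy.special import erfc, erfcx
from scipy.optimize import curve_fit

L_col = 0.30  # column length (m), hypothetical

def btc(t, v, D):
    """Relative concentration C/C0 at x = L_col for a continuous injection."""
    t = np.asarray(t, dtype=float)
    a = (L_col - v * t) / (2.0 * np.sqrt(D * t))
    b = (L_col + v * t) / (2.0 * np.sqrt(D * t))
    # exp(v*L/D)*erfc(b) rewritten as exp(-a**2)*erfcx(b) for numerical stability
    return 0.5 * (erfc(a) + np.exp(-a * a) * erfcx(b))

# Synthetic "observed" breakthrough data
t_obs = np.linspace(0.05, 2.0, 40)                           # days
c_obs = btc(t_obs, v=0.5, D=0.01) + np.random.default_rng(3).normal(0, 0.01, 40)

(v_fit, D_fit), _ = curve_fit(btc, t_obs, c_obs, p0=(0.3, 0.005),
                              bounds=([1e-3, 1e-5], [5.0, 1.0]))
print(f"fitted seepage velocity v = {v_fit:.3f} m/d, dispersion D = {D_fit:.4f} m^2/d")
```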

  20. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stregy, Seth [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Dasilva, Ana [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Yilmaz, Serkan [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Saha, Pradip [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Loewen, Eric [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States)

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  1. Ecoacoustic codes and ecological complexity.

    Science.gov (United States)

    Farina, Almo

    2018-02-01

    Multi-layer communication and sensing network assures the exchange of relevant information between animals and their umwelten, imparting complexity to the ecological systems. Individual soniferous species, the acoustic community, and soundscape are the three main operational levels that comprise this multi-layer network. Acoustic adaptation and acoustic niche are two more important mechanisms that regulate the acoustic performances at the first level while the acoustic community model explains the complexity of the interspecific acoustic network at the second level. Acoustic habitat and ecoacoustic events are two of the most relevant mechanisms that operate at the third level. The exchange of ecoacoustic information on each of these levels is assured by ecoacoustic codes. At the level of individual sonifeorus species, a dyadic intraspecific exchange of information is established between an emitter and a receiver. Ecoacoustic codes discriminate, identify, and label specific signals that pertain to the theme, variation, motif repetition, and intensity of signals. At the acoustic community level, a voluntarily or involuntarily communication is established between networks of interspecific emitters and receivers. Ecoacoustic codes at this level transmit information (e.g., recognition of predators, location of food sources, availability and location of refuges) between one species and the acoustically interacting community and impart cohesion to interspecific assemblages. At the soundscape level, acoustic information is transferred from a mosaic of geophonies, biophonies, and technophonies to different species that discriminate meaningful ecoacoustic events and their temporal dynamics during habitat selection processes. Ecoacoustic codes at this level operate on a limited set of signals from the environmental acoustic dynamic that are heterogeneous in time and space, and these codes are interpreted differently according to the species during habitat selection and the

  2. The Serializability of Network Codes

    CERN Document Server

    Blasiak, Anna

    2010-01-01

    Network coding theory studies the transmission of information in networks whose vertices may perform nontrivial encoding and decoding operations on data as it passes through the network. The main approach to deciding the feasibility of network coding problems aims to reduce the problem to optimization over a polytope of entropic vectors subject to constraints imposed by the network structure. In the case of directed acyclic graphs, these constraints are completely understood, but for general graphs the problem of enumerating them remains open: it is not known how to classify the constraints implied by a property that we call serializability, which refers to the absence of paradoxical circular dependencies in a network code. In this work we initiate the first systematic study of the constraints imposed on a network code by serializability. We find that serializability cannot be detected solely by evaluating the Shannon entropy of edge sets in the graph, but nevertheless, we give a polynomial-time algorithm tha...

  3. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  4. Enseñanza individual a través de Skype y Google Docs

    Directory of Open Access Journals (Sweden)

    Agata Łazor

    2013-01-01

    Full Text Available The article presents ideas for online lessons using Google Docs and Skype. The theoretical part explains the concept of asynchronous and synchronous virtual teaching and discusses the advantages and drawbacks of both modes. The practical exercises mainly use synchronous methods and tools and are intended to make use of several Google Docs applications. All of the tasks are adapted to different learning styles. The first one involves recalling and repeating collocations and expressions. The second exercise uses a photograph, which is labelled using the option provided by Google Drawing. The third task applies both synchronous and asynchronous e-learning strategies and consists of the online correction of a previously written assignment. The last task involves searching the internet for a web page and simulating an exchange of e-mails with the owner of a flat.

  5. Enseñanza individual a través de Skype y Google Docs

    Directory of Open Access Journals (Sweden)

    Agata Łazor

    2013-07-01

    Full Text Available The article presents ideas for online lessons using Google Docs and Skype. The theoretical part explains the concept of asynchronous and synchronous virtual teaching and discusses the advantages and drawbacks of both modes. The practical exercises mainly use synchronous methods and tools and are intended to make use of several Google Docs applications. All of the tasks are adapted to different learning styles. The first one involves recalling and repeating collocations and expressions. The second exercise uses a photograph, which is labelled using the option provided by Google Drawing. The third task applies both synchronous and asynchronous e-learning strategies and consists of the online correction of a previously written assignment. The last task involves searching the internet for a web page and simulating an exchange of e-mails with the owner of a flat.

  6. The Development of a Code of Ethics: An Online Classroom Approach to Making Connections between Ethical Foundations and the Challenges Presented by Information Technology

    Science.gov (United States)

    Brooks, Rochelle

    2010-01-01

    In today's organizations, ethical challenges relate to areas like fraud, right to privacy for consumers, social responsibility, and trade restrictions. For Information Technology (IT) specifically, these can translate to considerations on how technology is used to violate people's privacy, how automation leads to job reductions, or how management…

  7. Drainage in Shallow Peatlands of Marginal Upland Landscapes: DOC Losses from High Flow Events

    Science.gov (United States)

    Grand-Clement, E.; Anderson, K.; Luscombe, D.; Gatis, N.; Benaud, P.; Brazier, R.

    2013-12-01

    ) and finally headwater catchment-scales. Flow monitoring has been in place at all scales since November 2010. Flow-proportional water samples were collected during a range of events throughout winter 2011 to 2013 and analysed for Dissolved Organic Carbon (DOC) and colour, as these variables were identified as critical, both in terms of carbon cycling and for costly water treatment that currently takes place downstream. DOC fluxes were examined temporarily and spatially in relation to season, drain sizes, and magnitude/frequency of event. First results show higher DOC concentrations during rain events occurring in the summer compared to winter times, due to generally drier conditions. DOC fluxes per 24h rain events reach up to 3g/m2. Such measurements are used to evaluate annual DOC losses at the scale of the whole catchment. This will help improving our understanding of carbon losses and fluxes in streams from damaged peatlands, and further estimate the impact on the supply of ecosystem services, and the potential for improvement that can be expected following restoration.
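
    The event DOC fluxes quoted above (grams of DOC per square metre per 24 h event) come down to integrating discharge times concentration over the event and normalising by contributing area. A minimal sketch of that bookkeeping on synthetic flow-proportional samples is shown below; all values are placeholders.

```python
# Minimal sketch: integrate an event DOC flux from paired discharge and concentration
# series, then normalise by contributing area. All values are synthetic placeholders.
import numpy as np

dt = 15 * 60                                              # sampling interval (s)
q = np.array([2.0, 5.0, 9.0, 12.0, 8.0, 4.0, 2.5])        # discharge (L/s)
c = np.array([8.0, 12.0, 18.0, 22.0, 17.0, 11.0, 9.0])    # DOC concentration (mg/L)
area_m2 = 1.2e5                                           # contributing catchment area (m^2)

load_mg = np.sum(q * c) * dt                  # mg of DOC exported during the event
flux_g_per_m2 = load_mg / 1000.0 / area_m2
print(f"event DOC export: {load_mg/1e6:.2f} kg, i.e. {flux_g_per_m2:.4f} g/m^2")
```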

  8. Code of ethics: principles for ethical leadership.

    Science.gov (United States)

    Flite, Cathy A; Harman, Laurinda B

    2013-01-01

    The code of ethics for a professional association incorporates values, principles, and professional standards. A review and comparative analysis of a 1934 pledge and codes of ethics from 1957, 1977, 1988, 1998, 2004, and 2011 for a health information management association was conducted. Highlights of some changes in the healthcare delivery system are identified as a general context for the codes of ethics. The codes of ethics are examined in terms of professional values and changes in the language used to express the principles of the various codes.

  9. A (72, 36; 15) box code

    Science.gov (United States)

    Solomon, G.

    1993-01-01

    A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.

  10. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  11. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit Mac, Linux and Windows.

  12. Algebraic geometric codes

    Science.gov (United States)

    Shahshahani, M.

    1991-01-01

    The performance characteristics are discussed of certain algebraic geometric codes. Algebraic geometric codes have good minimum distance properties. On many channels they outperform other comparable block codes; therefore, one would expect them eventually to replace some of the block codes used in communications systems. It is suggested that it is unlikely that they will become useful substitutes for the Reed-Solomon codes used by the Deep Space Network in the near future. However, they may be applicable to systems where the signal to noise ratio is sufficiently high so that block codes would be more suitable than convolutional or concatenated codes.

  13. Monomial-like codes

    CERN Document Server

    Martinez-Moro, Edgar; Ozbudak, Ferruh; Szabo, Steve

    2010-01-01

    As a generalization of cyclic codes of length p^s over F_{p^a}, we study n-dimensional cyclic codes of length p^{s_1} X ... X p^{s_n} over F_{p^a} generated by a single "monomial". Namely, we study multi-variable cyclic codes generated by a single monomial in F_{p^a}[x_1, ..., x_n] / <x_1^{p^{s_1}} - 1, ..., x_n^{p^{s_n}} - 1>. We call such codes monomial-like codes. We show that these codes arise from the product of certain single variable codes and we determine their minimum Hamming distance. We determine the dual of monomial-like codes yielding a parity check matrix. We also present an alternative way of constructing a parity check matrix using the Hasse derivative. We study the weight hierarchy of certain monomial like codes. We simplify an expression that gives us the weight hierarchy of these codes.

  14. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
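
    To make the capacity difference concrete, the sketch below generates a QR code in Python using the third-party qrcode package (an assumption; pillow is also required, and any QR library would do). The payload and filename are illustrative.

      # Requires: pip install qrcode[pil]
      import qrcode

      # A QR code can hold far more than a 20-digit bar code; here we encode a URL
      # plus a padded note and write the resulting symbol to a PNG file.
      payload = "https://example.org/catalog/item/12345?note=" + "x" * 200
      img = qrcode.make(payload)          # builds a QR symbol sized to fit the payload
      img.save("item_12345_qr.png")
      print("encoded", len(payload), "characters")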

  15. Layered Wyner-Ziv video coding.

    Science.gov (United States)

    Xu, Qian; Xiong, Zixiang

    2006-12-01

    Following recent theoretical works on successive Wyner-Ziv coding (WZC), we propose a practical layered Wyner-Ziv video coder using the DCT, nested scalar quantization, and irregular LDPC code based Slepian-Wolf coding (or lossless source coding with side information at the decoder). Our main novelty is to use the base layer of a standard scalable video coder (e.g., MPEG-4/H.26L FGS or H.263+) as the decoder side information and perform layered WZC for quality enhancement. Similar to FGS coding, there is no performance difference between layered and monolithic WZC when the enhancement bitstream is generated in our proposed coder. Using an H.26L coded version as the base layer, experiments indicate that WZC gives slightly worse performance than FGS coding when the channel (for both the base and enhancement layers) is noiseless. However, when the channel is noisy, extensive simulations of video transmission over wireless networks conforming to the CDMA2000 1X standard show that H.26L base layer coding plus Wyner-Ziv enhancement layer coding are more robust against channel errors than H.26L FGS coding. These results demonstrate that layered Wyner-Ziv video coding is a promising new technique for video streaming over wireless networks.

  16. Multiplexed coding in the human basal ganglia

    Science.gov (United States)

    Andres, D. S.; Cerquetti, D.; Merello, M.

    2016-04-01

    A classic controversy in neuroscience is whether information carried by spike trains is encoded by a time-averaged measure (e.g. a rate code) or by complex time patterns (i.e. a time code). Here we apply a tool to quantitatively analyze the neural code. We make use of an algorithm based on the calculation of the temporal structure function, which makes it possible to distinguish which scales of a signal are dominated by a complex temporal organization and which by a randomly generated process. In terms of the neural code, this kind of analysis makes it possible to detect the temporal scales at which a time-pattern coding scheme or, alternatively, a rate code is present. Additionally, by finding the temporal scale at which the correlation between interspike intervals fades, the length of the basic information unit of the code, and hence its word length, can be established. We apply this algorithm to neuronal recordings obtained from the Globus Pallidus pars interna of a human patient with Parkinson's disease, and show that a time-pattern coding scheme and a rate coding scheme co-exist at different temporal scales, offering a new example of multiplexed neuronal coding.
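
    A hedged sketch of this kind of analysis is given below: a first-order temporal structure function over interspike intervals, with a rough estimate of the lag at which correlations fade. It is illustrative only, the spike data are synthetic, and the exact algorithm and thresholds used by the authors are not reproduced.

      import numpy as np

      def structure_function(x, max_lag):
          """First-order temporal structure function S(k) = <|x[i+k] - x[i]|>."""
          return np.array([np.mean(np.abs(x[k:] - x[:-k])) for k in range(1, max_lag + 1)])

      # Synthetic interspike-interval (ISI) series: short-range temporal structure
      # (an AR(1)-like component) on top of random jitter.
      rng = np.random.default_rng(1)
      n = 5000
      isi = np.empty(n)
      isi[0] = 50.0
      for i in range(1, n):
          isi[i] = 0.8 * isi[i - 1] + 10.0 + rng.normal(0, 5)

      S = structure_function(isi, max_lag=200)
      # Where S(k) keeps growing with lag, the ISI sequence is temporally structured
      # (a candidate time-pattern code); where it flattens, correlations have faded
      # and a rate description suffices. The knee gives a rough "word length".
      knee = int(np.argmax(S > 0.95 * S[-1])) + 1
      print("correlation fades after roughly", knee, "interspike intervals")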

  17. The "DOC" screen: Feasible and valid screening for depression, Obstructive Sleep Apnea (OSA and cognitive impairment in stroke prevention clinics.

    Directory of Open Access Journals (Sweden)

    Richard H Swartz

    Full Text Available Post-stroke Depression, Obstructive sleep apnea (OSA) and Cognitive impairment ("DOC") are associated with greater mortality, worse recovery and poorer quality of life. Best practice recommendations endorse routine screening for each condition; yet, all are under-assessed, under-diagnosed and under-treated. We seek to determine the feasibility and validity of an integrated tool (the "DOC" screen) to identify stroke clinic patients at high risk of depression, OSA, and cognitive impairment. All consecutive new referrals to a regional Stroke Prevention Clinic who were English-speaking and non-aphasic were eligible to be screened. Time for screen completion was logged. DOC screen results were compared to the neuropsychological battery and polysomnogram assessments using a modified receiver operating characteristic and area under the curve analysis. Data are reported to conform to STARD guidelines. 1503 people were screened over 2 years. 89% of eligible patients completed the screen in 5 minutes or less (mean 4.2 minutes), less than half the time it takes to complete the Montreal Cognitive Assessment (MoCA). 437 people consented to detailed testing. Of those, 421 completed the Structured Clinical Interview for Depression within 3 months of screening, 387 completed detailed neuropsychological testing within 3 months, and 88 had overnight polysomnograms. Screening scores combined with demographic variables (age, sex, education, body mass index) had excellent validity compared to gold standard diagnoses: DOC-Mood AUC 0.90; DOC-Apnea AUC 0.80; DOC-Cog AUC 0.81. DOC screen scores can reliably categorize patients into low-, intermediate- or high-risk groups for further action and can do so with comparable accuracy to more time-consuming screens. Systematic screening of depression, obstructive sleep apnea, and cognitive impairment in 5 minutes or less is feasible and valid in a high-volume stroke clinic using the DOC screen. The DOC screen may facilitate improved identification and

  18. A Hyporheic Mesocosm Experiment: Influence of Quantity and Quality of stream-source DOC on Rates of Hyporheic Metabolism

    Science.gov (United States)

    Serchan, S. P.; Wondzell, S. M.; Haggerty, R.; Pennington, R.; Feris, K. P.; Sanfilippo, A. R.; Reeder, W. J.; Tonina, D.

    2016-12-01

    Hyporheic zone biogeochemical processes can influence stream water chemistry. Some estimates show that 50-90% of stream water CO2 is produced in the hyporheic zone through heterotrophic metabolism of organic matter, usually supplied from the stream as dissolved organic carbon (DOC). Preliminary results from our well network at the HJ Andrews WS1 indicate that dissolved inorganic carbon (DIC) is 1.5-2 times higher in the hyporheic zone than in stream water. Conversely, DOC (mg/L) is 1.5 times higher in stream water than in the hyporheic zone throughout the year. Overall, the hyporheic zone appears to be a net source of DIC. However, the increase in DIC along hyporheic flow paths is approximately 10 times greater than the loss of DOC, suggesting that metabolism of buried particulate organic carbon (POC) is a major source of organic carbon for microbial metabolism. However, we cannot completely rule out alternative sources of DIC, especially those originating in the overlying riparian soil, because hyporheic processes are difficult to isolate in well networks. To study hyporheic zone biogeochemical processes, particularly the transformation of organic carbon to inorganic carbon species, we designed and built six replicate 2-m-long hyporheic mesocosms in which we are conducting DOC amendment experiments. We examine the role of DOC quality and quantity on hyporheic respiration by injecting labile (acetate) and refractory (fulvic acid) organic carbon and comparing rates of O2 consumption, DOC loss, and DIC gains against a control. We expect that stream-source DOC is limiting in this small headwater stream, forcing hyporheic metabolism to rely on buried POC. However, the long burial time of POC suggests it is likely of low quality, so that supplying labile DOC in stream water should shift hyporheic metabolism away from POC rather than increase the overall rate of metabolism. Future experiments will examine natural sources of DOC (stream periphyton, leaf, and soil humic

  19. Cracking the code of oscillatory activity.

    Directory of Open Access Journals (Sweden)

    Philippe G Schyns

    2011-05-01

    Full Text Available Neural oscillations are ubiquitous measurements of cognitive processes and dynamic routing and gating of information. The fundamental and so far unresolved problem for neuroscience remains to understand how oscillatory activity in the brain codes information for human cognition. In a biologically relevant cognitive task, we instructed six human observers to categorize facial expressions of emotion while we measured the observers' EEG. We combined state-of-the-art stimulus control with statistical information theory analysis to quantify how the three parameters of oscillations (i.e., power, phase, and frequency) code the visual information relevant for behavior in a cognitive task. We make three points: First, we demonstrate that phase codes considerably more information (2.4 times) relating to the cognitive task than power. Second, we show that the conjunction of power and phase coding reflects detailed visual features relevant for behavioral response--that is, features of facial expressions predicted by behavior. Third, we demonstrate, in analogy to communication technology, that oscillatory frequencies in the brain multiplex the coding of visual features, increasing coding capacity. Together, our findings about the fundamental coding properties of neural oscillations will redirect the research agenda in neuroscience by establishing the differential role of frequency, phase, and amplitude in coding behaviorally relevant information in the brain.
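
    The kind of comparison described above, how many bits phase versus power carries about a stimulus category, can be sketched with a simple histogram-based mutual information estimate. The sketch below uses synthetic single-frequency trials in which the category shifts phase strongly and power only weakly; the data, bin counts and effect sizes are assumptions, not the authors' analysis.

      import numpy as np

      def mutual_information(x, y, bins=8):
          """Plug-in mutual information (bits) between a continuous x and a discrete y."""
          edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
          x_binned = np.digitize(x, edges)
          joint = np.zeros((bins, len(np.unique(y))))
          for xi, yi in zip(x_binned, y):
              joint[xi, yi] += 1
          joint /= joint.sum()
          px = joint.sum(axis=1, keepdims=True)
          py = joint.sum(axis=0, keepdims=True)
          nz = joint > 0
          return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

      # Synthetic trials: the stimulus category shifts phase a lot and power only a little,
      # so phase should come out carrying more bits, as reported in the paper.
      rng = np.random.default_rng(2)
      category = rng.integers(0, 2, size=2000)
      phase = rng.vonmises(mu=np.where(category == 1, np.pi / 2, 0.0), kappa=4.0)
      power = 1.0 + 0.2 * category + rng.normal(0, 0.5, size=2000)

      print("I(phase; category) ~", round(mutual_information(phase, category), 3), "bits")
      print("I(power; category) ~", round(mutual_information(power, category), 3), "bits")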

  20. 'Ike Wai Professional Development Model for Students and Post-docs

    Science.gov (United States)

    Bruno, B. C.

    2016-12-01

    'Ike Wai: Securing Hawaii's Water Future, funded by NSF EPSCoR, is an interdisciplinary research collaboration among geophysicists, geochemists, engineers, microbiologists, computational modelers, data scientists and social scientists. Key questions include: How much water is there? How does it flow? How long will it last? Undergraduate students, graduate students and post-docs are actively involved in the research, and their professional development is a key part of the project. An underlying principle is that students assume responsibility for their own learning and professional development. Based on the model created by the NSF Center for Microbial Oceanography: Research and Education (C-MORE) (Bruno et al., 2008; Guannel et al., 2014; Bottjer et al., 2014), the 'Ike Wai professional development program includes (1) Leadership. Each student and post-doc creates an Individualized Professional Development plan, which includes leadership training (provided by external facilitators) and assuming leadership roles (such as developing and implementing trainings for their peers). (2) EDventures. Based on the C-MORE model, EDventures combines proposal-writing training with the incentive of seed money. Rather than providing training a priori, the EDventures model encourages students and post-docs to write a proposal based on guidelines provided. Training occurs during a two-stage review process: proposers respond to panel reviews and resubmit their proposal within a single review cycle. C-MORE EDventures alumni self-report statistically significant confidence gains on all questions posed. Their subsequent proposal success is enviable: of the 12 proposals submitted to NSF, 50% were funded (Wood Charlson & Bruno, 2015). (3) Layered Mentoring Network. All 'Ike Wai participants serve as both mentor and mentee. Students are matched with a non-research mentor in addition to their advisor to promote a holistic approach to career development. They will also serve as mentors to more

  1. Coding and English Language Teaching

    Science.gov (United States)

    Stevens, Vance; Verschoor, Jennifer

    2017-01-01

    According to Dudeney, Hockly, and Pegrum (2013) coding is a deeper skill subsumed under the four main digital literacies of language, connections, information, and (re)design. Coders or programmers are people who write the programmes behind everything we see and do on a computer. Most students spend several hours playing online games, but few know…

  2. QR Codes: Taking Collections Further

    Science.gov (United States)

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  3. Influence of the Good-Practice Principles and Codes in the Corporate Governance upon the Quality of the Financial-Accounting Information

    Directory of Open Access Journals (Sweden)

    Mirela Niculae

    2017-06-01

    Full Text Available This work tries to identify the principles of corporate governance and to analyze the way in which such governance templates influence the qualitative characteristics of financial-accounting information. Exercising governance entails the obligation to implement processes and structures appropriate to the management and administration of the business and the company's operations, so as to ensure their good operation. The final purpose of good corporate governance is to ensure the efficiency, credibility and reliability of the organization. Solid governance means establishing certain fundamental principles that define the relationships between the different actors, clearly assign responsibilities and ensure the correct operation of decision-making processes. Starting from the statement that transparency and the quality of the corporate governance system are two coexisting and strongly related concepts, associations between the characteristics of the shareholding and the level of information transparency can be identified in the activity of public interest entities. Relationships with current or future investors, with financers, with investment analysts and with the media are of great importance for accomplishing the goals related to the image and credibility of the public interest entity, and they require correct and fluent communication.

  4. Continuous Non-malleable Codes

    DEFF Research Database (Denmark)

    Faust, Sebastian; Mukherjee, Pratyay; Nielsen, Jesper Buus

    2014-01-01

    Non-malleable codes are a natural relaxation of error correcting/detecting codes that have useful applications in the context of tamper-resilient cryptography. Informally, a code is non-malleable if an adversary trying to tamper with an encoding of a given message can only leave it unchanged......-malleable codes where the adversary is only allowed to tamper a single time with an encoding. We show how to construct continuous non-malleable codes in the common split-state model, where an encoding consists of two parts and the tampering can be arbitrary but has to be applied independently to the two parts. Our main...... contributions are outlined below: We propose a new uniqueness requirement of split-state codes which states that it is computationally hard to find two codewords X = (X_0, X_1) and X' = (X_0, X_1') such that both codewords are valid, but X_0 is the same in both X and X'. A simple attack shows that uniqueness...

  5. On the MacWilliams identity for convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, Heide; Schneider, Gert

    The adjacency matrix associated with a convolutional code collects in a detailed manner information about the weight distribution of the code. A MacWilliams Identity Conjecture, stating that the adjacency matrix of a code fully determines the adjacency matrix of the dual code, will be formulated,

  6. Index coding via linear programming

    CERN Document Server

    Blasiak, Anna; Lubetzky, Eyal

    2010-01-01

    Index Coding has received considerable attention recently motivated in part by applications such as fast video-on-demand and efficient communication in wireless networks and in part by its connection to Network Coding. The basic setting of Index Coding encodes the side-information relation, the problem input, as an undirected graph and the fundamental parameter is the broadcast rate $\\beta$, the average communication cost per bit for sufficiently long messages (i.e. the non-linear vector capacity). Recent nontrivial bounds on $\\beta$ were derived from the study of other Index Coding capacities (e.g. the scalar capacity $\\beta_1$) by Bar-Yossef et al (FOCS'06), Lubetzky and Stav (FOCS'07) and Alon et al (FOCS'08). However, these indirect bounds shed little light on the behavior of $\\beta$ and its exact value remained unknown for \\emph{any graph} where Index Coding is nontrivial. Our main contribution is a hierarchy of linear programs whose solutions trap $\\beta$ between them. This enables a direct information-...

  7. Training course on code implementation.

    Science.gov (United States)

    Allain, A; De Arango, R

    1992-01-01

    The International Baby Food Action Network (IBFAN) is a coalition of over 40 citizen groups in 70 countries. IBFAN monitors the progress worldwide of the implementation of the International Code of Marketing of Breastmilk Substitutes. The Code is intended to regulate the advertising and promotional techniques used to sell infant formula. The 1991 IBFAN report shows that 75 countries have taken some action to implement the International Code. During 1992, the IBFAN Code Documentation Center in Malaysia conducted 2 training courses to help countries draft legislation to implement and monitor compliance with the International Code. In April, government officials from 19 Asian and African countries attended the first course in Malaysia; the second course was conducted in Spanish in Guatemala and attended by officials from 15 Latin American and Caribbean countries. The resource people included representatives from NGOs in Africa, Asia, Latin America, Europe and North America with experience in Code implementation and monitoring at the national level. The main purpose of each course was to train government officials to use the International Code as a starting point for national legislation to protect breastfeeding. Participants reviewed recent information on lactation management, the advantages of breastfeeding, current trends in breastfeeding and the marketing practices of infant formula manufacturers. The participants studied the terminology contained in the International Code and terminology used by infant formula manufacturers to include breastmilk supplements such as follow-on formulas and cereal-based baby foods. Relevant World Health Assembly resolutions such as the one adopted in 1986 on the need to ban free and low-cost supplies to hospitals were examined. The legal aspects of the current Baby Friendly Hospital Initiative (BFHI) and the progress in the 12 BFHI test countries concerning the elimination of supplies were also examined. International Labor

  8. Efficiency of a model human image code

    Science.gov (United States)

    Watson, Andrew B.

    1987-01-01

    Hypothetical schemes for neural representation of visual information can be expressed as explicit image codes. Here, a code modeled on the simple cells of the primate striate cortex is explored. The Cortex transform maps a digital image into a set of subimages (layers) that are bandpass in spatial frequency and orientation. The layers are sampled so as to minimize the number of samples and still avoid aliasing. Samples are quantized in a manner that exploits the bandpass contrast-masking properties of human vision. The entropy of the samples is computed to provide a lower bound on the code size. Finally, the image is reconstructed from the code. Psychophysical methods are derived for comparing the original and reconstructed images to evaluate the sufficiency of the code. When each resolution is coded at the threshold for detection of artifacts, the image-code size is about 1 bit/pixel.
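
    The entropy bound mentioned above can be sketched generically: quantize a band-pass subband, histogram the quantizer indices, and compute the Shannon entropy in bits per sample. In the sketch below the subband is synthetic (a Laplacian-like distribution, typical of band-pass image layers) and the quantizer step is an arbitrary assumption.

      import numpy as np

      def entropy_bits(symbols):
          """Shannon entropy (bits/symbol) of a discrete symbol sequence."""
          _, counts = np.unique(symbols, return_counts=True)
          p = counts / counts.sum()
          return float(-(p * np.log2(p)).sum())

      # Synthetic band-pass "layer": zero-mean samples with a heavy-tailed histogram.
      rng = np.random.default_rng(3)
      layer = rng.laplace(loc=0.0, scale=4.0, size=256 * 256)

      step = 8.0      # quantizer step chosen near an assumed visibility threshold
      indices = np.round(layer / step).astype(int)

      print("code size lower bound ~", round(entropy_bits(indices), 2), "bits/sample")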

  9. Do DOM properties or the amount of DOC induce iron reduction in topsoil porewater?

    Science.gov (United States)

    Szalai, Zoltán; Ringer, Marianna; Kiss, Klaudia; Perényi, Katalin; Jakab, Gergely

    2017-04-01

    The iron content of porewater in hydromorphic soils shows high temporal variability. This usually correlates with dissolved organic carbon (DOC) content, but the correlation can be weak in some cases. Some studies suggest that ferrous iron stabilizes organic carbon in the dissolved state. On the contrary, other papers report dissolved iron being stabilized by dissolved organic matter (DOM). The present study focuses on this apparent contradiction and on the interaction of organic carbon and iron in hydromorphic soils. The studied gleyic Phaeozems (3 profiles) and mollic Gleysols (3 profiles) are located in Geresdi-dombság (Hungary) and in the Danube-Tisza Interfluve (Hungary), respectively. The dynamics of porewater pH and EH have been recorded by field stations at 20, 40 and 100 cm depth during the growing season with 10 min temporal resolution. Porewater has also been sampled occasionally at each depth. The presence of ferrous iron was detected by a dipyridyl field test. DOC, dissolved nitrogen (DN) and iron were measured by a TOC analyser and fl-AAS. Molecular size and molecular weight were measured by photon correlation spectroscopy (DLS and SLS). Textural and mineralogical properties of the studied soils were also determined. Relationships among the studied parameters were tested by Spearman's rank correlation. The seasonal dynamics of the redox potential is primarily controlled by saturation, but spatial differences are also driven by vegetation. The environment is usually reductive for iron oxides between March and July, but intensive daily redox fluctuations could be measured in June and July in some topsoils. The short-term temporal variability of redox conditions depends on the physiological activity of plants. Most published papers report a range between +100 and +50 mV for iron reduction in aquatic systems. Topsoil porewater measurements show three redox ranges in which the concentration of dissolved iron increased: +320 to +200 mV, +80 to +20 mV and below -160 mV. These ranges were identified

  10. Padre Osvaldo Carneiro Chaves: the paths of teaching.

    OpenAIRE

    Joan Edessom de Oliveira

    2006-01-01

    This is a biography of Father Osvaldo Carneiro Chaves, a priest born in the municipality of Granja in 1923 who, for almost three decades, taught at the Seminário da Betânia and at the Colégio Sobralense in Sobral, Ceará. It covers Father Osvaldo's teaching career between 1952 and 1980, the year he retired as a teacher. Built from interviews and testimonies given by Father Osvaldo and his former students, it traces a profile of his teaching, seeking to understand ...

  11. Google Docs as a Tool for Collaborative Writing in the Middle School Classroom

    Directory of Open Access Journals (Sweden)

    Yanan Fan

    2017-10-01

    Full Text Available Aim/Purpose: In this study, the authors examine how an online word processing tool can be used to encourage participation among students of different language backgrounds, including English Language Learners. To be exact, the paper discusses whether student participation in anonymous collaborative writing via Google Docs can lead to more successful products in a linguistically diverse eighth-grade English Language Arts classroom. Background: English Language Learners (ELLs) make up a considerable portion of elementary and secondary public school students, as language and ethnic diversity has become the norm in the United States. The research literature finds that ELLs are statistically behind their monolingual peers on such key language and academic development indicators as writing. Educators and researchers have therefore turned to collaborative writing with the assistance of online technology. Although it is shown in the literature to be a worthwhile endeavor for students of all ages and ability levels, no studies have investigated the differences it makes, namely, in comparison to traditional face-to-face collaboration in the classroom, and to anonymous online collaboration in the virtual space. Methodology: Through face-to-face, online, and anonymous writing activities, a rubric, and a survey, this quantitative study asks if anonymous collaborative writing, compared to other modalities, equalizes participation among students of varying language fluencies, and if anonymous collaborative writing, compared to other modalities, affects student comfort levels. Contribution: This builds on research on online collaborative writing tools and suggests that using such tools (Google Docs in particular) is beneficial, especially for students who are building their language abilities. The study further reveals varied degrees of success and student comfort levels in participating in writing tasks in the three modalities. Findings: We ascertain that students of varying language

  12. Temperature, DOC level and basin interactions explain the declining oxygen concentrations in the Bothnian Sea

    Science.gov (United States)

    Ahlgren, Joakim; Grimvall, Anders; Omstedt, Anders; Rolff, Carl; Wikner, Johan

    2017-06-01

    Hypoxia and oxygen deficient zones are expanding worldwide. To properly manage this deterioration of the marine environment, it is important to identify the causes of oxygen declines and the influence of anthropogenic activities. Here, we provide a study aiming to explain the declining oxygen levels in the deep waters of the Bothnian Sea over the past 20 years by investigating data from environmental monitoring programmes. The observed decline in oxygen concentrations in deep waters was found to be primarily a consequence of water temperature increase and partly caused by an increase in dissolved organic carbon (DOC) in the seawater (R2Adj. = 0.83) as well as inflow from the adjacent sea basin. As none of the tested eutrophication-related predictors were significant according to a stepwise multiple regression, a regional increase in nutrient inputs to the area is unlikely to explain a significant portion of the oxygen decline. Based on the findings of this study, preventing the development of anoxia in the deep water of the Bothnian Sea is dependent on the large-scale measures taken to reduce climate change. In addition, the reduction of the nutrient load to the Baltic Proper is required to counteract the development of hypoxic and phosphate-rich water in the Baltic Proper, which can form deep water in the Bothnian Sea. The relative importance of these sources to oxygen consumption is difficult to determine from the available data, but the results clearly demonstrate the importance of climate related factors such as temperature, DOC and inflow from adjacent basins for the oxygen status of the sea.

  13. Carbon isotopes in peat, DOC, CO2, and CH4 in a Holocene peatland on Dartmoor, southwest England

    Science.gov (United States)

    Charman, Dan J.; Aravena, Ramon; Bryant, Charlotte L.; Harkness, Doug D.

    1999-06-01

    Carbon gases with younger 14C ages than those of the surrounding peat have been reported from continental boreal peatlands, a fact which suggests that significant movement of CO2, CH4, or DOC (dissolved organic carbon) and export of C via subsurface processes are not accounted for in most estimates of contributions to the C cycle. This paper tests the hypothesis that similar processes can occur in oceanic ombrotrophic mires where water and gas movement is theoretically minimal. Measurements of 14C and δ13C in CO2, CH4, and DOC, and of tritium, are reported from depths to 250 cm at Tor Royal, a raised mire in southwest England. Radiocarbon ages of gases are 1460 to 500 yr younger than those of peat from the same depths, and CO2 is consistently younger than CH4. DOC is 1260 to 830 yr younger than the peat, and significant amounts of tritium were found at all depths. Gas ages are mostly intermediate between the age of the peat and that of the DOC, which suggests that C is principally transported as DOC. However, some gases are younger than their associated DOC, which implies that movement of dissolved gases may also take place. δ13C values in gases suggest that CO2 reduction is the major pathway for CH4 production. Transport of C in deep peats is likely to be a significant component in the overall C budget of ombrotrophic oceanic peatlands, and C export via discharge to ground or surface waters may be an important mechanism for gaseous C emissions.

  14. Using a Whole-stream Approach to Quantify Headwater Yedoma DOC Processing Rates in NE Siberia, Russia

    Science.gov (United States)

    Heslop, J.; Walter Anthony, K. M.; Davydova, A.; Davydov, S. P.; Zimov, N.

    2015-12-01

    Climate warming triggers the release of permafrost organic carbon (OC) via permafrost thaw and erosion, exporting large amounts of terrestrial C to aquatic environments and making previously frozen OC from a range of soil depths available for microbial processing. It is estimated 210-476 Pg C is stored in deep, ice-rich loess-dominated soils referred to as yedoma. Yedoma is extensive in NE Siberia and Alaska, where it underlies an area of over 1,000,000 km2 and averages 25 m in thickness. Recent research suggests ancient (Pleistocene-aged) permafrost OC, such as yedoma OC, is rapidly and preferentially utilized by microbial communities in Arctic headwater streams. We utilized a combination of short-term laboratory incubations and a whole-stream approach to examine permafrost-derived dissolved organic carbon (DOC) uptake, processing, and transport rates in a small stream which drains yedoma uplands in Cherskii, NE Siberia. Short-term incubations were conducted on permafrost leachates mixed with stream water to quantify microbial processing rates of permafrost-derived DOC from leachates made with surface (0-15 cm), shallow (70-100 cm), and deep (nutrient-release experiments was characterized using absorbance measurements (SUVA254 and SR) and florescence spectrometry (EEMs) to quantify how DOC composition correlates to and changes with permafrost DOC bioavailability and processing parameters. Preliminary results suggest DOC processing rates may be highest in leachates made from surface sediments, which receive fresh OC input from modern ecosystems, and from deep sediments, which contain ancient, previously immobile OC. Shallow permafrost OC, which experiences degradation from annual freeze-thaw cycles without receiving fresh OC input, may have the lowest DOC processing rates.

  15. Peripheral coding of taste

    Science.gov (United States)

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  16. [Neural codes for perception].

    Science.gov (United States)

    Romo, R; Salinas, E; Hernández, A; Zainos, A; Lemus, L; de Lafuente, V; Luna, R

    This article describes experiments designed to reveal the neural codes associated with the perception and processing of tactile information. The results of these experiments have shown the neural activity correlated with tactile perception. The neurones of the primary somatosensory cortex (S1) represent the physical attributes of tactile stimuli, and we found that these representations correlated with tactile perception. By means of intracortical microstimulation we demonstrated the causal relationship between S1 activity and tactile perception. The motor areas of the frontal lobe contain the link between sensory and motor representations while decisions are being taken. S1 generates neural representations of the somatosensory stimuli which seem to be sufficient for tactile perception. These neural representations are subsequently processed by areas central to S1 and seem useful in perception, memory and decision making.

  17. DOC and POC in the water column of the southern Baltic. Part I. Evaluation of factors influencing sources, distribution and concentration dynamics of organic matter

    Directory of Open Access Journals (Sweden)

    Anna Maciejewska

    2014-06-01

    Full Text Available The aim of the study was to address questions regarding the vertical, horizontal and seasonal dynamics of both DOC and POC in the Baltic Sea and the factors influencing carbon concentrations. In general, the highest concentrations of both DOC and POC were recorded in the surface water layer (DOC ~4.7 mg dm-3, POC ~0.6 mg dm-3) as a consequence of intensive phytoplankton activity, and in the halocline layer (DOC ~5.1 mg dm-3, POC ~0.4 mg dm-3). The lowest DOC and POC concentrations were measured in the sub-halocline water layer, where the values did not exceed 3.5 mg dm-3 (DOC) and 0.1 mg dm-3 (POC). Seasonally, the highest DOC and POC concentrations were measured during the growing season: surface DOC ~5.0 mg dm-3, sub-halocline DOC ~4.1 mg dm-3, surface POC ~0.9 mg dm-3 and sub-halocline POC ~0.2 mg dm-3. The ANOVA Kruskal-Wallis test results indicate statistically significant differences among the three study sites regarding average concentrations, and concentrations in particular water layers and seasons. This shows that concentrations of DOC and POC differ among the sub-basins of the Baltic Sea. The differences were attributed to the varying distances from river mouths to the study sites or to the different starting times and/or durations of the spring algal blooms. Statistically significant dependences were found between both DOC and POC concentrations and Chl a (phytoplankton biomass), pH (phytoplankton photosynthetic rate), pheo (zooplankton sloppy feeding), salinity (river run-off and North Sea water inflows) and water temperature (season). This was taken as proof that these factors influence DOC and POC in the study areas.
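
    The statistical tests named above (Kruskal-Wallis across study sites, Spearman rank correlation between DOC and a covariate such as Chl a) can be run with SciPy as sketched below. The numbers are synthetic placeholders; only the workflow is illustrated, not the study's data or results.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)

      # Placeholder surface-water DOC concentrations (mg dm-3) at three study sites.
      doc_site1 = rng.normal(4.7, 0.4, 30)
      doc_site2 = rng.normal(4.9, 0.4, 30)
      doc_site3 = rng.normal(4.4, 0.4, 30)

      h, p_kw = stats.kruskal(doc_site1, doc_site2, doc_site3)
      print(f"Kruskal-Wallis: H = {h:.2f}, p = {p_kw:.4f}")

      # Placeholder chlorophyll a values loosely coupled to DOC at site 1.
      chl_a = 2.0 + 0.8 * (doc_site1 - 4.7) + rng.normal(0, 0.2, 30)
      rho, p_sp = stats.spearmanr(doc_site1, chl_a)
      print(f"Spearman: rho = {rho:.2f}, p = {p_sp:.4f}")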

  18. Utility of QR codes in biological collections.

    Science.gov (United States)

    Diazgranados, Mauricio; Funk, Vicki A

    2013-01-01

    The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.

  19. Utility of QR codes in biological collections

    Directory of Open Access Journals (Sweden)

    Mauricio Diazgranados

    2013-07-01

    Full Text Available The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.

  20. TIPONLINE Code Table

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coded items are entered in the tiponline data entry program. The codes and their explanations are necessary in order to use the data

  1. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  2. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  3. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding with feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and the side information, Y, such that the marginal probability of each symbol Xi in X, given Y, is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH coding achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between the source symbols and the side information.
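
    The sketch below illustrates the general idea of syndrome-based source coding with decoder side information, substituting a tiny Hamming (7,4) code for the paper's rate-adaptive BCH codes: the encoder transmits only the 3-bit syndrome of X, and the decoder combines it with the highly correlated side information Y. The one-bit correlation model is an assumption for illustration.

      import numpy as np

      # Parity-check matrix of the Hamming (7,4) code: column j is the binary
      # expansion of j+1, so a single differing position is located by the syndrome.
      H = np.array([[(j + 1) >> b & 1 for j in range(7)] for b in range(3)])

      def syndrome(v):
          return tuple(H @ v % 2)

      rng = np.random.default_rng(5)
      y = rng.integers(0, 2, 7)              # side information available at the decoder
      x = y.copy()
      x[rng.integers(0, 7)] ^= 1             # X differs from Y in one bit (high correlation)

      s_x = syndrome(x)                      # encoder sends 3 bits instead of 7

      # Decoder: the syndrome of x XOR y equals s_x XOR syndrome(y); for a single
      # differing bit it directly names a column of H, i.e. the position to flip.
      diff_syndrome = tuple((np.array(s_x) + H @ y) % 2)
      x_hat = y.copy()
      if any(diff_syndrome):
          position = next(j for j in range(7) if tuple(H[:, j]) == diff_syndrome)
          x_hat[position] ^= 1

      print("recovered X correctly:", np.array_equal(x, x_hat))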

  4. On Coding Non-Contiguous Letter Combinations

    Directory of Open Access Journals (Sweden)

    Frédéric Dandurand

    2011-06-01

    Full Text Available Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations) and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity.
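
    A small sketch of the kind of letter-combination code discussed above: extracting contiguous and non-contiguous (open) bigrams from a word and counting how many words in a toy lexicon contain each bigram, i.e. how strongly it constrains word identity. The lexicon, gap limit and scoring are illustrative assumptions, not the authors' model.

      from itertools import combinations
      from collections import defaultdict

      def open_bigrams(word, max_gap=2):
          """Ordered letter pairs separated by at most max_gap intervening letters."""
          return {(word[i], word[j])
                  for i, j in combinations(range(len(word)), 2)
                  if j - i - 1 <= max_gap}

      lexicon = ["table", "cable", "stable", "bleat", "batch", "cab"]

      # How many lexicon words contain each bigram: rarer bigrams constrain identity more.
      containing = defaultdict(set)
      for w in lexicon:
          for bg in open_bigrams(w):
              containing[bg].add(w)

      for bg in sorted(open_bigrams("table")):
          print("".join(bg), "->", len(containing[bg]), "word(s) in the toy lexicon")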

  5. Code-Mixing and Code-Switching of Indonesian Celebrities: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Nana Yuliana

    2015-05-01

    Full Text Available Foreign language skill gives rise to a language variety called code-mixing and code-switching. The purpose of this study was to gather information to identify the types of code-mixing and code-switching frequently used by Indonesian celebrities. The study participants were divided into two groups: Group I comprised celebrities with native-speaker parents, and Group II comprised celebrities capable of speaking two or more languages. Qualitative and quantitative methods were used to analyze the code-mixing and code-switching and their differing frequencies. It can be concluded that Group II uses code-mixing and code-switching with a different frequency and speaks a foreign language more actively.

  6. A concatenation scheme of LDPC codes and source codes for flash memories

    Science.gov (United States)

    Huang, Qin; Pan, Song; Zhang, Mu; Wang, Zulin

    2012-12-01

    Recently, low-density parity-check (LDPC) codes have been applied in flash memories to correct errors. However, as verified in this article, their performance degrades rapidly as the number of stuck cells increases. Thus, this paper presents a concatenation scheme of LDPC codes and source codes, which aims to improve the performance of LDPC codes for flash memories with stuck cells. In this scheme, the locations of stuck cells are recorded by source codes in the write process, so that erasures, rather than wrong log-likelihood ratios, are given on these cells in the read process. Then, LDPC codes correct these erasures and the soft errors caused by cell-to-cell interference. The analyses of channel capacity and compression rates of source codes with side information show that the memory cost of the proposed scheme is moderately low. Simulation results verify that the proposed scheme outperforms the traditional scheme with only LDPC codes.
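
    A hedged sketch of the read-side idea described above: stuck-cell positions recorded at write time (compressed by a source code in the real scheme) are handed to the LDPC decoder as erasures, i.e. zero log-likelihood ratios, instead of confidently wrong LLRs. The LLR magnitude and cell positions are assumptions, and the LDPC decoding step itself is omitted.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 32
      codeword = rng.integers(0, 2, n)             # bits the LDPC encoder intended to store

      # Some cells are stuck and ignore the written value; their locations are known
      # at write time and stored (compressed by a source code in the real scheme).
      stuck_positions = np.array([3, 11, 20])
      stuck_values = rng.integers(0, 2, stuck_positions.size)
      stored = codeword.copy()
      stored[stuck_positions] = stuck_values

      # Read side: map bits to LLRs. Naively, stuck cells can yield large wrong LLRs;
      # with the recorded positions we output LLR = 0 (an erasure) there instead.
      llr_magnitude = 4.0                          # assumed channel reliability
      llr = llr_magnitude * (1 - 2 * stored.astype(float))
      llr[stuck_positions] = 0.0                   # erase rather than mislead the decoder

      print("erased positions handed to the LDPC decoder:", stuck_positions.tolist())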

  7. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  8. Multi-hypothesis distributed stereo video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    Distributed Video Coding (DVC) is a video coding paradigm that exploits the source statistics at the decoder based on the availability of the Side Information (SI). Stereo sequences are constituted by two views to give the user an illusion of depth. In this paper, we present a DVC decoder...

  9. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  10. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  11. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  12. FLOWTRAN-TF code description

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P. (ed.)

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  13. FLOWTRAN-TF code description

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P. (ed.)

    1991-09-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  14. The Procions' code; Le code Procions

    Energy Technology Data Exchange (ETDEWEB)

    Deck, D.; Samba, G.

    1994-12-19

    This paper presents a new code to simulate plasmas generated by inertial confinement. This multi-species kinetic code uses no angular approximation for the ions and works in planar and spherical geometry. First, the physical model, based on the Fokker-Planck equation, is presented. Then, the numerical model used to solve the Fokker-Planck operator in the Rosenbluth form is introduced. Finally, several numerical tests are proposed. (TEC). 17 refs., 27 figs.

  15. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it in a network context. The method consists of adapting the information rate for each receiving node according to its channel status, independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information-theoretic analysis of this approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.
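
    The per-receiver rate adaptation described above can be sketched as choosing, for each sink independently, the highest rate its instantaneous channel supports. In the sketch below the achievable rate is capped by the Gaussian capacity log2(1 + SNR), which is an illustrative stand-in for the paper's nonbinary-LDPC rate selection; the rate set and SNR values are assumptions.

      import numpy as np

      rng = np.random.default_rng(7)
      available_rates = np.array([0.5, 1.0, 1.5, 2.0, 3.0])   # assumed code rates (bits/use)

      snr_db = rng.uniform(0, 15, size=4)                      # instantaneous SNR of each sink
      snr = 10 ** (snr_db / 10)
      capacity = np.log2(1 + snr)

      # Each sink gets the largest available rate not exceeding its capacity,
      # independently of the other sinks (opportunistic per-node adaptation).
      chosen = [available_rates[available_rates <= c].max() if (available_rates <= c).any() else 0.0
                for c in capacity]

      for k, (c, r) in enumerate(zip(capacity, chosen)):
          print(f"sink {k}: capacity {c:.2f} b/use -> rate {r}")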

  16. SMART DOCS: A New Patient-Centered Outcomes and Coordinated-Care Management Approach for the Future Practice of Sleep Medicine

    Science.gov (United States)

    Kushida, Clete A.; Nichols, Deborah A.; Holmes, Tyson H.; Miller, Ric; Griffin, Kara; Cardell, Chia-Yu; Hyde, Pamela R.; Cohen, Elyse; Manber, Rachel; Walsh, James K.

    2015-01-01

    The practice of medicine is currently undergoing a transformation to become more efficient, cost-effective, and patient centered in its delivery of care. The aim of this article is to stimulate discussion within the sleep medicine community in addressing these needs by our approach as well as other approaches to sleep medicine care. The primary goals of the Sustainable Methods, Algorithms, and Research Tools for Delivering Optimal Care Study (SMART DOCS) are: (1) to introduce a new Patient-Centered Outcomes and Coordinated-Care Management (PCCM) approach for the future practice of sleep medicine, and (2) to test the PCCM approach against a Conventional Diagnostic and Treatment Outpatient Medical Care (CONV) approach in a randomized, two-arm, single-center, long-term, comparative effectiveness trial. The PCCM approach is integrated into a novel outpatient care delivery model for patients with sleep disorders that includes the latest technology, allowing providers to obtain more accurate and rapid diagnoses and to make evidence-based treatment recommendations, while simultaneously enabling patients to have access to personalized medical information and reports regarding their diagnosis and treatment so that they can make more informed health care decisions. Additionally, the PCCM approach facilitates better communication between patients, referring primary care physicians, sleep specialists, and allied health professionals so that providers can better assist patients in achieving their preferred outcomes. A total of 1,506 patients 18 y or older will be randomized to either the PCCM or CONV approach and will be followed for at least 1 y with endpoints of improved health care performance, better health, and cost control. Clinical Trials Registration: ClinicalTrials.gov Identifier: NCT02037438. Citation: Kushida CA, Nichols DA, Holmes TH, Miller R, Griffin K, Cardell CY, Hyde PR, Cohen E, Manber R, Walsh JK. SMART DOCS: a new patient-centered outcomes and coordinated

  17. The SHIELD11 Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W

    2005-02-02

    SHIELD11 is a computer code for performing shielding analyses around a high-energy electron accelerator. It makes use of simple analytic expressions for the production and attenuation of photons and neutrons resulting from electron beams striking thick targets, such as dumps, stoppers, collimators, and other beam devices. The formulae in SHIELD11 are somewhat unpretentious in that they are based on the extrapolation (scaling) of experimental data using rather simple physics ideas. Because these scaling methods have only been tested over a rather limited set of conditions--namely, 1-15 GeV electrons striking 10-20 radiation lengths of iron--a certain amount of care and judgment must be exercised whenever SHIELD11 is used. Nevertheless, for many years these scaling methods have been applied rather successfully to a large variety of problems at SLAC, as well as at other laboratories throughout the world, and the SHIELD11 code has been found to be a fast and convenient tool. In this paper we present, without extensive theoretical justification or experimental verification, the five-component model on which the SHIELD11 code is based. Our intent is to demonstrate how to use the code by means of a few simple examples. References are provided that are considered to be essential for a full understanding of the model. The code itself contains many comments to provide some guidance for the informed user, who may wish to improve on the model.

  18. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  19. A Distributed Quaternary Turbo Coded Cooperative Scheme

    Directory of Open Access Journals (Sweden)

    BALDINI FILHO, R.

    2014-12-01

    Full Text Available Cooperative communications achieve MIMO-like diversity gains by introducing a relay that creates an independent faded path between the source and the destination. Coded cooperation integrates cooperation with channel coding in order to improve the bit error rate (BER) performance of cooperative communications. Turbo codewords can be built efficiently at the destination using encoded portions of the information sent by the source and the relay. This paper presents a distributed turbo cooperative coding scheme that utilizes convolutional codes defined over the finite ring of integers Z4 and performs better than its equivalent binary counterparts.
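
    A minimal sketch of a rate-1/2 convolutional encoder over the ring Z4, in the spirit of the scheme above; the memory-2 generator taps are arbitrary illustrative choices, not the code used in the paper.

      import numpy as np

      def z4_conv_encode(u, g1=(1, 2, 1), g2=(1, 3, 3)):
          """Rate-1/2 convolutional encoding over Z4 with memory-2 generators (taps assumed)."""
          u = np.asarray(u) % 4
          padded = np.concatenate([np.zeros(2, dtype=int), u])
          out = []
          for t in range(len(u)):
              window = padded[t:t + 3][::-1]      # u[t], u[t-1], u[t-2]
              out.append(int(np.dot(g1, window)) % 4)
              out.append(int(np.dot(g2, window)) % 4)
          return out

      info = [1, 3, 0, 2, 1]                       # quaternary information symbols
      print(z4_conv_encode(info))                  # twice as many coded symbols, all in Z4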

  20. Dissolved organic carbon (DOC) in soil extracts investigated by FT-ICR-MS

    Science.gov (United States)

    Hofmann, D.; Steffen, D.; Jablonowski, N. D.; Burauel, P.

    2012-04-01

    Soil drying and rewetting usually increases the release of xenobiotics such as pesticides present in agricultural soils. Besides the effect on the release of two aged 14C-labeled pesticide residues, we focus on the characterisation of the simultaneously remobilized dissolved organic carbon (DOC) to gain new insights into structure and stability aspects of soil organic carbon fractions. The test soil (gleyic cambisol; Corg 1.2%, pH 7.2) was obtained from the upper soil layer of two individual outdoor lysimeter studies containing either environmentally long-term aged 14C residues of the herbicide ethidimuron (0-10 cm depth; time of aging: 9 years) or methabenzthiazuron (0-30 cm depth; time of aging: 17 years). Soil samples (10 g dry soil equivalents) were either (A = dry/wet) previously dried (45°C) or (B = wet/wet) directly mixed with pure water (1+2, w:w), shaken (150 rpm, 1 h), and centrifuged (2000 g). This extraction procedure was repeated several times for both setups, and the first three individual extractions were used for further investigations. Salt was removed from the samples prior to analysis, because of a possible quenching effect in the electrospray (ESI) source, by solid phase extraction (SPE) with Chromabond C18 Hydra cartridges (Macherey-Nagel) and methanol as the back-extraction solvent. The preconcentrated and desalted samples were introduced by flow injection analysis (FIA) into a Fourier transform ion cyclotron resonance mass spectrometer (FT-ICR-MS) equipped with an ESI source and a 7 T superconducting magnet (LTQ-FT Ultra, ThermoFisher Scientific). FT-ICR-MS is a key technique for complex natural systems owing to its outstanding mass resolution (400,000 at m/z 400 Da used here) and mass accuracy (≤ 1 ppm), simultaneously providing molecular-level details of thousands of compounds; it has been applied successfully to investigations of natural organic matter (NOM) from different sources such as marine and surface water, soil, sediment, bog and crude oil

  1. Collaborative learning with Google Docs and Chat in the German as a Foreign Language classroom

    OpenAIRE

    Gil Salom, Daniela Teresa

    2014-01-01

    Gil Salom, DT. (2014). Aprendizaje colaborativo con Google Docs y Chat en el Aula de Alemán como Lengua Extranjera. En Studies in Philology: Linguistics, Literature and Culture Studies in Modern Languages. 83-94. http://hdl.handle.net/10251/60710 Senia 83 94

  2. Integrated bicarbonate-form ion exchange treatment and regeneration for DOC removal: Model development and pilot plant study.

    Science.gov (United States)

    Hu, Yue; Boyer, Treavor H

    2017-05-15

    The application of bicarbonate-form anion exchange resin, with sodium bicarbonate salt for resin regeneration, was investigated in this research to reduce chloride ion release during treatment and the disposal burden of the sodium chloride regeneration solution associated with traditional chloride-form ion exchange (IX). The target contaminant in this research was dissolved organic carbon (DOC). The performance evaluation was conducted in a completely mixed flow reactor (CMFR) IX configuration. A process model that integrated treatment and regeneration was investigated based on the characteristics of this configuration. Kinetic and equilibrium experiments were performed to obtain the required parameters for the process model, and pilot plant tests were conducted to validate the model as well as to provide practical understanding of operation. The DOC concentration predicted by the process model responded to changes of salt concentration in the solution and showed good agreement with pilot plant data, with less than 10% difference in terms of percentage removal. Both model predictions and pilot plant tests showed over 60% DOC removal using bicarbonate-form resin for treatment and sodium bicarbonate for regeneration, which was comparable to chloride-form resin for treatment and sodium chloride for regeneration. Lastly, DOC removal was improved by using a higher salt concentration for regeneration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Amounts of carbon mineralised and leached as DOC during decomposition of Norway spruce needles and fine roots

    NARCIS (Netherlands)

    Hansson, K.; Berggren Kleja, D.; Kalbitz, K.; Larsson, H.

    2010-01-01

    Changes in climate or forest management practices leading to increased litter production will most likely cause increased leaching rates of dissolved organic carbon (DOC) from the O horizon. The rhizosphere is often assumed to have a large carbon flux associated with root turnover and exudation.

  4. Social Constructivist Approach to Web-Based EFL Learning: Collaboration, Motivation, and Perception on the Use of Google Docs

    Science.gov (United States)

    Liu, Sarah Hsueh-Jui; Lan, Yu-Ju

    2016-01-01

    This study reports on the differences in motivation, vocabulary gain, and perceptions on using or the Google Docs between individual and collaborative learning at a tertiary level. Two classes of English-as-a-Foreign Language (EFL) students were recruited and each class was randomly assigned into one of the two groups--individuals or…

  5. Exploring the Effects of Employing Google Docs in Collaborative Concept Mapping on Achievement, Concept Representation, and Attitudes

    Science.gov (United States)

    Lin, Yu-Tzu; Chang, Chia-Hu; Hou, Huei-Tse; Wu, Ke-Chou

    2016-01-01

    This study investigated the effectiveness of using Google Docs in collaborative concept mapping (CCM) by comparing it with a paper-and-pencil approach. A quasi-experimental study was conducted in a physics course. The control group drew concept maps using the paper-and-pencil method and face-to-face discussion, whereas the experimental group…

  6. Using Google Docs to Enhance the Teacher Work Sample: Building e-Portfolios for Learning and Practice

    Science.gov (United States)

    Gugino, Jessica

    2018-01-01

    The use of teaching portfolios in teacher education programs is a widely accepted practice. This article describes how a traditional teacher work sample was transformed using the online platform, Google Docs. The use of online digital portfolios may help to satisfy both the need to evaluate teacher candidates' performance in special education…

  7. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software, with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
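
    The write-inputs/run/read-outputs workflow described above can be sketched in a few lines. The following is a generic, hypothetical Python analogue of that pattern, not the actual DLL or its GoldSim calling convention; the executable name, file names and file formats are placeholders.

        # Generic "write inputs -> run external code -> read outputs" wrapper.
        # All commands, paths and formats below are placeholders for illustration.
        import subprocess

        def run_external(inputs, exe="external_model", in_file="model.in", out_file="model.out"):
            with open(in_file, "w") as f:                      # 1. write the input list
                f.writelines(f"{v}\n" for v in inputs)
            subprocess.run([exe, in_file], check=True)         # 2. run the external application
            with open(out_file) as f:                          # 3. read back its outputs
                return [float(line) for line in f if line.strip()]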

  8. Noisy Network Coding

    CERN Document Server

    Lim, Sung Hoon; Gamal, Abbas El; Chung, Sae-Young

    2010-01-01

    A noisy network coding scheme for sending multiple sources over a general noisy network is presented. For multi-source multicast networks, the scheme naturally extends both network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to general discrete memoryless and Gaussian networks. The scheme also recovers as special cases the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves message repetition coding, relay signal compression, and simultaneous decoding. Unlike previous compress-forward schemes, where independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner-Ziv binning as in previous compress-forward sch...

  9. Experimental study on the particulate matter and nitrogenous compounds from diesel engine retrofitted with DOC+CDPF+SCR

    Science.gov (United States)

    Zhang, Yunhua; Lou, Diming; Tan, Piqiang; Hu, Zhiyuan

    2018-03-01

    The increasingly stringent emission regulations will mandate the retrofit of after-treatment devices for in-use diesel vehicles, in order to reduce their substantial particulate matter and nitrogen oxides (NOX) emissions. In this paper, a combined DOC (diesel oxidation catalyst), CDPF (catalytic diesel particulate filter) and SCR (selective catalytic reduction) retrofit for a heavy-duty diesel engine was employed to perform experiments on an engine test bench, in order to evaluate the effects on particulate matter emissions, including particle number (PN), particle mass (PM) and particle size distributions, and on nitrogenous compound emissions, including NOX, the nitrogen dioxide (NO2)/NOX ratio, nitrous oxide (N2O) and ammonia (NH3) slip. In addition, urea injection was also of concern. The results showed that the DOC+CDPF+SCR retrofit had almost no adverse effect on engine power and fuel consumption. Under the test loads, the upstream DOC and CDPF reduced PN and PM by an average of 91.6% and 90.9%, respectively, while the downstream SCR brought about an average decrease of 85% in NOX. Both PM and NOX emission factors based on this retrofit were lower than the China-Ⅳ limits (ESC), and even lower than the China-Ⅴ limits (ESC) at medium and high loads. The DOC and CDPF changed the particle size distributions, leading to an increase in the proportion of accumulation-mode particles and a decrease in the percentage of nuclear-mode particles. This indicates that the effect of the DOC and CDPF on nuclear-mode particles was better than that on accumulation-mode ones. The upstream DOC could increase the NO2/NOX ratio to 40%, and this higher NO2/NOX ratio improved the efficiency of the CDPF and SCR. Besides, the N2O emission increased by an average of 2.58 times after the retrofit, and NH3 slip occurred with an average of 26.7 ppm. The rate of urea injection was roughly equal to 8% of the fuel consumption rate. The DOC+CDPF+SCR retrofit proved to be a feasible and effective measure in terms

  10. A mean field theory of coded CDMA systems

    Energy Technology Data Exchange (ETDEWEB)

    Yano, Toru [Graduate School of Science and Technology, Keio University, Hiyoshi, Kohoku-ku, Yokohama-shi, Kanagawa 223-8522 (Japan); Tanaka, Toshiyuki [Graduate School of Informatics, Kyoto University, Yoshida Hon-machi, Sakyo-ku, Kyoto-shi, Kyoto 606-8501 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)], E-mail: yano@thx.appi.keio.ac.jp

    2008-08-15

    We present a mean field theory of code-division multiple-access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean-field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.

  11. Fluorescence measured in situ as a proxy of CDOM absorption and DOC concentration in the Baltic Sea

    Directory of Open Access Journals (Sweden)

    Piotr Kowalczuk

    2010-09-01

    Full Text Available This study presents results from field surveys performed in 2008 and 2009 in the southern Baltic in different seasons. The main goal of these measurements was to identify the empirical relationships between DOM optical properties and DOC. CDOM absorption and fluorescence and DOC concentrations were measured during thirteen research cruises. The values of the CDOM absorption coefficient at 370 nm aCDOM(370) ranged from 0.70 m-1 to 7.94 m-1, and CDOM fluorescence intensities (ex./em. 370/460) IFl, expressed in quinine sulphate equivalent units, ranged from 3.88 to 122.97 (in filtered samples). Dissolved organic carbon (DOC) concentrations ranged from 266.7 to 831.7 µM C. There was a statistically significant linear relationship between the fluorescence intensity measured in the filtered samples and the CDOM absorption coefficient aCDOM(370), R2 = 0.87. There was much more scatter in the relationship between the fluorescence intensity measured in situ (i.e. in unprocessed water samples) and the CDOM absorption coefficient aCDOM(370), resulting in a slight deterioration in the coefficient of determination, R2 = 0.85. This indicated that the presence of particles could impact fluorometer output during in situ deployment. A calibration experiment was set up to quantify particle impact on the instrument output in raw marine water samples relative to readings from filtered samples. The bias calculated for the absolute percentage difference between fluorescence intensities measured in raw and filtered water was low (-2.05%), but the effect of particle presence expressed as the value of the RMSE was significant and was as high as 35%. Both DOM fluorescence intensity (in raw water and filtered samples) and the CDOM absorption coefficient aCDOM(370) are highly correlated with DOC concentration. The relationship between DOC and the CDOM absorption coefficient aCDOM(370) was better (R2 = 0.76) than the relationship between DOC and the respective fluorescence intensities
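
    A relationship of the kind reported here (DOC regressed on fluorescence intensity, with R2 as the quality measure) can be reproduced with an ordinary least-squares fit. The sketch below uses NumPy with invented placeholder values; the numbers are not the cruise data and the fit is only meant to show the procedure.

        # Ordinary least-squares fit of DOC concentration against CDOM fluorescence.
        # The sample values are invented placeholders, not measured data.
        import numpy as np

        fluorescence = np.array([5.0, 20.0, 45.0, 80.0, 120.0])   # QSE units (hypothetical)
        doc = np.array([280.0, 350.0, 480.0, 640.0, 810.0])       # µM C (hypothetical)

        slope, intercept = np.polyfit(fluorescence, doc, 1)
        pred = slope * fluorescence + intercept
        r2 = 1 - np.sum((doc - pred) ** 2) / np.sum((doc - doc.mean()) ** 2)
        print(f"DOC = {slope:.2f}*IFl + {intercept:.1f}, R2 = {r2:.3f}")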

  12. Hydrologic and forest management controls on DOC dynamics in the small watersheds of the H.J. Andrews Experimental Forest, OR

    Science.gov (United States)

    Lajtha, K.; Jones, J. A.

    2016-12-01

    Dissolved organic carbon (DOC) export from hillslopes to streams is an important component of the carbon cycle of a catchment and may be a critical source of energy for the aquatic food web in receiving waters. Using a long-term record of DOC and other dissolved nutrients and elements from paired watersheds from the H.J. Andrews Experimental Forest in Oregon, we explored hydrologic, climatic, and land-use controls on seasonal and inter-annual patterns of DOC flux in a seasonally dry ecosystem. Seasonal patterns of DOC flux demonstrated source limitations to DOC export, with DOC concentrations highest immediately following the first rains after a dry summer, and lowest after winter rains. In contrast, more geochemically-controlled elements showed simple dilution-concentration patterns with no seasonal hysteresis. Inter-annual patterns of DOC flux, however, did not provide evidence of source limitation, with DOC flux within a watershed tightly correlated to total discharge but not temperature. Among watersheds, forest harvest, even over 50 years ago, significantly reduced DOC flux but not fluxes of other elements including N; this response was linked to the loading of coarse woody debris to the forest floor. Chemical fingerprinting of DOC revealed that old-growth watersheds had higher fluxes of DOC characteristic of forest floor organic materials, likely delivered to streams through more surficial preferential flow pathways not subject to microbial alteration, respiration, or sorption losses. Taken together these results suggest that the biogeochemical composition of forested streams reflects both current hydrologic patterns and also processes that occurred many decades ago within the catchment.

  13. Effects of ozone as a stand-alone and coagulation-aid treatment on the reduction of trihalomethanes precursors from high DOC and hardness water.

    Science.gov (United States)

    Sadrnourmohamadi, Mehrnaz; Gorczyca, Beata

    2015-04-15

    This study investigates the effect of ozone as a stand-alone and coagulation aid on the removal of dissolved organic carbon (DOC) from the water with a high level of DOC (13.8 mgL(-1)) and calcium hardness (270 mgL(-1)) CaCO3. Natural water collected from the Assiniboine River (Manitoba, Canada) was used in this study. Effectiveness of ozone treatment was evaluated by measurement of DOC, DOC fractions, UV254, and trihalomethane formation potential (THMFP). Additionally, zeta potential and dissolved calcium concentration were measured to discern the mechanism of ozone reactions. Results indicated that 0.8 mg O3/mg DOC ozone stand-alone can cause up to 86% UV254 reduction and up to 27% DOC reduction. DOC fractionation results showed that ozone can change the composition of DOC in the water samples, converting the hydrophobic fractions into hydrophilic ones and resulting in the reduction of THMFP. Also, ozone caused a decrease in particle stability and dissolved calcium concentration. These simultaneous ozonation effects caused improved water flocculation and enhanced removal of DOC. This resulted in reduction of the coagulant dosage when ozone doses higher than 0.2 mg O3/mg DOC were applied prior to coagulation with ferric sulfate. Also, pre-ozonation-coagulation process achieved preferential THMFP removal for all of the ozone doses tested (0-0.8 mg O3/mg DOC), leading to a lower specific THMFP in pre-ozonated-coagulated waters than in the corresponding ozonated waters. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Convergence Analysis of Turbo Decoding of Serially Concatenated Block Codes and Product Codes

    Directory of Open Access Journals (Sweden)

    Krause Amir

    2005-01-01

    Full Text Available The geometric interpretation of turbo decoding has founded a framework, and provided tools for the analysis of parallel-concatenated codes decoding. In this paper, we extend this analytical basis for the decoding of serially concatenated codes, and focus on serially concatenated product codes (SCPC) (i.e., product codes with checks on checks). For this case, at least one of the component (i.e., rows/columns) decoders should calculate the extrinsic information not only for the information bits, but also for the check bits. We refer to such a component decoder as a serial decoding module (SDM). We extend the framework accordingly, derive the update equations for a general turbo decoder of SCPC, and the expressions for the main analysis tools: the Jacobian and stability matrices. We explore the stability of the SDM. Specifically, for high SNR, we prove that the maximal eigenvalue of the SDM's stability matrix approaches , where is the minimum Hamming distance of the component code. Hence, for practical codes, the SDM is unstable. Further, we analyze the two turbo decoding schemes, proposed by Benedetto and Pyndiah, by deriving the corresponding update equations and by demonstrating the structure of their stability matrices for the repetition code and an SCPC code with information bits. Simulation results for the Hamming and Golay codes are presented, analyzed, and compared to the theoretical results and to simulations of turbo decoding of parallel concatenation of the same codes.

  15. Code of ethics for dental researchers.

    Science.gov (United States)

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined, and number 12, instead of the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful.

  16. Fountain codes for frequency occupancy information dissemination

    NARCIS (Netherlands)

    Shao, X.; Cronie, H.S.; Hoeksema, F.W.; Slump, Cornelis H.

    2006-01-01

    Cognitive radio (CR) is defined as an intelligent wireless communication system based on secondary utilization of an already licensed frequency band. In order to communicate without interfering the legal users (primary users), cognitive radio nodes should have the same overview of the spectrum

  17. The duality of coding assessment information

    African Journals Online (AJOL)

    Erna Kinsey

    The challenges and activities of outcomes-based education and very often the beauty of this 'new' approach are often overshadowed by the realities of the classroom and the difficulties of assessment. One of the greatest problems concerning outcomes that address knowledge, skills and values is to determine and qualify ...

  18. Brazilian cross-cultural adaptation of the DocCom online module: communication for teamwork

    Directory of Open Access Journals (Sweden)

    Tatiane Angélica Phelipini Borges

    2017-09-01

    Full Text Available ABSTRACT Objective: to carry out the cross-cultural adaptation of DocCom online module 38, which deals with communication for teamwork, into Portuguese for the Brazilian context. Method: the transcultural translation and adaptation were accomplished through initial translations, synthesis of the translations, evaluation and synthesis by a committee of experts, analysis by translators and back translation, a pre-test with nurses and undergraduate students in Nursing, and analysis by the translators to obtain the final material. Results: in the evaluation and synthesis of the translated version against the original version by the expert committee, the items obtained higher than 80% agreement. Few modifications were suggested in the analysis by the pre-test participants. The final version was adequate for the proposed context and its purpose. Conclusion: it is believed that, by making this new teaching-learning strategy for communication skills and competencies for teamwork available, it can be used systematically in undergraduate and postgraduate courses in the health area in Brazil, in order to contribute to training professionals and to making advances in this field.

  19. Brazilian cross-cultural adaptation of the DocCom online module: communication for teamwork 1

    Science.gov (United States)

    Borges, Tatiane Angélica Phelipini; Vannuchi, Marli Terezinha Oliveira; Grosseman, Suely; González, Alberto Durán

    2017-01-01

    ABSTRACT Objective: to carry out the cross-cultural adaptation of DocCom online module 38, which deals with communication for teamwork, into Portuguese for the Brazilian context. Method: the transcultural translation and adaptation were accomplished through initial translations, synthesis of the translations, evaluation and synthesis by a committee of experts, analysis by translators and back translation, a pre-test with nurses and undergraduate students in Nursing, and analysis by the translators to obtain the final material. Results: in the evaluation and synthesis of the translated version against the original version by the expert committee, the items obtained higher than 80% agreement. Few modifications were suggested in the analysis by the pre-test participants. The final version was adequate for the proposed context and its purpose. Conclusion: it is believed that, by making this new teaching-learning strategy for communication skills and competencies for teamwork available, it can be used systematically in undergraduate and postgraduate courses in the health area in Brazil, in order to contribute to training professionals and to making advances in this field.

  20. A Mixed Methods Analysis of the Effect of Google Docs Environment on EFL Learners' Writing Performance and Causal Attributions for Success and Failure

    Science.gov (United States)

    Seyyedrezaie, Zari Sadat; Ghonsooly, Behzad; Shahriari, Hesamoddin; Fatemi, Hazar Hosseini

    2016-01-01

    This study investigated the effect of writing process in Google Docs environment on Iranian EFL learners' writing performance. It also examined students' perceptions towards the effects of Google Docs and their perceived causes of success or failure in writing performance. In this regard, 48 EFL students were chosen based on their IELTs writing…

  1. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  2. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    We welcome Tanya Stivers's discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded. Instead we propose that the promise of coding-based research lies in its ability to open up new qualitative questions.

  3. Overview of Code Verification

    Science.gov (United States)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  4. In memory of doc. dr hab. med. Wanda Szymańska-Jagiełło and doc. dr hab. med. Grażyna Gutowska-Grzegorczyk

    Directory of Open Access Journals (Sweden)

    Katarzyna Rostropowicz-Denisiewicz

    2010-08-01

    … for many years, up to adulthood, which allowed the author to assess the long-term prognosis concerning the late sequelae of the disease, which depend on its form. Wanda showed independence and initiative in conducting scientific work. She established numerous contacts – her collaboration with the Orthopaedics Clinic of the IR (doc. dr hab. med. Sylwester Jakubowski and dr n. med. Janina Ruszczyńska) resulted in modern publications devoted to the methods and results of surgical treatment of patients of developmental age. Particular attention was paid to the hip joints, as one of the malignant localizations of the disease. The results obtained at times preceded similar studies carried out in other European countries, as became evident during the Paediatric EULAR Symposium in Oslo in 1976. Collaboration with the Department of Dentistry of the IR (dr n. med. Krystyna Drecka) and the Clinic of Conservative Dentistry of the then Medical Academy in Warsaw concerned another malignant localization of the disease, namely the temporomandibular joints, which leads to severe disability. Methods for the early diagnosis, prevention and treatment of these lesions were developed; the results of this work were published and compiled as practical guidelines, and were recognized with numerous awards. Studies conducted jointly with the Department of Ophthalmology of the IR (dr n. med. Zofia Jaczynowska) aimed at the early detection of covertly progressing inflammatory changes in the eye (so-called cold uveitis), which lead to serious, irreversible damage, including blindness. Ocular involvement, in cases with a mild monoarticular onset of the disease, mostly affected young children. In collaboration with the Dermatology Clinic of the then Medical Academy in Warsaw (prof. Stefania Jabłońska) she conducted research on various forms and variants of scleroderma in children; her experience made her the best specialist in this field. This research is being continued by her successors in collaboration with centres abroad. …

  5. Non-linear, connectivity and threshold-dominated runoff-generation controls DOC and heavy metal export in a small peat catchment

    Science.gov (United States)

    Birkel, Christian; Broder, Tanja; Biester, Harald

    2017-04-01

    Peat soils act as important carbon sinks, but they also release large amounts of dissolved organic carbon (DOC) to the aquatic system. The DOC export is strongly tied to the export of soluble heavy metals. The accumulation of potentially toxic substances due to anthropogenic activities, and their natural export from peat soils to the aquatic system is an important health and environmental issue. However, limited knowledge exists as to how much of these substances are mobilized, how they are mobilized in terms of flow pathways and under which hydrometeorological conditions. In this study, we report from a combined experimental and modelling effort to provide greater process understanding from a small, lead (Pb) and arsenic (As) contaminated upland peat catchment in northwestern Germany. We developed a minimally parameterized, but process-based, coupled hydrology-biogeochemistry model applied to simulate detailed hydrometric and biogeochemical data. The model was based on an initial data mining analysis, in combination with regression relationships of discharge, DOC and element export. We assessed the internal model DOC-processing based on stream-DOC hysteresis patterns and 3-hourly time step groundwater level and soil DOC data (not used for calibration as an independent model test) for two consecutive summer periods in 2013 and 2014. We found that Pb and As mobilization can be efficiently predicted from DOC transport alone, but Pb showed a significant non-linear relationship with DOC, while As was linearly related to DOC. The relatively parsimonious model (nine calibrated parameters in total) showed the importance of non-linear and rapid near-surface runoff-generation mechanisms that caused around 60% of simulated DOC load. The total load was high even though these pathways were only activated during storm events on average 30% of the monitoring time - as also shown by the experimental data. Overall, the drier period 2013 resulted in increased nonlinearity, but

  6. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  7. Interactive QR code beautification with full background image embedding

    Science.gov (United States)

    Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo

    2017-06-01

    The QR (Quick Response) code is a kind of two-dimensional barcode first developed in the automotive industry. Nowadays, QR codes are widely used in commercial applications such as product promotion, mobile payment and product information management. Traditional QR codes that follow the international standard are reliable and fast to decode, but lack the aesthetic appearance needed to present visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts a user's interactive strokes as hints to remove undesired QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and can thus achieve a more pleasing result while keeping high machine readability.
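
    A much simpler, non-interactive variant of the same idea can be sketched with the common Python qrcode and Pillow packages: generate the code at the highest error-correction level and blend a background image into it, relying on the correction headroom to keep the symbol decodable. This is an illustration under stated assumptions, not the stroke-based method of the paper; the payload and image paths are placeholders, and readability of the result depends on background contrast and blend weight.

        # Simplified aesthetic QR code: blend a background image into a code generated
        # with the highest error-correction level (NOT the interactive method above).
        import qrcode
        from PIL import Image

        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H,
                           box_size=10, border=4)
        qr.add_data("https://example.com")        # placeholder payload
        qr.make(fit=True)
        qr.make_image(fill_color="black", back_color="white").save("qr_plain.png")

        code_img = Image.open("qr_plain.png").convert("RGB")
        background = Image.open("background.jpg").convert("RGB").resize(code_img.size)
        Image.blend(code_img, background, alpha=0.35).save("aesthetic_qr.png")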

  8. Codes with multi-level error-correcting capabilities

    Science.gov (United States)

    Lin, Mao-Chao; Lin, Shu

    1990-01-01

    In conventional channel coding, all the information symbols of a message are regarded equally significant, and hence codes are devised to provide equal protection for each information symbol against channel errors. However, in some circumstances, some information symbols in a message are more significant than the other symbols. As a result, it is desirable to devise codes with multilevel error-correcting capabilities. In this paper, block codes with multilevel error correcting capabilities, which are also known as unequal error protection (UEP) codes, are investigated. Several classes of UEP codes are constructed. One class of codes satisfies the Hamming bound on the number of parity-check symbols for systematic linear UEP codes and hence is optimal.
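
    As a toy illustration of the unequal-error-protection idea only (not one of the code classes constructed in the paper), the sketch below simply repeats high-significance bits five times and low-significance bits three times, so that majority-vote decoding tolerates more channel errors on the important part of the message.

        # Toy unequal error protection via repetition: important bits get 5 copies,
        # other bits get 3 copies; decoding is a per-bit majority vote. Illustrative only.
        from collections import Counter

        def uep_encode(important_bits, other_bits):
            return [b for b in important_bits for _ in range(5)] + \
                   [b for b in other_bits for _ in range(3)]

        def uep_decode(received, n_important, n_other):
            majority = lambda chunk: Counter(chunk).most_common(1)[0][0]
            imp = [majority(received[5 * i:5 * i + 5]) for i in range(n_important)]
            off = 5 * n_important
            oth = [majority(received[off + 3 * i:off + 3 * i + 3]) for i in range(n_other)]
            return imp, oth

        print(uep_decode(uep_encode([1, 0], [1, 1, 0]), 2, 3))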

  9. Generating code adapted for interlinking legacy scalar code and extended vector code

    Science.gov (United States)

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  10. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  11. Diameter Perfect Lee Codes

    CERN Document Server

    Horak, Peter

    2011-01-01

    Lee codes have been intensively studied for more than 40 years. Interest in these codes has been triggered by the Golomb-Welch conjecture on the existence of perfect error-correcting Lee codes. In this paper we deal with the existence and enumeration of diameter perfect Lee codes. As main results we determine all q for which there exists a linear diameter-4 perfect Lee code of word length n over Z_{q}, and prove that for each n\geq3 there are uncountably many diameter-4 perfect Lee codes of word length n over Z. This is in strict contrast with perfect error-correcting Lee codes of word length n over Z, as there is a unique such code for n=3, and it is conjectured that this is always the case when 2n+1 is a prime. Diameter perfect Lee codes will be constructed by an algebraic construction that is based on a group homomorphism. This will allow us to design an efficient algorithm for their decoding.
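
    For reference, the Lee metric underlying these codes is the standard cyclic distance on Z_q (textbook background, not a result of the paper): for words x, y in Z_q^n,

        d_L(x, y) = \sum_{i=1}^{n} \min\left( |x_i - y_i|,\; q - |x_i - y_i| \right).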

  12. Expander chunked codes

    Science.gov (United States)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97 % of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
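
    The per-chunk RLNC step described above can be illustrated over GF(2), where a random linear combination of packets reduces to an XOR of a random subset of them. The sketch below is a generic illustration with made-up packet contents, not the expander-graph chunk construction itself; practical schemes usually work over larger fields such as GF(2^8).

        # Random linear network coding within one chunk over GF(2): each coded packet
        # is the XOR of a random subset of the chunk's input packets. Illustrative only.
        import random

        def rlnc_encode_chunk(packets, n_coded):
            coded = []
            for _ in range(n_coded):
                coeffs = [random.randint(0, 1) for _ in packets]
                payload = bytes(len(packets[0]))                  # all-zero packet
                for c, p in zip(coeffs, packets):
                    if c:
                        payload = bytes(a ^ b for a, b in zip(payload, p))
                coded.append((coeffs, payload))
            return coded

        chunk = [b"\x01\x02", b"\x0f\x10", b"\xaa\x55"]           # hypothetical packets
        for coeffs, payload in rlnc_encode_chunk(chunk, 4):
            print(coeffs, payload.hex())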

  13. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  14. QR Code: An Interactive Mobile Advertising Tool

    Directory of Open Access Journals (Sweden)

    Ela Sibel Bayrak Meydanoglu

    2013-10-01

    Full Text Available Easy and rapid interaction between consumers and marketers enabled by mobile technology prompted an increase in the usage of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is the QR code (Quick Response Code). Interactive advertising brings several advantages to the companies that apply it. For example, interaction with consumers provides significant information about consumers' preferences. Marketers can use information obtained from consumers for various marketing activities such as customizing advertisement messages, determining the target audience, and improving future products and services. QR codes used in marketing campaigns can provide links to specific websites in which, through various tools (e.g. questionnaires, voting), information about the needs and wants of customers is collected. The aim of this basic research is to illustrate the contribution of QR codes to the realization of the advantages gained by interactive advertising.

  15. On {\\sigma}-LCD codes

    OpenAIRE

    Carlet, Claude; Mesnager, Sihem; Tang, Chunming; Qi, Yanfeng

    2017-01-01

    Linear complementary pairs (LCP) of codes play an important role in armoring implementations against side-channel attacks and fault injection attacks. One of the most common ways to construct LCP of codes is to use Euclidean linear complementary dual (LCD) codes. In this paper, we first introduce the concept of linear codes with $\\sigma$ complementary dual ($\\sigma$-LCD), which includes known Euclidean LCD codes, Hermitian LCD codes, and Galois LCD codes. As Euclidean LCD codes, $\\sigma$-LCD ...

  16. Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

    Directory of Open Access Journals (Sweden)

    H. D. Arora

    2015-01-01

    Full Text Available In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in information theory, electrical engineering, mathematics, and computer science for the transmission of data through reliable and efficient methods. We have to consider how the coding of messages can be done efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we have introduced the mean codeword length of order α and type β for 1:1 codes and analyzed the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with fuzzy information measure has been established.
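
    As standard background for the kind of bound discussed in this record (and not the fuzzy order-α, type-β quantities introduced in the paper), the codeword lengths l_i of a uniquely decodable code over a D-symbol alphabet satisfy the Kraft inequality, and the noiseless coding theorem bounds the optimal ordinary mean codeword length by the entropy:

        \sum_{i} D^{-l_i} \le 1, \qquad H_D(P) \le L = \sum_{i} p_i\, l_i < H_D(P) + 1 .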

  17. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.
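
    For orientation, the conventional unsupervised sparse coding problem that this discriminative, multi-manifold variant builds on is the usual reconstruction-plus-sparsity objective over a codebook D and codes α_i (the paper's additional class-conditioned and manifold-margin terms are not reproduced here):

        \min_{D,\ \{\alpha_i\}} \; \sum_{i} \Bigl( \bigl\| x_i - D\alpha_i \bigr\|_2^2 + \lambda \bigl\| \alpha_i \bigr\|_1 \Bigr) .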

  18. Becoming a More Effective Research Mentor for Your Trainees: Undergraduates to Post-docs

    Science.gov (United States)

    Hooper, Eric J.; Mathieu, R.; Pfund, C.; Branchaw, J.; UW-Madison Research Mentor Training Development Team

    2010-05-01

    How do you effectively mentor individuals at different stages of their careers? Can you learn to become a more effective mentor through training? Does one size fit all? Are you ready to address the NSF's new requirement about mentoring post-docs in your next proposal? For many academics, typical answers to these questions include, "I try to make adjustments based on the trainee, but I don't have a specific plan” "Yeah, I'd better start thinking about that” and "There's training?” Scientists often are not trained for their crucial role of mentoring the next generation. The University of Wisconsin-Madison has developed, field tested, and publicly released research mentor training materials for several STEM (science, technology, engineering and mathematics) disciplines, including astronomy, to help fill this gap and improve the educational experience and ultimate success of research trainees at several career stages, from high school students to post-doctoral scholars. While initially aimed at the mentoring of undergraduate researchers at research extensive institutions, the topics are broad enough (e.g., expectations, communication, understanding, diversity, ethics, independence) to be applicable to mentoring in a wide range of project-based educational activities. Indeed, these materials have been modified, only modestly, to prepare graduate students and undergraduates to mentor high school students. In this session, we will describe the UW-Madison research mentor training seminar and illustrate how the training can be adapted and implemented. We will introduce an interactive "shopping cart” style website which allows users to obtain the materials and instructions on how to run the program at their institution. Most of the session will be devoted to an interactive implementation of elements of research mentor training using small discussion groups. Participants will experience the training seminar in practice, come face-to-face with some common mentoring

  19. Learning to Be a More Effective Research Mentor for Your Trainees: Undergraduates to Post-docs

    Science.gov (United States)

    Hooper, Eric; Mathieu, R.; Pfund, C.; Branchaw, J.; UW-Madison Research Mentor Training Development Team

    2010-01-01

    How do you effectively mentor individuals at different stages of their careers? Can you learn to become a more effective mentor through training? Does one size fit all? Are you ready to address the NSF's new requirement about mentoring post-docs in your next proposal? For many academics, typical answers to these questions include, "I try to make adjustments based on the trainee, but I don't have a specific plan” "Yeah, I'd better start thinking about that” and "There's training?” Scientists often are not trained for their crucial role of mentoring the next generation. The University of Wisconsin-Madison has developed, field tested, and publicly released research mentor training materials for several STEM (science, technology, engineering and mathematics) disciplines, including astronomy, to help fill this gap and improve the educational experience and ultimate success of research trainees at several career stages, from high school students to post-doctoral scholars. While initially aimed at the mentoring of undergraduate researchers at research extensive institutions, the topics are broad enough (e.g., expectations, communication, understanding, diversity, ethics, independence) to be applicable to mentoring in a wide range of project-based educational activities. Indeed, these materials have been modified, only modestly, to prepare graduate students and undergraduates to mentor high school students. In this session, we will describe the UW-Madison research mentor training seminar and illustrate how the training can be adapted and implemented. We will introduce an interactive "shopping cart” style website which allows users to obtain the materials and instructions on how to run the program at their institution. Most of the session will be devoted to an interactive implementation of elements of research mentor training using small discussion groups. Participants will experience the training seminar in practice, come face-to-face with some common mentoring

  20. Primary Care Providers' Recommendations for Hypertension Prevention, DocStyles Survey, 2012.

    Science.gov (United States)

    Fang, Jing; Ayala, Carma; Loustalot, Fleetwood

    2015-07-01

    Healthy behaviors, including maintaining an ideal body weight, eating a healthy diet, being physically active, limiting alcohol intake, and not smoking, can help prevent hypertension. The objective of this study was to determine the prevalence of recommending these behaviors to patients by primary care providers (PCPs) and to assess what PCP characteristics, if any, were associated with making the recommendations. DocStyles 2012, a Web-based panel survey, was used to assess PCPs' demographic characteristics, health-related behaviors, practice setting, and prevalence of making selected recommendations to prevent hypertension. Logistic regression was used to calculate the odds of making all 6 recommendations, by demographic, professional, or personal health behavior characteristics. Overall, 1253 PCPs responded to the survey (537 family physicians, 464 internists, and 252 nurse practitioners). To prevent hypertension, 89.4% recommended a healthy diet, 89.9% recommended lower salt intake, 90.3% recommended maintaining a healthy weight, 69.4% recommended limiting alcohol intake, 95.1% recommended being physically active, and 90.4% recommended smoking cessation for their patients who smoked. More than half (56.1%) of PCPs recommended all 6 healthy behaviors. PCPs' demographic characteristics and practice setting were not associated with recommending all 6. PCPs who reported participating in regular physical activity (odds ratio [OR] 1.68, 95% confidence interval [CI] 1.05-2.67) and eating healthy diet (OR 1.68, 95% CI 1.11-2.56) were more likely to offer all 6 healthy behavior recommendations than those without these behaviors. Most PCPs recommended healthy behaviors to their adult patients to prevent hypertension. PCPs' own healthy behaviors were associated with their recommendations. Preventing hypertension is a multifactorial effort, and in the clinical environment, PCPs have frequent opportunities to model and promote healthy lifestyles to their patients. © The

  1. Cycling of DOC and DON by Novel Heterotrophic and Photoheterotrophic Bacteria in the Ocean: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kirchman, David L. [Univ. of Delaware, Newark, DE (United States)

    2008-12-09

    The flux of dissolved organic matter (DOM) through aquatic bacterial communities is a major process in carbon cycling in the oceans and other aquatic systems. Our work addressed the general hypothesis that the phylogenetic make-up of bacterial communities and the abundances of key types of bacteria are important factors influencing the processing of DOM in aquatic ecosystems. Since most bacteria are not easily cultivated, the phylogenetic diversity of these microbes has to be assessed using culture-independent approaches. Even if the relevant bacteria were cultivated, their activity in the lab would likely differ from that under environmental conditions. This project found variation in DOM uptake by the major bacterial groups found in coastal waters. In brief, the data suggest substantial differences among groups in the use of high and molecular weight DOM components. It also made key discoveries about the role of light in affecting this uptake especially by cyanobacteria. In the North Atlantic Ocean, for example, over half of the light-stimulated uptake was by the coccoid cyanobacterium, Prochlorococcus, with the remaining uptake due to Synechococcus and other photoheterotrophic bacteria. The project also examined in detail the degradation of one organic matter component, chitin, which is often said to be the second most abundant compound in the biosphere. The findings of this project contribute to our understanding of DOM fluxes and microbial dynamics supported by those fluxes. It is possible that these findings will lead to improvements in models of the carbon cycle that have compartments for dissolved organic carbon (DOC), the largest pool of organic carbon in the oceans.

  2. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  3. Error Correcting Codes

    Indian Academy of Sciences (India)

    be fixed to define codes over such domains). New decoding schemes that take advantage of such connections can be devised. These may soon show up in a technique called code division multiple access (CDMA) which is proposed as a basis for digital cellular communication. CDMA provides a facility for many users to ...

  4. Codes of Conduct

    Science.gov (United States)

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling code of conducts to students based on work quality, respect, safety and courtesy. She suggests that communicating the code…

  5. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March 1997 pp 33-47. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047 ...

  6. Code Generation = A* + BURS

    NARCIS (Netherlands)

    Nymeyer, Albert; Katoen, Joost P.; Westra, Ymte; Alblas, H.; Gyimóthy, Tibor

    1996-01-01

    A system called BURS that is based on term rewrite systems and a search algorithm A* are combined to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm uses a cost heuristic that is derived
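
    The A* component named in this record is the standard best-first search guided by an admissible cost heuristic. The sketch below is a generic A* over an explicit weighted graph, offered only as a reminder of that search procedure; it is not the BURS-coupled code generator, and the example graph and zero heuristic are arbitrary.

        # Generic A* search over an explicit weighted graph; h must be admissible.
        # Illustrates only the search component, not the BURS term-rewriting machinery.
        import heapq

        def a_star(graph, start, goal, h):
            frontier = [(h(start), 0, start, [start])]      # (f, g, node, path)
            best_g = {start: 0}
            while frontier:
                f, g, node, path = heapq.heappop(frontier)
                if node == goal:
                    return g, path
                for nxt, cost in graph.get(node, []):
                    ng = g + cost
                    if ng < best_g.get(nxt, float("inf")):
                        best_g[nxt] = ng
                        heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
            return None

        graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 1), ("d", 5)], "c": [("d", 1)]}
        print(a_star(graph, "a", "d", lambda n: 0))          # h = 0 degenerates to Dijkstra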

  7. Dress Codes for Teachers?

    Science.gov (United States)

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  8. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low weight codewords, that gives a further advantage in the error floor region.

  9. Nuremberg code turns 60

    OpenAIRE

    Thieren, Michel; Mauron, Alexandre

    2007-01-01

    This month marks sixty years since the Nuremberg code – the basic text of modern medical ethics – was issued. The principles in this code were articulated in the context of the Nuremberg trials in 1947. We would like to use this anniversary to examine its ability to address the ethical challenges of our time.

  10. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 1. Error Correcting Codes The Hamming Codes. Priti Shankar. Series Article Volume 2 Issue 1 January ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  11. Semantic association ranking schemes for information retrieval ...

    Indian Academy of Sciences (India)

    ... relevance, multimedia, information, video, image, answer, text}. Doc 9: {google, search, engine, personalization, information, text, multimedia}. Figure 8: Term association graph on real data with 50 nodes. Table 6: User search interest value table (columns: Session ID, Software, Algorithms, Healthcare, Sports, Movies, Music; rows S1, ...).

  12. Fluorescence and DOC contents of pore waters from coastal and deep-sea sediments in the Gulf of Biscay

    Energy Technology Data Exchange (ETDEWEB)

    Sierra, M.M.D. [Universidade Federal de Santa Catarina, Florianopolis-SC (Brazil). Dept. de Quimica; Donard, O.F.X. [Pau Univ. (France). Lab. de Chimie Bio-Inorganique et Environnement; Etcheber, H. [Universite de Bordeaux I, Talence (France). Dept. de Geologie et Oceanographie; Soriano-Sierra, E.J. [Universidade Federal de Santa Catarina, Florianopolis-SC (Brazil). Nucleo de Estudos do Mar; Ewald, M. [Universite de Bordeaux I, Talence (France). Lab. de Physico Toxico-Chimie des Systemes Naturels

    2001-07-01

    Fluorescence of waters from the Gulf of Biscay was investigated. Pore waters fluoresced more intensely and exhibited red-shifted spectra relative to overlying seawaters. Also, a blue-shift was observed going from coastal to open sea sites. Results indicate that continental inputs of fluorescent material reach the sea bed at all sites studied. Organic matter (OM) modifications within sediments were also observed. In the uppermost layer (6 cm), fluorescence intensity and dissolved organic carbon (DOC) concentrations decrease, followed by a red-shift in emission spectra with increasing depth. This may reflect the increase in OM molar mass due to humification. The reverse of these trends in the deepest sub-oxic sediments might be related to the degradation of OM released from the solid phase, resulting in dissolved fluorescent material with a relative paucity of oxygen-containing functional groups. A very good correlation of DOC with fluorescence was observed in all cores. (author)

  13. The response of dissolved organic carbon (DOC) and the ecosystem carbon balance to experimental drought in a temperate shrubland

    DEFF Research Database (Denmark)

    Sowerby, A.; Emmett, B.A.; Williams, D.

    2010-01-01

    emissions of C have been predicted to result in terrestrial ecosystems becoming a net source of C by 2050. Indeed, both forms of C loss have been linked to climate-related changes, such as warming and/or changes in precipitation. In our field-based drought manipulation experiment on an upland moorland...... drainage of water from the drought-treated soils resulted in an overall decrease of 9% in total DOC export. Calculating the carbon (C) balance for the below-ground component of the ecosystem reveals that DOC represents 3% of gross C export. Previous studies at the site have demonstrated large increases....... The repeated drought treatment has thus resulted in the ecosystem switching from a net sink for C into a net source....

  14. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study... (The remainder of the record lists related contents: 4.9 Codes from order domains; 4.10 One-point geometric Goppa codes; 4.11 Bibliographical Notes; References.)

  15. Quantum Synchronizable Codes From Quadratic Residue Codes and Their Supercodes

    OpenAIRE

    Xie, Yixuan; Yuan, Jinhong; Fujiwara, Yuichiro

    2014-01-01

    Quantum synchronizable codes are quantum error-correcting codes designed to correct the effects of both quantum noise and block synchronization errors. While it is known that quantum synchronizable codes can be constructed from cyclic codes that satisfy special properties, only a few classes of cyclic codes have been proved to give promising quantum synchronizable codes. In this paper, using quadratic residue codes and their supercodes, we give a simple construction for quantum synchronizable...

  16. Effects of ocean acidification and hydrodynamic conditions on carbon metabolism and dissolved organic carbon (DOC) fluxes in seagrass populations.

    Directory of Open Access Journals (Sweden)

    Luis G Egea

    Full Text Available Global change has been acknowledged as one of the main threats to the biosphere and its provision of ecosystem services, especially in marine ecosystems. Seagrasses play a critical ecological role in coastal ecosystems, but their responses to ocean acidification (OA) and climate change are not well understood. There have been previous studies focused on the effects of OA, but the outcome of interactions with co-factors predicted to alter during climate change still needs to be addressed. For example, the impact of higher CO2 and different hydrodynamic regimes on seagrass performance remains unknown. We studied the effects of OA under different current velocities on productivity of the seagrass Zostera noltei, using changes in dissolved oxygen as a proxy for the seagrass carbon metabolism, and release of dissolved organic carbon (DOC) in a four-week experiment using an open-water outdoor mesocosm. Under current pH conditions, increasing current velocity had a positive effect on productivity, but this depended on shoot density. However, this positive effect of current velocity disappeared under OA conditions. OA conditions led to a significant increase in gross production rate and respiration, suggesting that Z. noltei is carbon-limited under the current inorganic carbon concentration of seawater. In addition, an increase in non-structural carbohydrates was found, which may lead to better growing conditions and higher resilience in seagrasses subjected to environmental stress. Regarding DOC flux, a direct and positive relationship was found between current velocity and DOC release, both under current pH and OA conditions. We conclude that OA and high current velocity may lead to favourable growth scenarios for Z. noltei populations, increasing their productivity, non-structural carbohydrate concentrations and DOC release. Our results add new dimensions to predictions on how seagrass ecosystems will respond to climate change, with important

  17. Effects of ocean acidification and hydrodynamic conditions on carbon metabolism and dissolved organic carbon (DOC) fluxes in seagrass populations.

    Science.gov (United States)

    Egea, Luis G; Jiménez-Ramos, Rocío; Hernández, Ignacio; Bouma, Tjeerd J; Brun, Fernando G

    2018-01-01

    Global change has been acknowledged as one of the main threats to the biosphere and its provision of ecosystem services, especially in marine ecosystems. Seagrasses play a critical ecological role in coastal ecosystems, but their responses to ocean acidification (OA) and climate change are not well understood. There have been previous studies focused on the effects of OA, but the outcome of interactions with co-factors predicted to alter during climate change still needs to be addressed. For example, the impact of higher CO2 and different hydrodynamic regimes on seagrass performance remains unknown. We studied the effects of OA under different current velocities on productivity of the seagrass Zostera noltei, using changes in dissolved oxygen as a proxy for the seagrass carbon metabolism, and release of dissolved organic carbon (DOC) in a four-week experiment using an open-water outdoor mesocosm. Under current pH conditions, increasing current velocity had a positive effect on productivity, but this depended on shoot density. However, this positive effect of current velocity disappeared under OA conditions. OA conditions led to a significant increase in gross production rate and respiration, suggesting that Z. noltei is carbon-limited under the current inorganic carbon concentration of seawater. In addition, an increase in non-structural carbohydrates was found, which may lead to better growing conditions and higher resilience in seagrasses subjected to environmental stress. Regarding DOC flux, a direct and positive relationship was found between current velocity and DOC release, both under current pH and OA conditions. We conclude that OA and high current velocity may lead to favourable growth scenarios for Z. noltei populations, increasing their productivity, non-structural carbohydrate concentrations and DOC release. Our results add new dimensions to predictions on how seagrass ecosystems will respond to climate change, with important implications for the

  18. Influence of fresh water, nutrients and DOC in two submarine-groundwater-fed estuaries on the west of Ireland.

    Science.gov (United States)

    Smith, Aisling M; Cave, Rachel R

    2012-11-01

    Coastal fresh water sources, which discharge to the sea are expected to be directly influenced by climate change (e.g. increased frequency of extreme weather events). Sea-level rise and changes in rainfall patterns, changes in demand for drinking water and contamination caused by population and land use change, will also have an impact. Coastal waters with submarine groundwater discharge are of particular interest as this fresh water source is very poorly quantified. Two adjacent bays which host shellfish aquaculture sites along the coast of Co. Galway in the west of Ireland have been studied to establish the influence of fresh water inputs on nutrients and dissolved organic carbon (DOC) in each bay. Neither bay has riverine input and both are underlain by the karst limestone of the Burren and are susceptible to submarine groundwater discharge. Water and suspended matter samples were collected half hourly over 13 h tidal cycles over several seasons. Water samples were analysed for nutrients and DOC, while suspended matter was analysed for organic/inorganic content. Temperature and salinity measurements were recorded during each tidal station by SBE 37 MicroCAT conductivity/temperature sensors. Long-term mooring data were used to track freshwater input for Kinvara and Aughinish Bays and compare it with rainfall data. Results show that Kinvara Bay is much more heavily influenced by fresh water input than Aughinish Bay, and this is a strong source of fixed nitrogen to Kinvara Bay. Only during flood events is there a significant input of inorganic nitrogen from fresh water to Aughinish Bay, such as in late November 2009. Fresh water input does not appear to be a significant source of dissolved inorganic phosphate (DIP) to either bay, but is a source of DOC to both bays. C:N ratios of DOC/DON show a clear distinction between marine and terrestrially derived dissolved organic material. Copyright © 2012. Published by Elsevier B.V.

  19. Spatial patterns of DOC concentration and DOM optical properties in a Brazilian tropical river-wetland system

    Science.gov (United States)

    Dalmagro, Higo J.; Johnson, Mark S.; de Musis, Carlo R.; Lathuillière, Michael J.; Graesser, Jordan; Pinto-Júnior, Osvaldo B.; Couto, Eduardo G.

    2017-08-01

    The Cerrado (savanna) and Pantanal (wetland) biomes of Central Western Brazil have experienced significant development activity in recent decades, including extensive land cover conversion from natural ecosystems to agriculture and urban expansion. The Cuiabá River transects the Cerrado biome prior to inundating large areas of the Pantanal, creating one of the largest biodiversity hot spots in the world. We measured dissolved organic carbon (DOC) and the optical absorbance and fluorescence properties of dissolved organic matter (DOM) from 40 sampling locations spanning Cerrado and Pantanal biomes during wet and dry seasons. In the upper, more agricultural region of the basin, DOC concentrations were highest in the rainy season with more aromatic and humified DOM. In contrast, DOC concentrations and DOM optical properties were more uniform for the more urbanized middle region of the basin between wet and dry seasons, as well as across sample locations. In the lower region of the basin, wet season connectivity between the river and the Pantanal floodplain led to high DOC concentrations, a fourfold increase in humification index (HIX) (an indicator of DOM humification), and a 50% reduction in the spectral slope (SR). Basin-wide, wet season values for SR, HIX, and FI (fluorescence index) indicated an increasing representation of terrestrially derived DOM that was more humified. Parallel factor analysis identified two terrestrially derived components (C1 and C2) representing 77% of total fluorescing DOM (fDOM). A third, protein-like fDOM component increased markedly during the wet season within the more urban-impacted region.

  20. Raptor Codes for Use in Opportunistic Error Correction

    NARCIS (Netherlands)

    Zijnge, T.; Goseling, Jasper; Weber, Jos H.; Schiphorst, Roelof; Shao, X.; Slump, Cornelis H.

    2010-01-01

    In this paper a Raptor code is developed and applied in an opportunistic error correction (OEC) layer for Coded OFDM systems. Opportunistic error correction [3] tries to recover information when it is available with the least effort. This is achieved by using Fountain codes in a COFDM system, which
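
    Raptor codes combine an outer precode with an inner LT (fountain) code. As a hedged illustration of the fountain-coding idea mentioned in the abstract (not the authors' OEC design), the sketch below XORs random subsets of source packets at the encoder and recovers them with a peeling decoder; the uniform degree distribution is a toy choice rather than the robust soliton distribution used in practice.

        import random

        def lt_encode(source, num_encoded, rng):
            """Each encoded symbol is the XOR of a random subset of source packets."""
            k = len(source)
            encoded = []
            for _ in range(num_encoded):
                degree = rng.choice([1, 2, 3, 4])            # toy degree distribution
                neighbors = set(rng.sample(range(k), degree))
                value = 0
                for i in neighbors:
                    value ^= source[i]
                encoded.append((neighbors, value))
            return encoded

        def lt_decode(encoded, k):
            """Peeling decoder: resolve degree-1 symbols, cancel them elsewhere, repeat."""
            encoded = [(set(n), v) for n, v in encoded]
            recovered = {}
            progress = True
            while progress and len(recovered) < k:
                progress = False
                for neighbors, value in encoded:
                    if len(neighbors) == 1:
                        i = next(iter(neighbors))
                        if i not in recovered:
                            recovered[i] = value
                            progress = True
                for idx, (neighbors, value) in enumerate(encoded):
                    for i in list(neighbors):
                        if i in recovered:               # cancel already-recovered packets
                            neighbors.discard(i)
                            value ^= recovered[i]
                    encoded[idx] = (neighbors, value)
            return [recovered.get(i) for i in range(k)]

        rng = random.Random(1)
        source = [rng.randrange(256) for _ in range(16)]        # 16 source packets (one byte each)
        received = lt_encode(source, num_encoded=40, rng=rng)   # some reception overhead
        print(lt_decode(received, k=16) == source)              # True with high probability

    With enough received symbols the peeling decoder recovers all packets with high probability; real Raptor codes add a precode precisely to clean up the small fraction the LT stage may leave unresolved.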

  1. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    In this paper we investigate information-theoretic image coding techniques that assign longer codes to improbable, imprecise and non-distinct intensities in the image. The variable length coding techniques when applied to cropped facial images of subjects with different facial expressions, highlight the set of low probability ...
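
    The abstract describes variable-length codes that spend longer codewords on improbable intensities. A generic way to obtain such a code from an intensity histogram is Huffman coding; the sketch below (a standard construction, not the paper's maximum-entropy partitioning scheme) builds one for a toy 8-level image.

        import heapq
        from collections import Counter

        def huffman_code(frequencies):
            """Build a Huffman code: rarer symbols receive longer codewords."""
            # Heap items: (frequency, tie_breaker, {symbol: code_so_far})
            heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(frequencies.items())]
            heapq.heapify(heap)
            counter = len(heap)
            while len(heap) > 1:
                f1, _, c1 = heapq.heappop(heap)
                f2, _, c2 = heapq.heappop(heap)
                merged = {s: "0" + code for s, code in c1.items()}
                merged.update({s: "1" + code for s, code in c2.items()})
                heapq.heappush(heap, (f1 + f2, counter, merged))
                counter += 1
            return heap[0][2]

        # Toy 8-level "image": frequent mid-grey intensities, rare extremes.
        pixels = [4] * 500 + [3] * 300 + [5] * 120 + [2] * 50 + [7] * 20 + [0] * 10
        code = huffman_code(Counter(pixels))
        for intensity, codeword in sorted(code.items(), key=lambda kv: len(kv[1])):
            print(f"intensity {intensity}: {codeword}")

    The printout shows exactly the behaviour the abstract describes: the rare intensities 0 and 7 end up with the longest codewords.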

  2. A simple numerical coding system for clinical electrocardiography

    NARCIS (Netherlands)

    Robles de Medina, E.O.; Meijler, F.L.

    1974-01-01

    A simple numerical coding system for clinical electrocardiography has been developed. This system enables the storage in coded form of the ECG analysis. The code stored on a digital magnetic tape can be used for a computer print-out of the analysis, while the information can be retrieved at any time

  3. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    Helping programmers write parallel software is an urgent problem given the popularity of multi-core architectures. Engineering compilers which automatically parallelize and vectorize code has turned out to be very challenging. Consequently, compilers are very selective with respect to the coding...... patterns they can optimize. We present an interactive approach and a tool set which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations...... to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically...

  4. Constructed wetlands may lower inorganic nutrient inputs but enhance DOC loadings into a drinking water reservoir in North Wales.

    Science.gov (United States)

    Scholz, C; Jones, T G; West, M; Ehbair, A M S; Dunn, C; Freeman, C

    2016-09-01

    The objective of this study was to monitor a newly constructed wetland (CW) in north Wales, UK, to assess whether it contributes to an improvement in water quality (nutrient removal) of a nearby drinking water reservoir. Inflow and outflow of the Free Water Surface (FWS) CW were monitored on a weekly basis and over a period of 6 months. Physicochemical parameters including pH, conductivity and dissolved oxygen (DO) were measured, as well as nutrients and dissolved organic and inorganic carbon (DOC, DIC) concentration. The CW was seen to contribute to water quality improvement; results show that nutrient removal took place within weeks after construction. It was found that 72 % of initial nitrate (NO3(-)), 53 % of initial phosphate (PO4(3-)) and 35 % of initial biological oxygen demand (BOD) were removed, calculated as a total over the whole sampling period. From our study, it can be concluded that while inorganic nutrients do decline in CWs, the DOC output increases. This may suggest that CWs represent a source for DOC. To assess the carbon input and output, a C budget was calculated.

  5. DOC and CO2-C Releases from Pristine and Drained Peat Soils in Response to Water Table Fluctuations: A Mesocosm Experiment

    Directory of Open Access Journals (Sweden)

    Merjo P. P. Laine

    2014-01-01

    Full Text Available Hydrological conditions are considered to be among the main drivers influencing the export of dissolved organic carbon (DOC) from terrestrial to aquatic ecosystems, and hydrology is likely to alter due to climate change. We built a mesocosm experiment by using peat profiles from a pristine and from a drained (drained in 1978) peatland. A several-week-long low water table period followed by a high water table period, that is, a setting mimicking drought followed by flood, released relatively more DOC from pristine peat than from drained peat. From pristine peat profiles DOC was released into soil water in such quantities that the concentration of DOC remained stable despite dilution caused by added spring water to the mesocosms. In drained peat the DOC concentrations decreased during the high water table period, indicating a stronger dilution effect in comparison to pristine peat. At the landscape level DOC load from a drained peatland to the recipient water body may, however, increase during flooding because of high water runoff out of the peatland containing high DOC concentrations relative to the forest and agricultural areas. During the high water table period neither peat type nor water table had any clear impact on carbon dioxide (CO2-C) fluxes.

  6. Prospective coding in event representation.

    Science.gov (United States)

    Schütz-Bosbach, Simone; Prinz, Wolfgang

    2007-06-01

    A perceived event such as a visual stimulus in the external world and a to-be-produced event such as an intentional action are subserved by event representations. Event representations do not only contain information about present states but also about past and future states. Here we focus on the role of representing future states in event perception and generation (i.e., prospective coding). Relevant theoretical issues and paradigms are discussed. We suggest that the predictive power of the motor system may be exploited for prospective coding not only in producing but also in perceiving events. Predicting is more advantageous than simply reacting. Perceptual prediction allows us to select appropriate responses ahead of the realization of an (anticipated) event and therefore, it is indispensable to flexibly and timely adapt to new situations and thus, successfully interact with our physical and social environment.

  7. Pyramid image codes

    Science.gov (United States)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  8. Quantum coding theorems

    Science.gov (United States)

    Holevo, A. S.

    1998-12-01

    Contents: I. Introduction. II. General considerations: § 1. Quantum communication channel; § 2. Entropy bound and channel capacity; § 3. Formulation of the quantum coding theorem. Weak conversion. III. Proof of the direct statement of the coding theorem: § 1. Channels with pure signal states; § 2. Reliability function; § 3. Quantum binary channel; § 4. Case of arbitrary states with bounded entropy. IV. c-q channels with input constraints: § 1. Coding theorem; § 2. Gauss channel with one degree of freedom; § 3. Classical signal on quantum background noise. Bibliography.

  9. Controlled Dense Coding with the W State

    Science.gov (United States)

    Yang, Xue; Bai, Ming-qiang; Mo, Zhi-wen

    2017-11-01

    The average amount of information is an important factor in implementing dense coding. Based on this, we propose in this paper two schemes for controlled dense coding that use the three-qubit entangled W state as the quantum channel. In these schemes, the controller (Charlie) can adjust the local measurement angle 𝜃 to modulate the entanglement, and consequently the average amount of information transmitted from the sender (Alice) to the receiver (Bob). Although the two schemes yield the same average amount of information, the second scheme has an advantage over the first.

  10. Diagnosis code assignment: models and evaluation metrics.

    Science.gov (United States)

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie

    2014-01-01

    The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of each other (flat classifier), and one that leverages the hierarchical nature of ICD9 codes into its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. Experimental setup, code for modeling, and evaluation scripts are made available to the research community. The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20,533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art.
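
    As a hedged sketch of the flat baseline described above (each ICD9 code predicted independently of the others), the following uses scikit-learn on a tiny invented corpus; the discharge summaries and code sets are illustrative only, and the hierarchy-based model and tree-distance metrics from the paper are not reproduced.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.preprocessing import MultiLabelBinarizer

        # Invented discharge summaries with invented ICD9 label sets (illustration only).
        summaries = [
            "patient admitted with chest pain and shortness of breath",
            "long history of type 2 diabetes with poor glycemic control",
            "acute exacerbation of chronic obstructive pulmonary disease",
            "chest pain ruled out, diabetes medications adjusted",
        ]
        labels = [["786.50"], ["250.00"], ["491.21"], ["786.50", "250.00"]]

        mlb = MultiLabelBinarizer()
        Y = mlb.fit_transform(labels)            # one binary column per ICD9 code
        vectorizer = TfidfVectorizer()
        X = vectorizer.fit_transform(summaries)

        # "Flat" classification: one independent binary classifier per code.
        flat = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)

        test = vectorizer.transform(["readmitted for chest pain, diabetes stable"])
        for code, prob in zip(mlb.classes_, flat.predict_proba(test)[0]):
            print(code, round(float(prob), 2))

    The hierarchy-based alternative in the paper differs precisely in that predictions and evaluation respect the position of each code in the ICD9 tree rather than treating the columns above as unrelated.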

  11. QR CODES IN EDUCATION AND COMMUNICATION

    Directory of Open Access Journals (Sweden)

    Gurhan DURAK

    2016-04-01

    Full Text Available Technological advances brought applications of innovations to education. Conventional education increasingly flourishes with new technologies accompanied by more learner active environments. In this continuum, there are learners preferring self-learning. Traditional learning materials yield attractive, motivating and technologically enhanced learning materials. The QR (Quick Response) Codes are one of these innovations. The aim of this study is to redesign a lesson unit supported with QR Codes and to get the learners' views about the redesigned material. For this purpose, the redesigned lesson unit was delivered to 15 learners in Balıkesir University in the academic year of 2013-2014. The learners were asked to study the material. The learners who had smart phones and Internet access were chosen for the study. To provide sectional diversity, three groups were created. The group learners were from Faculty of Education, Faculty of Science and Literature and Faculty of Engineering. In the semi-structured interviews that followed, the learners were asked about their prior knowledge of QR Codes, QR Codes' contribution to learning, difficulties with using QR Codes, and design issues. Descriptive data analysis was used in the study. The findings were interpreted on the basis of Theory of Diffusion of Innovations and Theory of Uses and Gratifications. After the research, the themes found were awareness of QR Code, types of QR Codes and applications, contributions to learning, and proliferation of QR Codes. Generally, the learners participating in the study reported that they were aware of QR Codes; that they could use the QR Codes; and that using QR Codes in education was useful. They also expressed that such features as visual elements, attractiveness and direct routing had positive impact on learning. In addition, they generally mentioned that they did not have any difficulty using QR Codes; that they liked the design; and that the content should

  12. Research and Design in Unified Coding Architecture for Smart Grids

    Directory of Open Access Journals (Sweden)

    Gang Han

    2013-09-01

    Full Text Available A standardized, shared information platform is the foundation of the Smart Grid. In order to improve the integration of dispatching-center information in power grids and to achieve efficient data exchange, sharing and interoperability, a unified coding architecture is proposed. The architecture includes a coding management layer, a coding generation layer, an information models layer and an application system layer. The hierarchical design lets the coding architecture adapt to different application environments, different interfaces and loosely coupled requirements, and supports integrated model management for the power grid. A life cycle and survival evaluation method for the unified coding architecture is also proposed, which helps ensure the stability and availability of the coding architecture. Finally, future directions for coding technology in Smart Grids are outlined.

  13. Linear distance coding for image classification.

    Science.gov (United States)

    Wang, Zilei; Feng, Jiashi; Yan, Shuicheng; Xi, Hongsheng

    2013-02-01

    The feature coding-pooling framework is shown to perform well in image classification tasks, because it can generate discriminative and robust image representations. The unavoidable information loss incurred by feature quantization in the coding process and the undesired dependence of pooling on the image spatial layout, however, may severely limit the classification. In this paper, we propose a linear distance coding (LDC) method to capture the discriminative information lost in traditional coding methods while simultaneously alleviating the dependence of pooling on the image spatial layout. The core of the LDC lies in transforming local features of an image into more discriminative distance vectors, where the robust image-to-class distance is employed. These distance vectors are further encoded into sparse codes to capture the salient features of the image. The LDC is theoretically and experimentally shown to be complementary to the traditional coding methods, and thus their combination can achieve higher classification accuracy. We demonstrate the effectiveness of LDC on six data sets, two of each of three types (specific object, scene, and general object), i.e., Flower 102 and PFID 61, Scene 15 and Indoor 67, Caltech 101 and Caltech 256. The results show that our method generally outperforms the traditional coding methods, and achieves or is comparable to the state-of-the-art performance on these data sets.
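
    The key move in the abstract is replacing a raw local feature by a vector of image-to-class distances. The sketch below illustrates only that step, on random data (invented class pools and descriptors); the subsequent sparse encoding and pooling stages of the LDC pipeline are omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        # Pretend each class is represented by a pool of local descriptors gathered from
        # its training images (random 64-D "SIFT-like" vectors, shifted per class).
        class_pools = {c: rng.normal(size=(200, 64)) + c for c in range(3)}

        def distance_vector(descriptor, class_pools):
            """Map one local descriptor to its vector of nearest-neighbour distances per class."""
            return np.array([
                np.min(np.linalg.norm(pool - descriptor, axis=1))
                for _, pool in sorted(class_pools.items())
            ])

        # Local descriptors of one query image (statistically resembling class 1).
        image_descriptors = rng.normal(size=(50, 64)) + 1

        distance_codes = np.stack([distance_vector(d, class_pools) for d in image_descriptors])
        image_to_class = distance_codes.sum(axis=0)     # aggregate over the whole image
        print("image-to-class distances:", np.round(image_to_class, 1))
        print("nearest class:", int(np.argmin(image_to_class)))

    Because each distance vector already measures how well a patch matches every class, it is more discriminative than the raw descriptor, which is what the LDC method then exploits by sparse-coding these vectors instead of the original features.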

  14. Report on a workshop concerning code validation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    The design of wind turbine components is becoming more critical as turbines become lighter and more dynamically active. Computer codes that will reliably predict turbine dynamic response are, therefore, more necessary than before. However, predicting the dynamic response of very slender rotating structures that operate in turbulent winds is not a simple matter. Even so, codes for this purpose have been developed and tested in North America and in Europe, and it is important to disseminate information on this subject. The purpose of this workshop was to allow those involved in the wind energy industry in the US to assess the progress in validation of the codes most commonly used for structural/aero-elastic wind turbine simulation. The theme of the workshop was, "How do we know it's right?" This was the question that participants were encouraged to ask themselves throughout the meeting in order to avoid the temptation of presenting information in a less-than-critical atmosphere. Other questions posed at the meeting were: What is the proof that the codes used can truthfully represent the field data? At what steps were the codes tested against known solutions, or against reliable field data? How should the designer or user validate results? What computer resources are needed? How do codes being used in Europe compare with those used in the US? How does the code used affect industry certification? What can be expected in the future?

  15. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the DVC principles to camera networks. Thanks to its reversed coding paradigm M-DVC enables the exploitation of inter-camera redundancy without inter-camera communication, because the frames are encoded independently. One of the key elements in DVC is the Side Information (SI) which is an estimation...

  16. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    Full Text Available Object, library and executable code is stored in binary files. The functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how their content can be manipulated by a possible intruder, and what ways there are to identify malicious code in such files. Because object files are inputs to the linking process, early detection of malicious content is crucial to avoid infection of the binary executable files.

  17. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.

  18. On Delayed Sequential Coding of Correlated Sources

    OpenAIRE

    Ma, Nan; Wang, Ye; Ishwar, Prakash

    2007-01-01

    Motivated by video coding applications, the problem of sequential coding of correlated sources with encoding and/or decoding frame-delays is studied. The fundamental tradeoffs between individual frame rates, individual frame distortions, and encoding/decoding frame-delays are derived in terms of a single-letter information-theoretic characterization of the rate-distortion region for general inter-frame source correlations and certain types of potentially frame specific and coupled single-lett...

  19. Influence of adsorption versus coprecipitation on the retention of rice straw-derived dissolved organic carbon and subsequent reducibility of Fe-DOC systems

    Science.gov (United States)

    Sodano, Marcella; Lerda, Cristina; Martin, Maria; Celi, Luisella; Said-Pullicino, Daniel

    2016-04-01

    The dissimilatory reduction of Fe oxides is the main organic C-consuming process in paddy soils under anoxic conditions. The contribution of Fe(III) reduction to anaerobic C mineralization depends on many factors, but most importantly on the bioavailability of labile organic matter and a reducible Fe pool as electron donors and acceptors, respectively. On the other hand, the strong affinity of these minerals for organic matter and their capability of protecting it against microbial decomposition is well known. Natural Fe oxides in these soils may therefore play a key role in determining the C source/sink functions of these agro-ecosystems. Apart from contributing to C stabilization, the interaction between Fe oxides and dissolved organic C (DOC) may influence the structure and reactivity of these natural oxides, and selectively influence the chemical properties of DOC. Indeed, Fe-DOC associations may not only reduce the availability of DOC, but may also limit the microbial reduction of Fe oxides under anoxic conditions. In fact, the accessibility of these minerals to microorganisms, extracellular enzymes, redox active shuttling compound or reducing agents may be impeded by the presence of sorbed organic matter. In soils that are regularly subjected to fluctuations in redox conditions the interaction between DOC and Fe oxides may not only involve organic coatings on mineral surfaces, but also Fe-DOC coprecipitates that form during the rapid oxidation of soil solutions containing important amounts of DOC and Fe(II). However, little is known on how these processes influence DOC retention, and the structure and subsequent reducibility of these Fe-DOC associations. We hypothesized that the nature and extent of the interaction between DOC and Fe oxides may influence the accessibility of the bioavailable Fe pool and consequently its reducibility. We tested this hypothesis by synthesizing a series of Fe-DOC systems with increasing C:Fe ratios prepared by either surface

  20. The human α2(XI) collagen gene (COL11A2): Completion of coding information, identification of the promoter sequence, and precise localization within the major histocompatibility complex reveal overlap with the KE5 gene

    Energy Technology Data Exchange (ETDEWEB)

    Lui, V.C.H.; Ng, Ling Jim; Sat, E.W.Y.; Cheah, K.S.E. [Univ. of Hong Kong (Hong Kong)

    1996-03-05

    Type XI collagen, a fibril-forming collagen, is important for the integrity and development of the skeleton because mutations in the genes encoding its constituent α chains have been found in some osteochondrodysplasias. We provide data that complete the information for the coding sequence of human α2(XI) procollagen, with details of the promoter region and intron-exon organization at the 5′ and 3′ ends of the gene (COL11A2), including the transcription start and polyadenylation sites. COL11A2 is 30.5 kb with a minimum of 62 exons, differing from other reported fibrillar collagen genes because the amino propeptide is encoded by 14 not 5 to 8 exons. But exon numbers for the carboxy propeptide and 3′-untranslated region are conserved. The promoter region of COL11A2 lacks a TATA box but is GC-rich with two potential SP1 binding sites. Mouse α2(XI) collagen mRNAs undergo complex alternative splicing involving three amino-terminal propeptide exons but only one of these has been reported for COL11A2. We have located these missing human exons and have identified splice signals that point to additional splice variants. We have precisely mapped COL11A2 within the major histocompatibility complex on chromosome 6. The retinoid X receptor β (RXRβ) gene is located 1.1 kb upstream of COL11A2. KE5, previously thought to be a distinct transcribed gene sequence, was mapped within COL11A2 in the alternatively spliced region, raising the question whether KE5 and COL11A2 are separate genes. 37 refs., 7 figs., 1 tab.

  1. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....
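
    Fulcrum codes build on random linear network coding (RLNC), mixing a high field size at the ends with GF(2) operations inside the network. The sketch below shows only the generic GF(2) RLNC building block, not the Fulcrum mapping itself: coded packets are random XOR combinations of the source packets, and the receiver decodes by Gaussian elimination once it has collected enough linearly independent combinations.

        import numpy as np

        rng = np.random.default_rng(42)

        def rlnc_encode(packets, num_coded):
            """Each coded packet is a random GF(2) combination (XOR) of the source packets."""
            k = len(packets)
            coeffs = rng.integers(0, 2, size=(num_coded, k), dtype=np.uint8)
            coded = (coeffs @ np.asarray(packets, dtype=np.uint8)) % 2
            return coeffs, coded

        def rlnc_decode(coeffs, coded, k):
            """Gaussian elimination over GF(2); None if fewer than k independent packets."""
            A = np.concatenate([coeffs, coded], axis=1).astype(np.uint8)
            row = 0
            for col in range(k):
                pivot = next((r for r in range(row, len(A)) if A[r, col]), None)
                if pivot is None:
                    continue
                A[[row, pivot]] = A[[pivot, row]]
                for r in range(len(A)):
                    if r != row and A[r, col]:
                        A[r] ^= A[row]
                row += 1
            return A[:k, k:] if row == k else None

        packets = rng.integers(0, 2, size=(4, 8), dtype=np.uint8)   # 4 source packets, 8 bits each
        coeffs, coded = rlnc_encode(packets, num_coded=6)           # 6 coded packets arrive
        decoded = rlnc_decode(coeffs, coded, k=4)
        if decoded is None:
            print("coefficient vectors were not full rank; wait for more coded packets")
        else:
            print("recovered:", np.array_equal(decoded, packets))

    The Fulcrum idea adds an outer expansion so that high-end receivers can decode in a larger field while intermediate nodes and low-end receivers only ever perform the cheap GF(2) arithmetic shown here.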

  2. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  3. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    , in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded......The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered...... as waste. However, if the common receiver (e.g., base station) is capable to store the collision slots and use them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access...
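
    A toy simulation makes the mechanism concrete: each user transmits a few replicas of its packet in random slots of a frame, and the receiver repeatedly decodes singleton slots and cancels the corresponding replicas elsewhere (successive interference cancellation). The parameters below are illustrative, not taken from the paper.

        import random

        def coded_random_access(num_users=50, num_slots=100, replicas=2, seed=3):
            """Frame-based random access: each user sends `replicas` copies of its packet
            in distinct random slots; the receiver decodes singleton slots and cancels."""
            rng = random.Random(seed)
            placement = {u: rng.sample(range(num_slots), replicas) for u in range(num_users)}
            slots = [set() for _ in range(num_slots)]
            for user, chosen in placement.items():
                for s in chosen:
                    slots[s].add(user)

            resolved = set()
            progress = True
            while progress:
                progress = False
                for s in range(num_slots):
                    if len(slots[s]) == 1:                   # singleton slot: decodable
                        user = slots[s].pop()
                        resolved.add(user)
                        progress = True
                        for other in placement[user]:        # successive interference cancellation
                            slots[other].discard(user)
            return len(resolved)

        print(f"resolved {coded_random_access()} of 50 users")

    The iterative cancellation above is exactly what maps onto iterative erasure decoding on a graph, which is why code-design tools can be reused to pick the replica distribution.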

  4. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent (and therefore parallel) development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
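
    The levelization rule described above is easy to check mechanically: the "uses" relationships must form a directed acyclic graph, and a package's level is one more than the highest level among the packages it uses. A small sketch, with invented package names (not the actual EAP packages):

        # Hypothetical package -> packages it uses (names are illustrative only).
        uses = {
            "utilities": [],
            "mesh": ["utilities"],
            "physics_tables": ["utilities"],
            "hydro": ["mesh", "physics_tables"],
            "driver": ["hydro", "mesh"],
        }

        def levelize(uses):
            """Return {package: level} if the 'uses' graph is a DAG, otherwise raise."""
            levels = {}
            visiting = set()

            def visit(pkg):
                if pkg in levels:
                    return levels[pkg]
                if pkg in visiting:
                    raise ValueError(f"cycle detected at {pkg!r}: not levelizable")
                visiting.add(pkg)
                level = 1 + max((visit(dep) for dep in uses[pkg]), default=0)
                visiting.discard(pkg)
                levels[pkg] = level
                return level

            for pkg in uses:
                visit(pkg)
            return levels

        for pkg, level in sorted(levelize(uses).items(), key=lambda kv: kv[1]):
            print(level, pkg)

    A cycle anywhere in the graph makes levelization impossible, which is exactly the disentanglement work the plan describes.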

  5. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  6. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity....

  7. Code de conduite

    International Development Research Centre (IDRC) Digital Library (Canada)

    irocca

    ... their point of view, in a spirit of openness and respect. OUR CODE OF CONDUCT. IDRC is committed to conducting itself in accordance with the highest ethical standards in all its activities. The Code of Conduct reflects our mission, our employment philosophy and the results of the discussions ...

  8. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    at the decoder side offering such benefits for these applications. Although there have been some advanced improvement techniques, improving the DVC coding efficiency is still challenging. The thesis addresses this challenge by proposing several iterative algorithms at different working levels, e.g. bitplane......, band, and frame levels. In order to show the information theoretic basis, theoretical foundations of DVC are introduced. The first proposed algorithm applies parallel iterative decoding using multiple LDPC decoders to utilize cross bitplane correlation. To improve Side Information (SI) generation...... and noise modeling and also learn from the previous decoded Wyner-Ziv (WZ) frames, side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate the weaknesses of the block based SI generation and also utilizes clustering of DCT blocks to capture...

  9. Open Coding Descriptions

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon PhD

    2016-12-01

    Full Text Available Open coding is a big source of descriptions that must be managed and controlled when doing GT research. The goal of generating a GT is to generate an emergent set of concepts and their properties that fit and work with relevancy to be integrated into a theory. To achieve this goal, the researcher begins his research with open coding, that is coding all his data in every possible way. The consequence of this open coding is a multitude of descriptions for possible concepts that often do not fit in the emerging theory. Thus in this case the researcher ends up with many irrelevant descriptions for concepts that do not apply. To dwell on descriptions for inapplicable concepts ruins the GT theory as it starts. It is hard to stop. Confusion easily sets in. Switching the study to a QDA is a simple rescue. Rigorous focusing on emerging concepts is vital before being lost in open coding descriptions. It is important, no matter how interesting the description may become. Once a core is possible, selective coding can start which will help control against being lost in multiple descriptions.

  10. Certifying Auto-Generated Flight Code

    Science.gov (United States)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm

  11. Special issue on network coding

    Science.gov (United States)

    Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly

    2017-12-01

    Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as data multi-view streaming of video or underwater sensor networks. One can find papers that show how NC turns data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.

  12. Enhanced Broadcasting and Code Assignment in Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jinfang Zhang

    2008-08-01

    Full Text Available CDMA-based mobile ad hoc networks face two main design challenges. One is to periodically update connectivity information, namely, the neighboring nodes and the codes they use. The other is to guarantee that no code collision occurs within two hops' distance. This paper proposes an enhanced time-spread broadcasting schedule for connectivity information updates. Based on the connectivity information, a code assignment and potential code collision resolution scheme is proposed to solve the hidden/exposed node problem. Simulation results demonstrate the efficiency and effectiveness of the proposed schemes.
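
    The two-hop constraint in the abstract is essentially a distance-2 colouring problem. As a hedged sketch (toy topology and a simple greedy ordering, not the paper's distributed protocol), each node below takes the smallest code index not already used within its two-hop neighbourhood:

        import itertools

        # Undirected connectivity graph: node -> neighbours (toy topology).
        neighbors = {
            0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4},
            3: {1, 4, 5}, 4: {2, 3, 5}, 5: {3, 4},
        }

        def two_hop(node):
            """Nodes within two hops of `node`, excluding the node itself."""
            one = neighbors[node]
            two = set(itertools.chain.from_iterable(neighbors[n] for n in one))
            return (one | two) - {node}

        def assign_codes():
            """Greedy: give each node the smallest code unused in its two-hop neighbourhood."""
            codes = {}
            for node in sorted(neighbors):
                taken = {codes[n] for n in two_hop(node) if n in codes}
                codes[node] = next(c for c in itertools.count() if c not in taken)
            return codes

        codes = assign_codes()
        print(codes)
        # No two nodes within two hops of each other share a code:
        assert all(codes[a] != codes[b] for a in neighbors for b in two_hop(a))

    Keeping codes distinct out to two hops is what removes hidden- and exposed-node code collisions, since any two transmitters heard by a common receiver are at most two hops apart.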

  13. 75 FR 26782 - Agency Information Collection Activities: Form I-864, Form I-864A, Form I-864EZ, and Form I-864W...

    Science.gov (United States)

    2010-05-12

    ...] [FR Doc No: 2010-11297] DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services... respond: I-864, 439,500 responses at 6 hours per response; I-864A, 215,800 responses at 1.75 hours per..., Department of Homeland Security. [FR Doc. 2010-11297 Filed 5-11-10; 8:45 am] BILLING CODE 9111-97-P ...

  14. Effect of inorganic N enrichment on basal pelagic production in boreal unproductive lakes along a gradient of DOC concentration - results after 1 year of fertilization

    Science.gov (United States)

    Deininger, Anne; Bergström, Ann-Kristin

    2013-04-01

    Input of inorganic nitrogen (N) in boreal unproductive lakes is steadily increasing due to anthropogenic deposition and usage of artificial fertilizers. N enrichment is predicted to have a major impact on the ecosystem productivity and food web structure in unproductive clear-water and humic lakes. For a long time, pelagic primary production (PP) has been mainly regarded as being phosphorus (P) limited. However, recent studies have shown that this is not true for unproductive lakes in northern Sweden, where phytoplankton is mainly N limited. Addition of inorganic N should therefore increase phytoplankton growth in these lake ecosystems. Bacterial production (BP) in the pelagic habitat, on the other hand, is usually limited by P. Nevertheless, elevated N could have a stimulating effect on BP through enhanced leakage of dissolved organic carbon (DOC) from phytoplankton following enhanced N availability and higher PP. Further, unproductive lakes vary naturally in their DOC content which affects overall nutrient- (N and P), energy- and carbon availability (light, C) for the basal producers (phytoplankton, bacteria). It is still not clear how higher inorganic N availability affects primary- and bacterial production in the pelagic in lakes with varying DOC content. We subsequently assessed this question by conducting whole-lake fertilization experiments with inorganic N additions in 6 lakes with varying DOC concentrations (2 low DOC; 2 medium DOC; 2 high DOC). For each DOC level one lake functioned as a reference and one was fertilized with N. Year 2011 was a reference year (all lakes) and 2012 was the first year of fertilization (i.e. in 3 lakes). Measurements included basal productivity such as primary production and bacteria production, lake water chemistry and physical parameters (i.e. light, temperature). The results of this study will help to develop a conceptual understanding of how increased inorganic N availability (through land use such as forestry and

  15. Rapid response of hydrological loss of DOC to water table drawdown and warming in Zoige peatland: results from a mesocosm experiment.

    Directory of Open Access Journals (Sweden)

    Xue-Dong Lou

    Full Text Available A large portion of the global carbon pool is stored in peatlands, which are sensitive to changing environmental conditions. The hydrological loss of dissolved organic carbon (DOC) is believed to play a key role in determining the carbon balance in peatlands. Zoige peatland, the largest peat store in China, is experiencing climatic warming and drying as well as severe artificial drainage. Using a fully crossed factorial design, we experimentally manipulated temperature and controlled the water tables in large mesocosms containing intact peat monoliths. Specifically, we determined the impact of warming and water table position on the hydrological loss of DOC, the exported amounts, concentrations and qualities of DOC, and the discharge volume in Zoige peatland. Our results revealed that the water table position had a greater impact on DOC export than the warming treatment, which showed no interactive effects with the water table treatment. Both DOC concentration and discharge volume increased significantly under water table drawdown, while only the DOC concentration was significantly promoted by the warming treatment. Annual DOC export was increased by 69% and 102% when the water table, controlled at 0 cm, was experimentally lowered by -10 cm and -20 cm. Increases in colored and aromatic constituents of DOC (measured by Abs(254 nm), SUVA(254 nm), Abs(400 nm), and SUVA(400 nm)) were observed under the lower water tables and at the higher peat temperature. Our results provide an indication of the potential impacts of climatic change and anthropogenic drainage on the carbon cycle and/or water storage in a peatland and simultaneously imply the likelihood of potential damage to downstream ecosystems. Furthermore, our results highlight the need for local protection and sustainable development, as well as suggest that more research is required to better understand the impacts of climatic change and artificial disturbances on peatland degradation.

  16. A thesaurus for a neural population code.

    Science.gov (United States)

    Ganmor, Elad; Segev, Ronen; Schneidman, Elad

    2015-09-08

    Information is carried in the brain by the joint spiking patterns of large groups of noisy, unreliable neurons. This noise limits the capacity of the neural code and determines how information can be transmitted and read-out. To accurately decode, the brain must overcome this noise and identify which patterns are semantically similar. We use models of network encoding noise to learn a thesaurus for populations of neurons in the vertebrate retina responding to artificial and natural videos, measuring the similarity between population responses to visual stimuli based on the information they carry. This thesaurus reveals that the code is organized in clusters of synonymous activity patterns that are similar in meaning but may differ considerably in their structure. This organization is highly reminiscent of the design of engineered codes. We suggest that the brain may use this structure and show how it allows accurate decoding of novel stimuli from novel spiking patterns.
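
    The paper measures similarity between population responses by the information they carry under a learned noise model. As a much simpler, purely illustrative stand-in, the sketch below clusters synthetic binary "spike words" by Hamming distance with hierarchical clustering; it conveys the idea of grouping synonymous activity patterns, not the authors' method.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(7)

        # Synthetic data: 300 binary "spike words" over 20 neurons, generated as noisy
        # copies of three prototype patterns (each bit flipped with probability 0.1).
        prototypes = rng.integers(0, 2, size=(3, 20))
        words = np.array([
            np.where(rng.random(20) < 0.1, 1 - p, p)
            for p in prototypes[rng.integers(0, 3, size=300)]
        ])

        # Group words whose spike patterns are similar (small Hamming distance).
        distances = pdist(words, metric="hamming")
        clusters = fcluster(linkage(distances, method="average"), t=3, criterion="maxclust")
        print("cluster sizes:", np.bincount(clusters)[1:])

    The point of the paper is that structural similarity of this kind is not the whole story: two patterns can differ in many bits yet carry the same meaning, which is why the thesaurus is built from information-based rather than Hamming distances.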

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
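
    For small parameters the minimum distance of a linear code can be checked by brute force directly from a generator matrix, which is a handy sanity check when reading tables of new codes. The sketch below does this for the classical ternary [4,2,3] tetracode (an illustrative example, not one of the 22 new codes from the paper).

        import itertools
        import numpy as np

        # Generator matrix of the ternary tetracode (entries mod 3, with -1 written as 2).
        G = np.array([[1, 0, 1, 1],
                      [0, 1, 1, 2]])

        def minimum_distance(G, q=3):
            """Minimum Hamming weight over all nonzero codewords (brute force, small codes only)."""
            k, n = G.shape
            best = n
            for message in itertools.product(range(q), repeat=k):
                if not any(message):
                    continue
                codeword = np.mod(np.array(message) @ G, q)
                best = min(best, int(np.count_nonzero(codeword)))
            return best

        k, n = G.shape
        print(f"[{n},{k},{minimum_distance(G)}]_3 code")   # prints [4,2,3]_3

    Exhaustive search scales as q^k, so it only works for toy cases; the value of constructions such as those in the paper is precisely that they certify good minimum distances for lengths where enumeration is hopeless.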

  18. Quantum information processing

    National Research Council Canada - National Science Library

    Leuchs, Gerd; Beth, Thomas

    2003-01-01

    ... Contents excerpt: 1.5 Simulation of Hamiltonians; References; 2. Quantum Information Processing and Error Correction with Jump Codes (G. Alber, M. Mussinger, ...

  19. Regulatory Information By Sector

    Science.gov (United States)

    Find environmental regulatory, compliance, & enforcement information for various business, industry and government sectors, listed by NAICS code. Sectors include agriculture, automotive, petroleum manufacturing, oil & gas extraction & other manufacturing

  20. Code blue: seizures.

    Science.gov (United States)

    Hoerth, Matthew T; Drazkowski, Joseph F; Noe, Katherine H; Sirven, Joseph I

    2011-06-01

    Eyewitnesses frequently perceive seizures as life threatening. If an event occurs on the hospital premises, a "code blue" can be called which consumes considerable resources. The purpose of this study was to determine the frequency and characteristics of code blue calls for seizures and seizure mimickers. A retrospective review of a code blue log from 2001 through 2008 identified 50 seizure-like events, representing 5.3% of all codes. Twenty-eight (54%) occurred in inpatients; the other 22 (44%) events involved visitors or employees on the hospital premises. Eighty-six percent of the events were epileptic seizures. Seizure mimickers, particularly psychogenic nonepileptic seizures, were more common in the nonhospitalized group. Only five (17.9%) inpatients had a known diagnosis of epilepsy, compared with 17 (77.3%) of the nonhospitalized patients. This retrospective survey provides insights into how code blues are called on hospitalized versus nonhospitalized patients for seizure-like events. Copyright © 2011. Published by Elsevier Inc.

  1. Twisted Reed-Solomon Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Puchinger, Sven; Rosenkilde ne Nielsen, Johan

    2017-01-01

    We present a new general construction of MDS codes over a finite field Fq. We describe two explicit subclasses which contain new MDS codes of length at least q/2 for all values of q ≥ 11. Moreover, we show that most of the new codes are not equivalent to a Reed-Solomon code.
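
    For context, here is a sketch of the classical (untwisted) Reed-Solomon construction that these codes generalize, over a small prime field so that plain modular arithmetic suffices; the parameters are toy values. It evaluates all polynomials of degree < k at n distinct field elements and verifies by enumeration that the resulting [n, k] code is MDS, i.e. d = n - k + 1.

      from itertools import product

      q, n, k = 7, 6, 3                      # prime field GF(7), toy parameters
      alphas = list(range(1, n + 1))         # n distinct evaluation points in GF(7)

      def encode(msg):
          """Evaluate the message polynomial msg[0] + msg[1]*x + ... at each alpha (mod q)."""
          return [sum(c * pow(a, i, q) for i, c in enumerate(msg)) % q for a in alphas]

      # Minimum distance by enumerating all nonzero messages (feasible for tiny k).
      d = min(sum(1 for s in encode(m) if s)
              for m in product(range(q), repeat=k) if any(m))
      print(d, "== n - k + 1 ->", d == n - k + 1)   # MDS: Singleton bound met with equality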

  2. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  3. Manufacturer Identification Code (MID) - ACE

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  4. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
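
    As a concrete taste of the linear codes mentioned in the description (this example is the textbook construction, not material taken from the book itself), the binary Hamming [7,4,3] code encodes four information bits with a systematic generator matrix and corrects any single bit error from the syndrome.

      import numpy as np

      # Systematic generator and parity-check matrices of the [7,4] Hamming code.
      G = np.array([[1,0,0,0,1,1,0],
                    [0,1,0,0,1,0,1],
                    [0,0,1,0,0,1,1],
                    [0,0,0,1,1,1,1]])
      H = np.array([[1,1,0,1,1,0,0],
                    [1,0,1,1,0,1,0],
                    [0,1,1,1,0,0,1]])

      msg = np.array([1, 0, 1, 1])
      codeword = msg @ G % 2

      received = codeword.copy()
      received[2] ^= 1                      # flip one bit to simulate a channel error

      syndrome = H @ received % 2           # nonzero syndrome identifies the error position
      error_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
      received[error_pos] ^= 1              # corrected
      print(np.array_equal(received, codeword))   # True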

  5. The First Insight into the Metabolite Profiling of Grapes from Three Vitis vinifera L. Cultivars of Two Controlled Appellation (DOC) Regions

    Directory of Open Access Journals (Sweden)

    António Teixeira

    2014-03-01

    The characterization of the metabolites accumulated in the grapes of specific cultivars grown in different climates is of particular importance for viticulturists and enologists. In the present study, the metabolite profiling of grapes from the cultivars, Alvarinho, Arinto and Padeiro de Basto, of two Portuguese Controlled Denomination of Origin (DOC) regions (Vinho Verde and Lisboa) was investigated by gas chromatography-coupled time-of-flight mass spectrometry (GC-TOF-MS) and an amino acid analyzer. Primary metabolites, including sugars, organic acids and amino acids, and some secondary metabolites were identified. Tartaric and malic acids and free amino acids accumulated more in grapes from vines of the DOC region of Vinho Verde than DOC Lisboa, but a principal component analysis (PCA) plot showed that besides the DOC region, the grape cultivar also accounted for the variance in the relative abundance of metabolites. Grapes from the cultivar, Alvarinho, were particularly rich in malic acid and tartaric acids in both DOC regions, but sucrose accumulated more in the DOC region of Vinho Verde.

  6. A MIXED METHODS ANALYSIS OF THE EFFECT OF GOOGLE DOCS ENVIRONMENT ON EFL LEARNERS’ WRITING PERFORMANCE AND CAUSAL ATTRIBUTIONS FOR SUCCESS AND FAILURE

    Directory of Open Access Journals (Sweden)

    Zari Sadat SEYYEDREZAIE

    2016-07-01

    This study investigated the effect of the writing process in a Google Docs environment on Iranian EFL learners’ writing performance. It also examined students’ perceptions towards the effects of Google Docs and their perceived causes of success or failure in writing performance. In this regard, 48 EFL students were chosen based on their IELTS writing test scores. During the treatment, the students were taught how to write a formal five-paragraph essay in the class, but they were supposed to practice the writing process and give feedback to their peers’ essays through Google Docs. At the end of the treatment phase, the participants received another sample of the IELTS writing test (posttest). Moreover, 20 participants were interviewed for their perceptions regarding the causes for their success and failure and the influence of Google Docs on their writing performance. The analysis of a Paired-Sample t-test revealed that Google Docs played an effective role in improving students’ writing performance. In addition, the analysis of the interviews revealed that the students perceived both internal and external causes for their success and failure; but in the case of failure, internal factors were cited more often than external ones. Also, it was revealed that students generally showed a positive attitude towards the use of Google Docs as a factor leading to success in their writing performance.
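
    For readers unfamiliar with the statistic reported above, the following is a minimal illustration of a paired-sample t-test on pre-/post-treatment writing scores; the scores are invented placeholders, not the study's data.

      from scipy import stats

      # Hypothetical IELTS-style writing scores for the same learners before and after
      # the Google Docs treatment (placeholder values, not the study's data).
      pretest  = [5.0, 5.5, 4.5, 6.0, 5.0, 5.5, 4.5, 5.0]
      posttest = [5.5, 6.5, 5.0, 6.0, 5.5, 6.0, 5.5, 5.0]

      t_stat, p_value = stats.ttest_rel(posttest, pretest)
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")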

  7. The first insight into the metabolite profiling of grapes from three Vitis vinifera L. cultivars of two controlled appellation (DOC) regions.

    Science.gov (United States)

    Teixeira, António; Martins, Viviana; Noronha, Henrique; Eiras-Dias, José; Gerós, Hernâni

    2014-03-10

    The characterization of the metabolites accumulated in the grapes of specific cultivars grown in different climates is of particular importance for viticulturists and enologists. In the present study, the metabolite profiling of grapes from the cultivars, Alvarinho, Arinto and Padeiro de Basto, of two Portuguese Controlled Denomination of Origin (DOC) regions (Vinho Verde and Lisboa) was investigated by gas chromatography-coupled time-of-flight mass spectrometry (GC-TOF-MS) and an amino acid analyzer. Primary metabolites, including sugars, organic acids and amino acids, and some secondary metabolites were identified. Tartaric and malic acids and free amino acids accumulated more in grapes from vines of the DOC region of Vinho Verde than DOC Lisboa, but a principal component analysis (PCA) plot showed that besides the DOC region, the grape cultivar also accounted for the variance in the relative abundance of metabolites. Grapes from the cultivar, Alvarinho, were particularly rich in malic acid and tartaric acids in both DOC regions, but sucrose accumulated more in the DOC region of Vinho Verde.

  8. A DOC coagulant, gypsum treatment can simultaneously reduce As, Cd and Pb uptake by medicinal plants grown in contaminated soil.

    Science.gov (United States)

    Kim, Hyuck Soo; Seo, Byoung-Hwan; Kuppusamy, Saranya; Lee, Yong Bok; Lee, Jae-Hwang; Yang, Jae-E; Owens, Gary; Kim, Kwon-Rae

    2018-02-01

    The efficiency of gypsum, as a dissolved organic carbon (DOC) coagulator, for the simultaneous immobilization of two heavy metals (Cd and Pb) and one metalloid (As) in agricultural soils near an abandoned mining site was examined. The agricultural soil was defined as long-term contaminated, as the As (1540 mg kg-1), Cd (55 mg kg-1) and Pb (1283 mg kg-1) concentrations exceeded the Korean guideline values for As (25 mg kg-1), Cd (4 mg kg-1), and Pb (200 mg kg-1). Gypsum was incorporated into the contaminated soil at 3% (w/w). For comparison, two commonly used immobilizing agents (lime and compost), together with a mixture (lime + gypsum), were also included in the pot trial for the cultivation of two medicinal plants (A. gigas and A. macrocephala) and to evaluate the effectiveness of gypsum on As, Cd and Pb immobilization. The results showed that even though immobilizing agents that act through pH change, such as lime, were more effective than gypsum at immobilizing Cd and Pb, addition of gypsum also effectively reduced heavy metal phytoavailability, as indicated by decreases in the concentration of Cd and Pb in the medicinal plants. Furthermore, gypsum and gypsum + lime were also most effective in reducing As concentrations in both plants studied. This was mainly attributed to significant decreases in soil DOC (48-64%) when gypsum and gypsum + lime were applied to the soil. Consequently, it was concluded that enhanced DOC coagulation with gypsum could be considered a promising technique for the immobilization of both metals (Cd and Pb) and metalloids (As) in agricultural soils. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Convergence Analysis of Turbo Decoding of Serially Concatenated Block Codes and Product Codes

    Science.gov (United States)

    Krause, Amir; Sella, Assaf; Be'ery, Yair

    2005-12-01

    The geometric interpretation of turbo decoding has founded a framework, and provided tools for the analysis of parallel-concatenated codes decoding. In this paper, we extend this analytical basis for the decoding of serially concatenated codes, and focus on serially concatenated product codes (SCPC) (i.e., product codes with checks on checks). For this case, at least one of the component (i.e., rows/columns) decoders should calculate the extrinsic information not only for the information bits, but also for the check bits. We refer to such a component decoder as a serial decoding module (SDM). We extend the framework accordingly, derive the update equations for a general turbo decoder of SCPC, and the expressions for the main analysis tools: the Jacobian and stability matrices. We explore the stability of the SDM. Specifically, for high SNR, we prove that the maximal eigenvalue of the SDM's stability matrix approaches [equation not shown; see full text], where [equation not shown; see full text] is the minimum Hamming distance of the component code. Hence, for practical codes, the SDM is unstable. Further, we analyze the two turbo decoding schemes, proposed by Benedetto and Pyndiah, by deriving the corresponding update equations and by demonstrating the structure of their stability matrices for the repetition code and an SCPC code with [equation not shown; see full text] information bits. Simulation results for the Hamming [equation not shown; see full text] and Golay [equation not shown; see full text] codes are presented, analyzed, and compared to the theoretical results and to simulations of turbo decoding of parallel concatenation of the same codes.

  10. Bandwidth efficient coding for fading channels - Code construction and performance analysis

    Science.gov (United States)

    Schlegel, Christian; Costello, Daniel J., Jr.

    1989-01-01

    The authors apply a general method of bounding the event error probability of trellis-coded modulation schemes to fading channels and use the effective length and the minimum-squared-product distance to replace the minimum-free-squared-Euclidean distance as code design parameters for Rayleigh and Rician fading channels with a substantial multipath component. They present 8-PSK trellis codes specifically constructed for fading channels that outperform equivalent codes designed for the additive white Gaussian noise channel when v is greater than or equal to 5. For quasiregular trellis codes there exists an efficient algorithm for evaluating event error probability, and numerical results on Pe which demonstrate the importance of the effective length as a code design parameter for fading channels with or without side information have been obtained. This is consistent with the case for binary signaling, where the Hamming distance remains the best code design parameter for fading channels. The authors show that the use of Reed-Solomon block codes with expanded signal sets becomes interesting only for large values of E(s)/N(0), where they begin to outperform trellis codes.
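
    The two design parameters named above can be illustrated with a short sketch: for a pair of 8-PSK symbol sequences, the effective length is the number of positions in which they differ, and the squared product distance is the product of the squared Euclidean distances over those positions. The sequences below are arbitrary examples, not codewords of a particular trellis code.

      import numpy as np

      def psk8(symbols):
          """Map integer symbols 0..7 to unit-energy 8-PSK constellation points."""
          return np.exp(2j * np.pi * np.asarray(symbols) / 8)

      def fading_design_params(seq_a, seq_b):
          x, y = psk8(seq_a), psk8(seq_b)
          diff = np.abs(x - y) ** 2
          active = diff > 1e-12                       # positions where the sequences differ
          effective_length = int(active.sum())
          squared_product_distance = float(np.prod(diff[active]))
          return effective_length, squared_product_distance

      # Two arbitrary 8-PSK error-event sequences.
      print(fading_design_params([0, 1, 3, 0, 4], [0, 2, 3, 7, 4]))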

  11. An introduction to using QR codes in scholarly journals

    Directory of Open Access Journals (Sweden)

    Jae Hwa Chang

    2014-08-01

    The Quick Response (QR) code was first developed in 1994 by Denso Wave Incorporated, Japan. From that point on, it came into general use as an identification mark for all kinds of commercial products, advertisements, and other public announcements. In scholarly journals, the QR code is used to provide immediate direction to the journal homepage or specific content such as figures or videos. Producing a QR code and printing it in the print version or uploading it to the web is very simple. Using a QR code-generating program, an editor can enter simple information, such as a website address, and a QR code is produced. A QR code is very stable, such that it can be used for a long time without loss of quality. Producing and adding QR codes to a journal costs nothing; therefore, to increase the visibility of their journals, it is time for editors to add QR codes to their journals.
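
    As a minimal example of how simple the generation step is, the sketch below uses the third-party Python qrcode package (assumed to be installed together with Pillow); the URL is a placeholder.

      import qrcode

      # Encode a (placeholder) article URL and save the symbol as a PNG for print or web use.
      img = qrcode.make("https://example.org/journal/article/123")
      img.save("article-qr.png")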

  12. Generalized Punctured Convolutional Codes with Unequal Error Protection

    Directory of Open Access Journals (Sweden)

    Marcelo Eduardo Pellenz

    2009-01-01

    We conduct a code search restricted to the recently introduced class of generalized punctured convolutional codes (GPCCs) to find good unequal error protection (UEP) convolutional codes for a prescribed minimal trellis complexity. The trellis complexity is taken to be the number of symbols per information bit in the “minimal” trellis module for the code. The GPCC class has been shown to possess codes with good distance properties under this decoding complexity measure. New good UEP convolutional codes and their respective effective free distances are tabulated for a variety of code rates and “minimal” trellis complexities. These codes can be used in several applications that require different levels of protection for their bits, such as the hierarchical digital transmission of video or images.
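
    For background on puncturing itself (this shows classical puncturing of a convolutional code, not the generalized GPCC construction studied in the paper): a rate-1/2, constraint-length-3 encoder whose output is thinned by a period-2 pattern to obtain rate 2/3. The generators and pattern are common textbook choices.

      def conv_encode(bits, g1=0b111, g2=0b101):
          """Rate-1/2, constraint-length-3 convolutional encoder (generators 7 and 5 in octal)."""
          state, out = 0, []
          for b in bits:
              reg = (b << 2) | state                       # [current bit, previous, one before]
              out.append(bin(reg & g1).count("1") % 2)     # parity from generator g1
              out.append(bin(reg & g2).count("1") % 2)     # parity from generator g2
              state = reg >> 1                             # shift the register
          return out

      def puncture(coded, pattern=((1, 1), (1, 0))):
          """Drop coded bits according to a period-2 pattern, giving an overall rate of 2/3."""
          period, out = len(pattern[0]), []
          for t in range(len(coded) // 2):
              if pattern[0][t % period]:
                  out.append(coded[2 * t])                 # first parity stream
              if pattern[1][t % period]:
                  out.append(coded[2 * t + 1])             # second parity stream
          return out

      info = [1, 0, 1, 1, 0, 0, 1, 0]
      print(puncture(conv_encode(info)))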

  13. Network coding at different layers in wireless networks

    CERN Document Server

    2016-01-01

    This book focuses on how to apply network coding at different layers in wireless networks – including MAC, routing, and TCP – with special focus on cognitive radio networks. It discusses how to select parameters in network coding (e.g., coding field, number of packets involved, and redundant information ration) in order to be suitable for the varying wireless environments. The book explores how to deploy network coding in MAC to improve network performance and examines joint network coding with opportunistic routing to improve the successful rate of routing. In regards to TCP and network coding, the text considers transport layer protocol working with network coding to overcome the transmission error rate, particularly with how to use the ACK feedback of TCP to enhance the efficiency of network coding. The book pertains to researchers and postgraduate students, especially whose interests are in opportunistic routing and TCP in cognitive radio networks.
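
    The canonical building block behind such schemes is bitwise XOR coding of packets: a relay broadcasts the XOR of two packets, and each receiver recovers the packet it is missing by XOR-ing with the one it already holds. A minimal sketch:

      def xor_packets(a: bytes, b: bytes) -> bytes:
          """Bitwise XOR of two equal-length packets."""
          return bytes(x ^ y for x, y in zip(a, b))

      p1 = b"hello-from-node-A"
      p2 = b"hello-from-node-B"

      coded = xor_packets(p1, p2)          # the relay broadcasts a single coded packet

      # Node B already has p2, so it recovers p1; node A recovers p2 symmetrically.
      assert xor_packets(coded, p2) == p1
      assert xor_packets(coded, p1) == p2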

  14. MGEX data analysis at CODE - current status

    Science.gov (United States)

    Prange, Lars; Dach, Rolf; Lutz, Simon; Schaer, Stefan; Jäggi, Adrian

    2013-04-01

    The Center for Orbit Determination in Europe (CODE) has been contributing as an analysis center to the International GNSS Service (IGS) for many years. The processing of GPS and GLONASS data is well established in CODE's ultra-rapid, rapid, and final product lines. In 2012 the IGS started its "Multi GNSS EXperiment" (MGEX). Meanwhile (as of the end of 2012), about 50 new or upgraded MGEX tracking stations offer their data to the user community meeting the IGS standards (e.g., correct equipment information, calibrated antennas, RINEX data format). MGEX supports the RINEX3 data format, new signal types for the established GNSS (e.g., L5 for GPS), and new GNSS, such as Galileo, Compass, and QZSS. It is therefore well suited as a testbed for future developments in GNSS processing. CODE supports MGEX by providing a three-system orbit solution (GPS+GLONASS+Galileo) on a non-operational basis. The CODE MGEX products are freely available at ftp://cddis.gsfc.nasa.gov/gnss/products/mgex (the solution ID "com" stands for CODE-MGEX). The current status of the MGEX processing at CODE will be presented, focusing on the consistency of GNSS-derived results based on different frequencies/signals. An outlook on CODE's future multi-GNSS activities will be given.

  15. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  16. Managed Code Rootkits Hooking into Runtime Environments

    CERN Document Server

    Metula, Erez

    2010-01-01

    Imagine being able to change the languages for the applications that a computer is running and taking control over it. That is exactly what managed code rootkits can do when they are placed within a computer. This new type of rootkit is hiding in a place that had previously been safe from this type of attack: the application level. Code reviews do not currently look for back doors in the virtual machine (VM) where this new rootkit would be injected. An invasion of this magnitude allows an attacker to steal information on the infected computer, provide false information, and disable security ...

  17. Sharing Epigraphic Information as Linked Data

    Science.gov (United States)

    Álvarez, Fernando-Luis; García-Barriocanal, Elena; Gómez-Pantoja, Joaquín-L.

    The diffusion of epigraphic data has evolved in recent years from printed catalogues to indexed digital databases shared through the Web. Recently, the open EpiDoc specifications have resulted in an XML-based schema for the interchange of ancient texts that uses XSLT to render typographic representations. However, these schemas and representation systems still do not provide a way to encode computational semantics and semantic relations between pieces of epigraphic data. This paper sketches an approach to bring these semantics into an EpiDoc based schema using the Web Ontology Language (OWL) and following the principles and methods of information sharing known as "linked data". The paper describes the general principles of the OWL mapping of the EpiDoc schema and how epigraphic data can be shared in RDF format via dereferenceable URIs that can be used to build advanced search, visualization and analysis systems.
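
    A rough illustration of the linked-data idea in Python with rdflib; the vocabulary terms and URIs below are invented placeholders, not the actual OWL mapping of the EpiDoc schema described in the paper.

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF

      # Hypothetical vocabulary standing in for an OWL mapping of EpiDoc concepts.
      EPI = Namespace("http://example.org/epidoc-owl#")

      g = Graph()
      # Dereferenceable URI for one inscription (placeholder identifier).
      inscription = URIRef("http://example.org/inscriptions/CIL-II-1234")

      g.add((inscription, RDF.type, EPI.Inscription))
      g.add((inscription, EPI.findSpot, Literal("Complutum, Hispania")))
      g.add((inscription, EPI.text, Literal("D(is) M(anibus) s(acrum) ...", lang="la")))

      print(g.serialize(format="turtle"))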

  18. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  19. Performance Bounds on Two Concatenated, Interleaved Codes

    Science.gov (United States)

    Moision, Bruce; Dolinar, Samuel

    2010-01-01

    A method has been developed of computing bounds on the performance of a code comprised of two linear binary codes generated by two encoders serially concatenated through an interleaver. Originally intended for use in evaluating the performances of some codes proposed for deep-space communication links, the method can also be used in evaluating the performances of short-block-length codes in other applications. The method applies, more specifically, to a communication system in which the following processes take place: At the transmitter, the original binary information that one seeks to transmit is first processed by an encoder into an outer code (Co) characterized by, among other things, a pair of numbers (n, k), where n (n > k) is the total number of code bits associated with k information bits and n - k bits are used for correcting or at least detecting errors. Next, the outer code is processed through either a block or a convolutional interleaver. In the block interleaver, the words of the outer code are processed in blocks of I words. In the convolutional interleaver, the interleaving operation is performed bit-wise in N rows with delays that are multiples of B bits. The output of the interleaver is processed through a second encoder to obtain an inner code (Ci) characterized by (ni, ki). The output of the inner code is transmitted over an additive-white-Gaussian-noise channel characterized by a symbol signal-to-noise ratio (SNR) Es/No and a bit SNR Eb/No. At the receiver, an inner decoder generates estimates of bits. Depending on whether a block or a convolutional interleaver is used at the transmitter, the sequence of estimated bits is processed through a block or a convolutional de-interleaver, respectively, to obtain estimates of code words. Then the estimates of the code words are processed through an outer decoder, which generates estimates of the original information along with flags indicating which estimates are presumed to be correct and which are found to
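
    The block-interleaving step described above can be sketched generically as writing symbols row by row into an array with a fixed number of rows and reading them out column by column; the de-interleaver inverts the permutation. The parameters below are toy values, not those of the evaluated codes.

      def block_interleave(symbols, rows):
          """Write row by row, read column by column (len(symbols) must be a multiple of rows)."""
          cols = len(symbols) // rows
          return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

      def block_deinterleave(symbols, rows):
          cols = len(symbols) // rows
          return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]

      data = list(range(12))                       # e.g. three 4-symbol outer code words
      shuffled = block_interleave(data, rows=3)
      assert block_deinterleave(shuffled, rows=3) == data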

  20. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated and harmonized with the Medical Association of Slovenia and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.