WorldWideScience

Sample records for code gcr event-based

  1. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model, the GCR Event-based Risk Model (GERM) code, was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from the primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick-target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their...
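
The Poisson hit statistics mentioned above follow from a simple relation: the mean number of traversals of a sensitive area is fluence × area. A minimal sketch of that calculation (the fluence and area values below are illustrative placeholders, not GERM code inputs or outputs):

```python
import math

def hit_probability(fluence_per_cm2, area_um2, n_hits):
    """Probability of exactly n ion traversals of a sensitive area,
    assuming traversals are Poisson-distributed (as in the abstract):
    mean number of hits m = fluence * area."""
    area_cm2 = area_um2 * 1e-8          # 1 um^2 = 1e-8 cm^2
    m = fluence_per_cm2 * area_cm2      # mean number of hits
    return m**n_hits * math.exp(-m) / math.factorial(n_hits)

# Illustrative case: 1e6 ions/cm^2 over a 100 um^2 nucleus gives a
# mean of exactly one traversal, so P(0 hits) = exp(-1).
p_zero = hit_probability(1e6, 100.0, 0)
```

With a mean of one hit, roughly a third of cells receive no traversal at all, which is why event-by-event bookkeeping matters for sparse heavy-ion fluences.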

  2. Development of a GCR Event-based Risk Model

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress, or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing it to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGFβ-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how...

  3. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model, the GCR Event-based Risk Model (GERM) code, was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from the primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick-target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their...

  4. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    Science.gov (United States)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation comprises a large number of particle types and energies, with differing ionization power, from high-energy protons to high charge and energy (HZE) particles and the secondary neutrons produced by galactic cosmic rays (GCR). Ground-based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research, dosimetry, electronics parts, and shielding testing, using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL GCR simulator, which implements a rapid switching mode and higher-energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create the GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed, based on extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding to within 20 percent accuracy compared with simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposure of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.
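
The "within 20 percent" criterion quoted above amounts to a bin-by-bin comparison of two binned spectra. A toy sketch of such a check (the binned fluence values are invented placeholders, not NSRL or GERMcode spectra):

```python
def max_relative_difference(reference, full):
    """Bin-by-bin check that a candidate reference field reproduces a
    target spectrum, in the spirit of the 'within 20 percent' criterion
    quoted in the abstract. Both inputs are binned spectra on the same
    (e.g. LET or Z) bins."""
    diffs = [abs(r - f) / f for r, f in zip(reference, full) if f > 0]
    return max(diffs)

# Toy LET-binned fluences (arbitrary units, illustrative only)
full_field = [120.0, 80.0, 30.0, 9.0, 2.0]
reference  = [110.0, 85.0, 27.0, 10.0, 1.8]
assert max_relative_difference(reference, full_field) <= 0.20
```

The same comparison can be run per Z-group and per shielding depth to verify a single reference field across conditions.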

  5. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    Science.gov (United States)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate only the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial densities needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signaling. These effects are ignored by, or impossible to treat in, the prior art. The GERM code provides scientists with data for the interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic...
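
The contrast drawn above between an average dose and a directly sampled probability density can be illustrated with a toy Monte Carlo. Poisson-distributed traversal counts and a fixed energy per hit are simplifying assumptions made purely for illustration; they are not the GERM code's physics:

```python
import math, random

def dose_distribution(mean_hits, energy_per_hit_kev, n_cells=100000, seed=1):
    """Sample per-cell event counts and accumulate a distribution of
    deposited energy, instead of quoting only the average dose."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_cells):
        # Draw the number of ion traversals (Knuth's Poisson sampler)
        threshold, k, p = math.exp(-mean_hits), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        totals.append(k * energy_per_hit_kev)
    return totals

deposits = dose_distribution(mean_hits=2.0, energy_per_hit_kev=50.0)
mean_dep = sum(deposits) / len(deposits)        # approaches 2.0 * 50 keV
frac_zero = deposits.count(0.0) / len(deposits) # ~exp(-2) of cells unhit
```

The mean reproduces the deterministic dose, but the distribution is broad and a sizable fraction of cells sees no energy at all, which an average-dose description cannot express.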

  6. GCR Environmental Models I: Sensitivity Analysis for GCR Environments

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-01-01

    Accurate galactic cosmic ray (GCR) models are required to assess crew exposure during long-duration missions to the Moon or Mars. Many of these models have been developed and compared to available measurements, with uncertainty estimates usually stated to be less than 15%. However, when the models are evaluated over a common epoch and propagated through to effective dose, relative differences exceeding 50% are observed. This indicates that the metrics used to communicate GCR model uncertainty can be better tied to exposure quantities of interest for shielding applications. This is the first of three papers focused on addressing this need. In this work, the focus is on quantifying the extent to which each GCR ion and energy group, prior to entering any shielding material or body tissue, contributes to effective dose behind shielding. Results can be used to more accurately calibrate model-free parameters and provide a mechanism for refocusing validation efforts on measurements taken over important energy regions. Results can also be used as references to guide future nuclear cross-section measurements and radiobiology experiments. It is found that GCR with Z>2 and boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. This finding is important given that most of the GCR models are developed and validated against Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer (ACE/CRIS) measurements taken below 500 MeV/n. It is therefore possible for two models to very accurately reproduce the ACE/CRIS data while inducing very different effective dose values behind shielding.
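
The headline number above (ions with boundary energies below 500 MeV/n inducing under 5% of effective dose) is a response-weighted fraction. A toy version of that bookkeeping, with invented flux and response values standing in for a real GCR model and transport calculation:

```python
def fraction_below_cutoff(energies_mev_n, flux, response, cutoff=500.0):
    """Share of total (response-weighted) exposure contributed by
    boundary energies below a cutoff. All three lists are aligned to
    the same energy grid."""
    total = sum(f * r for f, r in zip(flux, response))
    below = sum(f * r for e, f, r in zip(energies_mev_n, flux, response)
                if e < cutoff)
    return below / total

# Energy grid (MeV/n), toy flux, and toy dose-response weights --
# illustrative placeholders, not a GCR environment model.
energies = [100, 300, 500, 1000, 3000, 10000]
flux     = [5.0, 8.0, 9.0, 7.0, 3.0, 0.5]
response = [0.02, 0.05, 0.3, 1.0, 1.5, 1.2]
share = fraction_below_cutoff(energies, flux, response)
```

Even when the low-energy flux is large, a small response weight behind shielding keeps its share of effective dose small, which is the paper's point about validating models only against sub-500 MeV/n measurements.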

  7. GCR Simulator Development Status at the NASA Space Radiation Laboratory

    Science.gov (United States)

    Slaba, T. C.; Norbury, J. W.; Blattnig, S. R.

    2015-01-01

    There are large uncertainties connected to the biological response for exposure to galactic cosmic rays (GCR) on long-duration deep space missions. In order to reduce the uncertainties and gain understanding of the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed with mono-energetic ion beams. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment at the NASA Space Radiation Laboratory (NSRL) are discussed. First, comparisons are made between direct simulation of the external, free-space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free-space field directly (i.e., shielding placed in the beam line in front of a biological target and exposed to a free-space spectrum). Second, a reference environment for the GCR simulator, suitable for deep space missions, is identified and described in terms of fluence and integrated dosimetric quantities. Analysis results are given to justify the use of a single reference field over a range of shielding conditions and solar activities. Third, an approach for simulating the reference field at NSRL is presented. The approach directly considers the hydrogen and helium energy spectra, while the heavier ions are collectively represented by the linear energy transfer (LET) spectrum. While many more aspects of the experimental setup need to be considered before final implementation of the GCR simulator, this preliminary study provides useful information that should aid the final design. Possible drawbacks of the proposed methodology are discussed and weighed...

  8. GCR flux 9-day variations with LISA Pathfinder

    Science.gov (United States)

    Grimani, C.; LISA Pathfinder Collaboration; Benella, S.; Fabi, M.; Finetti, N.; Telloni, D.

    2017-05-01

    Galactic cosmic-ray (GCR) energy spectra in the heliosphere vary with the level of solar activity, the state of solar polarity, and interplanetary transient magnetic structures of solar origin. A high-counting-rate particle detector (PD) aboard LISA Pathfinder (LPF) allows the measurement of galactic cosmic-ray and solar energetic particle (SEP) integral fluxes at energies > 70 MeV/n, at rates up to 6500 counts/s. Data are gathered with a sampling time of 15 s. A study of GCR flux depressions associated with the third harmonic of the solar rotation period (~9 days) is presented here.
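
A sketch of how a ~9-day periodicity can be pulled out of a counting-rate series sampled every 15 s; the baseline rate, modulation amplitude, and noise level below are invented for illustration, not LISA Pathfinder data:

```python
import numpy as np

# Synthetic counting-rate series: 15 s sampling over 27 days with a
# 9-day modulation (third harmonic of the solar rotation period).
dt = 15.0                                   # s, sampling time
t = np.arange(0.0, 27 * 86400.0, dt)
rate = 6000.0 - 300.0 * np.cos(2 * np.pi * t / (9 * 86400.0))
rate += np.random.default_rng(0).normal(0.0, 50.0, t.size)

# Locate the dominant periodicity with an FFT (mean-subtracted to
# suppress the DC term).
spec = np.abs(np.fft.rfft(rate - rate.mean()))
freqs = np.fft.rfftfreq(t.size, d=dt)
period_days = 1.0 / freqs[np.argmax(spec)] / 86400.0
```

With an integer number of modulation cycles in the record, the spectral peak falls exactly on the 9-day bin; real depressions are quasi-periodic, so in practice a windowed or Lomb-Scargle analysis would be preferred.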

  9. Secondary Cosmic Ray Particles Due to GCR Interactions in the Earth's Atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Battistoni, G. (Milan U. / INFN, Milan); Cerutti, F. (CERN); Fasso, A. (SLAC); Ferrari, A. (CERN); Garzelli, M.V. (Milan U. / INFN, Milan); Lantz, M. (Goteborg, ITP); Muraro, S. (Milan U. / INFN, Milan); Pinsky, L.S. (Houston U.); Ranft, J. (Siegen U.); Roesler, S. (CERN); Sala, P.R. (Milan U. / INFN, Milan)

    2009-06-16

    Primary GCR interact with the Earth's atmosphere, initiating atmospheric showers and thus giving rise to fluxes of secondary particles in the atmosphere. Electromagnetic and hadronic interactions interplay in the production of these particles, whose detection is performed by means of complementary techniques in different energy ranges and at different depths in the atmosphere, down to the Earth's surface. Monte Carlo codes are essential calculation tools that can describe the complexity of the physics of these phenomena, thus allowing the analysis of experimental data. However, these codes are affected by important uncertainties concerning, in particular, hadronic physics at high energy. In this paper we report some results concerning inclusive particle fluxes and atmospheric shower properties as obtained using the FLUKA transport and interaction code. Some emphasis is also given to the validation of the physics models of FLUKA involved in these calculations.

  10. Secondary Cosmic Ray particles due to GCR interactions in the Earth's atmosphere

    CERN Document Server

    Battistoni, G.; Fasso, A.; Ferrari, A.; Garzelli, M.V.; Lantz, M.; Muraro, S.; Pinsky, L.S.; Ranft, J.; Roesler, S.; Sala, P.R.

    2008-01-01

    Primary GCR interact with the Earth's atmosphere, initiating atmospheric showers and thus giving rise to fluxes of secondary particles in the atmosphere. Electromagnetic and hadronic interactions interplay in the production of these particles, whose detection is performed by means of complementary techniques in different energy ranges and at different depths in the atmosphere, down to the Earth's surface. Monte Carlo codes are essential calculation tools that can describe the complexity of the physics of these phenomena, thus allowing the analysis of experimental data. However, these codes are affected by important uncertainties concerning, in particular, hadronic physics at high energy. In this paper we report some results concerning inclusive particle fluxes and atmospheric shower properties as obtained using the FLUKA transport and interaction code. Some emphasis is also given to the validation of the physics models of FLUKA involved in these calculations.

  11. Evaluation of abrasion of a modified drainage mixture with rubber waste crushed (GCR)

    Directory of Open Access Journals (Sweden)

    Yee Wan Yung Vargas

    2017-02-01

    Conclusion: The results showed a marked influence of the mixing temperature (between asphalt and GCR) and the compaction temperature (modified asphalt and aggregate) on the behavior of the MD modified with GCR.

  12. Proton Dominance of Sub-LET Threshold GCR SEE Rate

    NARCIS (Netherlands)

    Alia, Ruben Garcia; Brugger, Markus; Ferlet-Cavrois, Veronique; Brandenburg, Sytze; Calcutt, Jordan; Cerutti, Francesco; Daly, Eamonn; Ferrari, Alfredo; Muschitiello, Michele; Santin, Giovanni; Uznanski, Slawosz; Van Goethem, Marc-Jan; Zadeh, Ali

    We apply a Monte Carlo-based integral rectangular parallelepiped (IRPP) approach to evaluate the impact of heavy ion reaction products on the Galactic Cosmic Ray (GCR) Single Event Effect (SEE) rate, concluding that owing to their similar high-energy (> 100 MeV/n) SEE cross section and much larger...

  13. GCR Transport in the Brain: Assessment of Self-Shielding, Columnar Damage, and Nuclear Reactions on Cell Inactivation Rates

    Science.gov (United States)

    Shavers, M. R.; Atwell, W.; Cucinotta, F. A.; Badhwar, G. D. (Technical Monitor)

    1999-01-01

    ... cell killing from GCR, including patterns of cell killing from single particle tracks, can provide useful information on expected differences between proton and HZE tracks and clinical experience with photon irradiation. To model effects on cells in the brain, it is important that transport models accurately describe changes in the GCR due to interactions in the cranium and proximate tissues. We describe calculations of the attenuated GCR particle fluxes at three dose points in the brain and the associated patterns of cell killing using biophysical models. The effects of the brain's self-shielding and the bone-tissue interface of the skull in modulating the GCR environment are considered. For each brain dose point, the mass distribution in the surrounding 4(pi) solid angle is characterized using the CAM model to trace 512 rays. The CAM model describes the self-shielding by converting the tissue distribution to mass-equivalent aluminum, and nominal values of spacecraft shielding are considered. Particle transport is performed with the proton, neutron, and heavy-ion transport code HZETRN with the nuclear fragmentation model QMSFRG. The distribution of cells killed along the path of individual GCR ions is modeled using in vitro cell inactivation data for cells with varying sensitivity. Monte Carlo simulations of arrays of inactivated cells are considered for protons and heavy ions and used to describe the absolute number of cell-killing events of various magnitudes in the brain from the GCR. Included are simulations of the positions of inactivated cells from stopping heavy ions and from nuclear stars produced by high-energy ions, most importantly protons and neutrons.

  14. A Tailorable Structural Composite for GCR and Albedo Neutron Protection on the Lunar Surface Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A tailorable structural composite that will provide protection from the lunar radiation environment, including GCR and albedo neutrons will be developed. This...

  15. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches...

  16. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activities inside and outside an IT-system. We use event-activity diagrams to model activity. Such diagrams support the modeling of activity flow, object flow, shared events, triggering events, and interrupting events...

  17. Trehalose, glycogen and ethanol metabolism in the gcr1 mutant of Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Seker, Tamay; Hamamci, H.

    2003-01-01

    Since Gcr1p is pivotal in controlling the transcription of glycolytic enzymes, and trehalose metabolism seems to be one of the control points of glycolysis, we examined trehalose and glycogen synthesis in response to a 2% glucose pulse during batch growth in a gcr1 (glucose regulation-1) mutant lacking...

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for the creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similarly broad variety of seemingly incompatible event concepts. The general event concept can be used...

  19. Early Results from the Advanced Radiation Protection Thick GCR Shielding Project

    Science.gov (United States)

    Norman, Ryan B.; Clowdsley, Martha; Slaba, Tony; Heilbronn, Lawrence; Zeitlin, Cary; Kenny, Sean; Crespo, Luis; Giesy, Daniel; Warner, James; McGirl, Natalie

    2017-01-01

    The Advanced Radiation Protection Thick Galactic Cosmic Ray (GCR) Shielding Project leverages experimental and modeling approaches to validate a predicted minimum in the radiation exposure versus shielding depth curve. Preliminary results from space radiation models indicate that a minimum in the dose equivalent versus aluminum shielding thickness may exist in the 20-30 g/cm2 region. For greater shield thicknesses, dose equivalent increases due to secondary neutron and light-particle production. This result goes against the long-held belief in the space radiation shielding community that increasing shielding thickness will decrease risk to crew health. A comprehensive modeling effort was undertaken to verify the preliminary modeling results using multiple Monte Carlo and deterministic space radiation transport codes. These results verified the preliminary findings of a minimum and helped drive the design of the experimental component of the project. In first-of-their-kind experiments performed at the NASA Space Radiation Laboratory, neutrons and light ions were measured between large thicknesses of aluminum shielding. Both an upstream and a downstream shield were incorporated into the experiment to represent the radiation environment inside a spacecraft. These measurements are used to validate the Monte Carlo codes and to derive uncertainty distributions for exposure estimates behind thick shielding similar to that provided by spacecraft on a Mars mission. Preliminary results for all aspects of the project will be presented.

  20. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL's research on this project is to demonstrate the feasibility of a host event-based network monitoring tool and its effects on host performance. Current host-based network monitoring tools work by polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.
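
The polling gap described above is easy to demonstrate. In this sketch the "connections" are purely synthetic tuples, and no OS event API (such as WFP/ALE) is actually invoked: a fixed-interval poller misses a connection that opens and closes between samples, while an event-driven monitor is notified of every one.

```python
connections = [(2, 3), (10, 11), (14, 30)]  # (open_tick, close_tick)

def polled_observations(conns, poll_interval=5, horizon=30):
    """Sample host state at fixed intervals: a connection is seen only
    if it happens to be open at a polling tick."""
    seen = set()
    for tick in range(0, horizon + 1, poll_interval):
        for c in conns:
            if c[0] <= tick <= c[1]:
                seen.add(c)
    return seen

def event_observations(conns):
    """An event-driven monitor receives an asynchronous notification
    for every open, so nothing short-lived slips through."""
    seen = set()
    for c in conns:
        seen.add(c)
    return seen

missed = event_observations(connections) - polled_observations(connections)
```

Here the connection spanning ticks 2-3 falls entirely between the 5-tick polls, so only the event-driven view records it.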

  1. The GCR2 gene family is not required for ABA control of seed germination and early seedling development in Arabidopsis.

    Directory of Open Access Journals (Sweden)

    Jianjun Guo

    BACKGROUND: The plant hormone abscisic acid (ABA) regulates diverse processes of plant growth and development. It has recently been proposed that GCR2 functions as a G-protein-coupled receptor (GPCR) for ABA. However, the structural relationships and functionality of GCR2 have been challenged by several independent studies. A central question in this controversy is whether gcr2 mutants are insensitive to ABA, because gcr2 mutants were shown to display reduced sensitivity to ABA under one experimental condition (e.g., 22 degrees C, continuous white light at 150 micromol m-2 s-1) but wild-type sensitivity under another, slightly different condition (e.g., 23 degrees C, 14/10 hr photoperiod at 120 micromol m-2 s-1). It has been hypothesized that gcr2 appears only weakly insensitive to ABA because two other GCR2-like genes in Arabidopsis, GCL1 and GCL2, compensate for the loss of function of GCR2. PRINCIPAL FINDINGS: In order to test this hypothesis, we isolated a putative loss-of-function allele of GCL2 and then generated all possible combinations of mutations in each member of the GCR2 gene family. We found that all double mutants, including gcr2 gcl1, gcr2 gcl2, and gcl1 gcl2, as well as the gcr2 gcl1 gcl2 triple mutant, displayed wild-type sensitivity to ABA in seed germination and early seedling development assays, demonstrating that the GCR2 gene family is not required for ABA responses in these processes. CONCLUSION: These results provide compelling genetic evidence that GCR2 is unlikely to act as a receptor for ABA in the context of either seed germination or early seedling development.

  2. A new gcrR-deficient Streptococcus mutans mutant for replacement therapy of dental caries.

    Science.gov (United States)

    Pan, Wenting; Mao, Tiantian; Xu, Qing-an; Shao, Jin; Liu, Chang; Fan, Mingwen

    2013-01-01

    The gcrR gene acts as a negative regulator of sucrose-dependent adherence in S. mutans. It is therefore worth testing the potential of a gcrR-deficient mutant in bacterial replacement therapy. In this study, we constructed the mutant by homologous recombination. The morphological characteristics of the biofilms were analyzed by confocal laser scanning microscopy. S. mutans UA159 and the mutant MS-gcrR-def were inoculated separately, or together for competitive testing, in vitro and in a rat model. An adhesion assay showed that the adhesion ability of the mutant increased relative to the wild type, especially in the early stage. MS-gcrR-def out-competed S. mutans UA159 in in vitro biofilms, and correspondingly, coinfection produced significantly fewer caries in vivo. The mutant possessed both a lower level of acid production and a stronger colonization potential than S. mutans UA159. These findings demonstrate that MS-gcrR-def appears to be a good candidate for replacement therapy.

  3. A New gcrR-Deficient Streptococcus mutans Mutant for Replacement Therapy of Dental Caries

    Directory of Open Access Journals (Sweden)

    Wenting Pan

    2013-01-01

    Background. The gcrR gene acts as a negative regulator of sucrose-dependent adherence in S. mutans. It is therefore worth testing the potential of a gcrR-deficient mutant in bacterial replacement therapy. Methods. In this study, we constructed the mutant by homologous recombination. The morphological characteristics of the biofilms were analyzed by confocal laser scanning microscopy. S. mutans UA159 and the mutant MS-gcrR-def were inoculated separately, or together for competitive testing, in vitro and in a rat model. Results. An adhesion assay showed that the adhesion ability of the mutant increased relative to the wild type, especially in the early stage. MS-gcrR-def out-competed S. mutans UA159 in in vitro biofilms, and correspondingly, coinfection produced significantly fewer caries in vivo. The mutant possessed both a lower level of acid production and a stronger colonization potential than S. mutans UA159. Conclusion. These findings demonstrate that MS-gcrR-def appears to be a good candidate for replacement therapy.

  4. Evaluation of SPE and GCR Radiation Effects in Inflatable, Space Suit and Composite Habitat Materials Project

    Science.gov (United States)

    Waller, Jess M.; Nichols, Charles

    2016-01-01

    The radiation resistance of polymeric and composite materials to space radiation is currently assessed by irradiating materials with Co-60 gamma-radiation to the equivalent total ionizing dose (TID) expected during a mission. This is an approximation, since gamma-radiation is not truly representative of the particle species encountered in space, namely Solar Particle Event (SPE) protons and Galactic Cosmic Ray (GCR) nucleons. In general, SPE and GCR particle energies are much higher than those of Co-60 gamma-ray photons, and since the particles have mass, there is a displacement effect due to nuclear collisions between the particle species and the target material. This effort specifically bridges the gap between estimated service lifetimes based on decades-old Co-60 gamma-radiation data and newer assessments of what the service lifetimes actually are, based on irradiation with particle species that are more representative of the space radiation environment.

  5. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    Science.gov (United States)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

    acute exposures of the mixed field beams used for the experiments. The chromosomes were simulated by a polymer random walk algorithm with restrictions to their respective domains in the nucleus [1]. The stochastic dose to the nucleus was calculated with the code RITRACKS [2]. Irradiation of a target volume by a mixed field of ions was implemented within RITRACKS, and the fields of ions can be delivered over specific periods of time, allowing the simulation of dose-rate effects. Similarly, particles of various types and energies extracted from pre-calculated galactic cosmic ray (GCR) spectra can be used in RITRACKS. The number and spatial locations of DSBs (DNA double-strand breaks) were calculated in BDSTRACKS using the simulated chromosomes and the local (voxel) dose. Assuming that DSBs lead to chromosome breaks, and simulating the rejoining of damaged chromosomes during repair, BDSTRACKS produces the yield of various types of chromosome aberrations as a function of time (only final yields are presented). A comparison between experimental and simulation results will be shown.
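
The simulation chain described (stochastic dose, Poisson-distributed DSB induction, random rejoining into aberrations) can be sketched in a few lines. This is a toy Monte Carlo under assumed parameters (mean DSBs per cell, misrejoining probability), not the actual RITRACKS/BDSTRACKS code:

```python
import math
import random

def simulate_aberrations(mean_dsbs, n_cells, misrejoin_prob, seed=42):
    """Toy Monte Carlo in the spirit of the pipeline described: draw DSBs per
    cell from a Poisson distribution, pair broken ends for rejoining, and let
    each rejoining go wrong (form an aberration) with a fixed probability."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_cells):
        # Poisson draw via Knuth's algorithm (stdlib-only)
        limit, k, p = math.exp(-mean_dsbs), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        dsbs = k - 1
        # each rejoining of two broken ends may misrejoin -> aberration
        total += sum(1 for _ in range(dsbs // 2) if rng.random() < misrejoin_prob)
    return total / n_cells

print(simulate_aberrations(mean_dsbs=5.0, n_cells=10000, misrejoin_prob=0.1))
```

For 5 DSBs per cell on average and a 10% misrejoining probability, the yield comes out near 0.2 aberrations per cell; the real codes replace both assumed parameters with track-structure physics.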

  6. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine controllers sample synchronously with the events, these fluctuations cause problems for accurate air/fuel ratio control of a spark ignition (SI) engine.

  7. Miniaturized Hollow-Waveguide Gas Correlation Radiometer (GCR) for Trace Gas Detection in the Martian Atmosphere

    Science.gov (United States)

    Wilson, Emily L.; Georgieva, E. M.; Melroy, H. R.

    2012-01-01

    Gas correlation radiometry (GCR) has been shown to be a sensitive and versatile method for detecting trace gases in Earth's atmosphere. Here, we present a miniaturized and simplified version of this instrument capable of mapping multiple trace gases and identifying active regions on the Mars surface. Reduction of the size and mass of the GCR instrument has been achieved by implementing a lightweight, 1 mm inner diameter hollow-core optical fiber (hollow waveguide) for the gas correlation cell. Based on a comparison with an Earth orbiting CO2 gas correlation instrument, replacement of the 10 meter multipass cell with hollow waveguide of equivalent pathlength reduces the cell mass from approx 150 kg to approx 0.5 kg, and reduces the volume from 1.9 m x 1.3 m x 0.86 m to a small bundle of fiber coils approximately 1 meter in diameter by 0.05 m in height (mass and volume reductions of >99%). This modular instrument technique can be expanded to include measurements of additional species of interest including nitrous oxide (N2O), hydrogen sulfide (H2S), methanol (CH3OH), and sulfur dioxide (SO2), as well as carbon dioxide (CO2) for a simultaneous measure of mass balance.

  8. Simulation of the GCR spectrum in the Mars curiosity rover's RAD detector using MCNP6

    Science.gov (United States)

    Ratliff, Hunter N.; Smith, Michael B. R.; Heilbronn, Lawrence

    2017-08-01

    The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface, and comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the galactic cosmic ray (GCR) spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios, while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 μGy/day while RAD measured 233 μGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 μSv/day while RAD reported 710 μSv/day.
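
One way to read the two discrepancies together is through the average quality factor (Sv/Gy) each data set implies, i.e. the ratio of dose equivalent rate to absorbed dose rate. The numbers below are taken straight from the abstract; the interpretation as a mean quality factor is our own arithmetic:

```python
# dose rates quoted in the abstract
sim_absorbed, sim_equivalent = 307.3, 473.1  # MCNP6, uGy/day and uSv/day
rad_absorbed, rad_equivalent = 233.0, 710.0  # RAD measurements

# average quality factor implied by each data set (Sv/Gy)
q_sim = sim_equivalent / sim_absorbed
q_rad = rad_equivalent / rad_absorbed
print(round(q_sim, 2), round(q_rad, 2))  # 1.54 3.05
```

The simulation thus over-predicts absorbed dose but implies a much softer (lower-LET) radiation field than RAD measures.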

  9. Relation of hardness with FWHM and residual stress of GCr15 steel after shot peening

    Science.gov (United States)

    Fu, Peng; Chu, Ruiqing; Xu, Zhijun; Ding, Guanjun; Jiang, Chuanhai

    2018-02-01

    The variations of XRD full width at half maximum (FWHM), residual stress and hardness at the surface of GCr15 steel after triple shot peening (TSP), as a function of annealing time and temperature, are studied. The results show that with increasing annealing temperature and time, hardness and FWHM increase gradually while the compressive residual stress (CRS) decreases gradually. CRS and microstructure jointly determine the hardness values, with microstructure being the dominant factor. By establishing the quantitative relationship of hardness with FWHM and CRS, the hardness value can be calculated; a new type of noncontact, nondestructive hardness testing can thus be realized by the XRD method.
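
A relation of the form H = a·FWHM + b·CRS + c can be fitted by ordinary least squares. The sketch below uses made-up, exactly planar sample data and hypothetical coefficients purely to illustrate the fitting step; the paper's actual relationship and coefficients are not reproduced here:

```python
def fit_plane(rows):
    """rows: (fwhm_deg, crs_mpa, hardness). Solve the 3x3 normal equations
    of the least-squares fit H = a*FWHM + b*CRS + c, returning [a, b, c]."""
    A = [[0.0] * 3 for _ in range(3)]
    y = [0.0] * 3
    for f, s, h in rows:
        x = (f, s, 1.0)
        for i in range(3):
            y[i] += x[i] * h
            for j in range(3):
                A[i][j] += x[i] * x[j]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        y[col], y[piv] = y[piv], y[col]
        for r in range(col + 1, 3):
            m = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= m * A[col][c]
            y[r] -= m * y[col]
    coef = [0.0] * 3
    for r in (2, 1, 0):
        coef[r] = (y[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

# illustrative data generated from H = 40*FWHM - 0.02*CRS + 568 (hypothetical)
data = [(4.0, -600.0, 740.0), (4.5, -500.0, 758.0),
        (5.0, -420.0, 776.4), (5.5, -300.0, 794.0)]
a, b, c = fit_plane(data)
print(round(a, 2), round(b, 3), round(c, 1))  # 40.0 -0.02 568.0
```

Once a, b, c are calibrated against hardness measurements, XRD alone (FWHM plus CRS) yields a noncontact hardness estimate, which is the paper's proposal.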

  10. Results of Simulated Galactic Cosmic Radiation (GCR) and Solar Particle Events (SPE) on Spectra Restraint Fabric

    Science.gov (United States)

    Peters, Benjamin; Hussain, Sarosh; Waller, Jess

    2017-01-01

    Spectra or similar ultra-high-molecular-weight polyethylene (UHMWPE) fabric is the likely choice for future structural space suit restraint materials due to its high strength-to-weight ratio, abrasion resistance, and dimensional stability. During long duration space missions, space suits will be subjected to significant amounts of high-energy radiation from several different sources. To ensure that pressure garment designs properly account for the effects of radiation, it is important to characterize the mechanical changes to structural materials after they have been irradiated. White Sands Test Facility (WSTF) collaborated with the Crew and Thermal Systems Division at the Johnson Space Center (JSC) to irradiate and test various space suit materials, examining their tensile properties through blunt probe puncture testing and single fiber tensile testing after the materials had been dosed at various levels of simulated GCR and SPE iron and proton beams at Brookhaven National Laboratory. The dosages were chosen based on a simulation developed by the Structural Engineering Division at JSC for the radiation dosages expected for space suit softgoods on a Mars reference mission. Spectra fabric tested in the effort saw equivalent dosages at 2x, 10x, and 20x the predicted dose, as well as a simulated 50 year exposure, to examine the range of effects on the material and to determine whether any degradation due to GCR would be present if the suit softgoods were stored in deep space for a long period of time. This paper presents the results of this work and outlines the impact on space suit pressure garment design for long duration deep space missions.

  11. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  12. Precipitation and Evolution Behavior of Carbide During Heat Treatments of GCr15 Bearing Steel

    Directory of Open Access Journals (Sweden)

    MA Chao

    2017-06-01

    Full Text Available The evolution behavior of carbides in GCr15 bearing steel after spheroidization annealing, austenitization quenching and low temperature tempering was investigated by quantitative metallography. Numerical simulations of the dissolution kinetics of carbide size and composition during austenitization were performed with the ThermoCalc software. The results indicate that the carbide particles formed after spheroidization annealing have a multimodal size distribution, which changes to a single peak after austenitization and tempering, and that the Cr content increases slightly after austenitization; larger carbide particles have higher Cr content. C-rich austenite is formed during austenitization through dissolution of the carbides formed during spheroidization annealing, and the high carbon martensite formed on quenching accounts for the high hardness. Cr atoms partition from austenite to carbide during carbide dissolution, increasing the Cr content of the remaining carbide particles. The numerical simulations indicate that carbide particles with a diameter of 200 nm cannot be completely dissolved during austenitization even if their Cr content is close to the nominal Cr content of the steel, and the undissolved particles may affect the precipitation of carbides during the subsequent tempering.

  13. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the document in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
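
For readers unfamiliar with the contrast the abstract draws, here is a minimal SAX-style program (Python's stdlib `xml.sax`, not the analysis framework of the paper): callbacks fire per parse event and nothing holds the whole tree in memory, unlike DOM:

```python
# Minimal event-based XML processing: collect <title> text via SAX callbacks.
import xml.sax

class TitleCollector(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def startElement(self, name, attrs):  # event: element opened
        if name == "title":
            self.in_title = True
            self.titles.append("")

    def characters(self, content):        # event: character data
        if self.in_title:
            self.titles[-1] += content

    def endElement(self, name):           # event: element closed
        if name == "title":
            self.in_title = False

handler = TitleCollector()
xml.sax.parseString(b"<lib><title>SAX</title><title>DOM</title></lib>", handler)
print(handler.titles)  # ['SAX', 'DOM']
```

The paper's concern is the mirror image of this: statically checking that the XML such event-driven code *emits* is always well-formed and valid.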

  14. Event-based prospective memory performance in autism spectrum disorder

    NARCIS (Netherlands)

    Altgassen, A.M.; Schmitz-Hübsch, M.; Kliegel, M.

    2010-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with

  15. Towards an event-based corpuscular model for optical phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Jin, F.; Michielsen, K.; Roychoudhuri, C; Khrennikov, AY; Kracklauer, AF

    2011-01-01

    We discuss an event-based corpuscular model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory through a series of cause-and-effect processes, starting with the emission and ending with the

  16. Training Team Problem Solving Skills: An Event-Based Approach.

    Science.gov (United States)

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  17. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  18. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  19. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
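
For reference, the gates named in these abstracts (Hadamard and controlled-NOT) can be sketched with a conventional state-vector simulation in a few lines of stdlib Python. This is the standard approach the papers contrast with: their event-based networks of learning machines reproduce the same statistics event by event, without maintaining a state vector:

```python
import math

def hadamard(state, q):
    """Apply a Hadamard gate to qubit q of a real-amplitude state vector."""
    s = 1.0 / math.sqrt(2.0)
    out = state[:]
    for i in range(len(state)):
        if (i >> q) & 1 == 0:          # visit each amplitude pair once
            j = i | (1 << q)
            out[i] = s * (state[i] + state[j])
            out[j] = s * (state[i] - state[j])
    return out

def cnot(state, control, target):
    """Flip the target bit of every basis state whose control bit is 1."""
    out = state[:]
    for i in range(len(state)):
        if (i >> control) & 1:
            out[i] = state[i ^ (1 << target)]
    return out

# Bell state from |00>: H on qubit 0, then CNOT(control=0, target=1)
state = cnot(hadamard([1.0, 0.0, 0.0, 0.0], 0), 0, 1)
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```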

  20. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  1. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.
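
The data format such sensors produce is simply a sparse stream of (x, y, timestamp, polarity) tuples emitted where the light signal changes. A hedged sketch, emulating that stream from two dense frames (the threshold and frame representation are illustrative; real silicon retinas compare log intensity continuously, not frame pairs):

```python
# Emulate an event-based sensor: emit an event per pixel whose intensity
# changed by more than a threshold, with polarity +1 (brighter) or -1 (darker).
def frame_to_events(prev, curr, t, threshold=0.15):
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                events.append((x, y, t, 1 if c > p else -1))
    return events

prev = [[0.2, 0.2], [0.2, 0.2]]
curr = [[0.2, 0.5], [0.0, 0.2]]
print(frame_to_events(prev, curr, t=1))  # [(1, 0, 1, 1), (0, 1, 1, -1)]
```

Only two of the four pixels produce events, which is the redundancy reduction the article builds its spatiotemporal feature learning on.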

  2. Event-based prospective memory performance in autism spectrum disorder.

    Science.gov (United States)

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2010-03-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to remember to respond to a cue-event. Everyday planning performance was assessed with proxy ratings. Although parents of the autism group rated their children's everyday performance as significantly poorer than controls' parents, no group differences were found in event-based prospective memory. Nevertheless, individual differences in laboratory-based and everyday performances were related. Clinical implications of these findings are discussed.

  3. Mars Science Laboratory; A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity Rover, a part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event-based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event-based education programs.

  4. Event-based prospective memory performance in autism spectrum disorder

    OpenAIRE

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2009-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to ...

  5. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
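
The architecture described (an event generator gating a PID-type controller) can be sketched with a simple send-on-delta rule: the controller only recomputes when the measurement has moved by more than a threshold, and otherwise holds its last output. All gains, the threshold, and the toy plant below are illustrative assumptions, not the paper's PIDPlus design or tuning:

```python
class EventPID:
    """Send-on-delta event generator in front of a textbook PID controller."""
    def __init__(self, kp, ki, kd, delta):
        self.kp, self.ki, self.kd, self.delta = kp, ki, kd, delta
        self.integral = 0.0
        self.last_err = 0.0
        self.last_sent = None
        self.u = 0.0
        self.n_updates = 0

    def step(self, setpoint, measurement, dt):
        # event generator: recompute only when the measurement moved enough
        if self.last_sent is not None and abs(measurement - self.last_sent) < self.delta:
            return self.u  # hold the previous control action between events
        self.last_sent = measurement
        self.n_updates += 1
        err = setpoint - measurement
        self.integral += err * dt
        deriv = (err - self.last_err) / dt
        self.last_err = err
        self.u = self.kp * err + self.ki * self.integral + self.kd * deriv
        return self.u

pid = EventPID(kp=1.2, ki=0.4, kd=0.05, delta=0.5)
y = 0.0
for _ in range(50):
    y += 0.1 * pid.step(setpoint=10.0, measurement=y, dt=0.1)  # toy first-order plant
print(pid.n_updates)  # controller updates: fewer than the 50 samples
```

Holding the actuator between events is what produces the large reduction in total variation of the manipulated variable that the paper reports.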

  6. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-31

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available to August, 1978, which is judged to be relevant to studies for the WIPP. The Geological Characterization Report for the WIPP site is neither a preliminary safety analysis report nor an environmental impact statement; these documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The Geological Characterization Report of the WIPP site is a unique document and at this time is not required by regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, status of studies, and the techniques employed during geological characterization.

  7. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
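
A minimal sketch of the kind of triggering rule described: node i transmits only when its measurement error (drift since its last broadcast) exceeds a threshold built from its neighbors' last-broadcast states plus a state-independent exponentially decaying term. All constants and the scalar-state simplification are illustrative, not the paper's criterion for coupled GRN dynamics; note the decay term keeps the threshold strictly positive, which is what rules out Zeno behavior:

```python
import math

def should_trigger(xi, xi_last_broadcast, neighbor_broadcasts, t,
                   c1=0.5, c2=1.0, decay=0.3):
    """Event-trigger test for one node with scalar state (illustrative constants)."""
    err = abs(xi - xi_last_broadcast)                 # drift since last event
    neighbor_term = c1 * sum(abs(xi_last_broadcast - xj)
                             for xj in neighbor_broadcasts)
    threshold = neighbor_term + c2 * math.exp(-decay * t)  # always > 0: no Zeno
    return err > threshold

# the same drift does not trigger early (decay term large) but does later
print(should_trigger(1.3, 1.0, [1.1, 0.9], t=0.0))   # False
print(should_trigger(1.3, 1.0, [1.1, 0.9], t=20.0))  # True
```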

  8. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  9. A joint renewal process used to model event based data

    National Research Council Canada - National Science Library

    Mergenthaler, Wolfgang; Jaroszewski, Daniel; Feller, Sebastian; Laumann, Larissa

    2016-01-01

    Event data, herein defined as a collection of triples containing a time stamp, a failure code and eventually a descriptive text, can best be evaluated by using the paradigm of joint renewal processes...

  10. Event-Based User Classification in Weibo Media

    Science.gov (United States)

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different contents, and exhibit different behavior or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  11. Event-based internet biosurveillance: relation to epidemiological observation

    Directory of Open Access Journals (Sweden)

    Nelson Noele P

    2012-06-01

    Full Text Available Abstract Background: The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time and labor intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods: We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 – January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results: Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant results. Conclusion: Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events.

  12. Investigations on femtosecond laser modified micro-textured surface with anti-friction property on bearing steel GCr15

    Science.gov (United States)

    Yang, Lijun; Ding, Ye; Cheng, Bai; He, Jiangtao; Wang, Genwang; Wang, Yang

    2018-03-01

    This work puts forward femtosecond laser modification of micro-textured surfaces on bearing steel GCr15 in order to reduce frictional wear and enhance load capacity during its application. Multi-pulse femtosecond laser ablation experiments are conducted to determine the laser spot radius as well as the single pulse threshold fluence and pulse incubation coefficient of the bulk material. Analytical models are set up in combination with hydrodynamic lubrication theory. Corresponding simulations based on the Navier-Stokes (N-S) equation are carried out to explore the influence of the surface and cross-sectional morphology of textures on the hydrodynamic lubrication effect. Technological experiments focus on the impact of femtosecond laser machining variables, such as scanning times, scanning velocity, pulse frequency and scanning gap, on the morphology of grooves, as well as on the realization of the optimized textures proposed by the simulations; the underlying mechanisms are analyzed from multiple perspectives. Results of unidirectional rotating friction tests suggest that a spherical texture with a depth-to-width ratio of 0.2 can significantly improve tribological properties under low load and low velocity conditions compared with un-textured and other textured surfaces, which also verifies the accuracy of the simulations and the feasibility of femtosecond laser modification of micro-textured surfaces.

  13. Spatial gradients of GCR protons in the inner heliosphere derived from Ulysses COSPIN/KET and PAMELA measurements

    Science.gov (United States)

    Gieseler, J.; Heber, B.

    2016-05-01

    Context. During the transition from solar cycle 23 to 24, from 2006 to 2009, the Sun was in an unusual solar minimum with very low activity over a long period. These exceptional conditions included a very low interplanetary magnetic field (IMF) strength and a high tilt angle, which both play an important role in the modulation of galactic cosmic rays (GCR) in the heliosphere. Thus, the radial and latitudinal gradients of GCRs are expected to depend not only on the solar magnetic epoch, but also on the overall modulation level. Aims: We determine the non-local radial and the latitudinal gradients of protons in the rigidity range from ~0.45 to 2 GV. Methods: This was accomplished by using data from the satellite-borne experiment Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) at Earth and the Kiel Electron Telescope (KET) onboard Ulysses on its highly inclined Keplerian orbit around the Sun with the aphelion at Jupiter's orbit. Results: In comparison to the previous A > 0 solar magnetic epoch, we find that the absolute value of the latitudinal gradient is lower at higher rigidities and higher at lower rigidities. This energy dependence is therefore a crucial test for models that describe the cosmic ray transport in the inner heliosphere.
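
Non-local gradients of this kind are commonly extracted by assuming the GCR intensity varies as I ∝ exp(G_r·Δr + G_θ·Δθ) and solving the log intensity ratio between the two spacecraft for the latitudinal term. A hedged sketch with illustrative numbers (the intensity ratio, separations, and radial gradient below are assumptions, not the paper's measured values):

```python
import math

def latitudinal_gradient(ratio_ulysses_earth, dr_au, dtheta_deg, g_r_percent_per_au):
    """Solve ln(I_u / I_e) = G_r * dr + G_th * dtheta for G_th (%/degree),
    given an assumed radial gradient G_r (%/AU)."""
    g_r = g_r_percent_per_au / 100.0
    g_th = (math.log(ratio_ulysses_earth) - g_r * dr_au) / dtheta_deg
    return 100.0 * g_th

# illustrative: 25% higher intensity at Ulysses, 4 AU and 30 deg away, G_r = 3 %/AU
print(round(latitudinal_gradient(1.25, dr_au=4.0, dtheta_deg=30.0,
                                 g_r_percent_per_au=3.0), 2))  # 0.34
```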

  14. In vitro Manganese-Dependent Cross-Talk between Streptococcus mutans VicK and GcrR: Implications for Overlapping Stress Response Pathways

    Science.gov (United States)

    Downey, Jennifer S.; Mashburn-Warren, Lauren; Ayala, Eduardo A.; Senadheera, Dilani B.; Hendrickson, Whitney K.; McCall, Lathan W.; Sweet, Julie G.; Cvitkovitch, Dennis G.; Spatafora, Grace A.; Goodman, Steven D.

    2014-01-01

    Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, here we demonstrate for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has been previously correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR and SloR, and indicate that the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities observed with the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways. PMID:25536343

  15. In vitro manganese-dependent cross-talk between Streptococcus mutans VicK and GcrR: implications for overlapping stress response pathways.

    Directory of Open Access Journals (Sweden)

    Jennifer S Downey

    Full Text Available Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, here we demonstrate for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has been previously correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR and SloR, and indicate that the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities observed with the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways.

  16. GCR Dose Rate Observed in Lunar Orbit During the Transition from Solar Cycle 23 to Cycle 24

    Science.gov (United States)

    Golightly, M. J.; Schwadron, N. A.; Spence, H. E.; Wilson, J. K.; Case, A.; Townsend, L.; Kasper, J. C.; Blake, J.; Looper, M. D.; Mazur, J.

    2010-12-01

    -12-24. In this analysis period the peak dose rates measured by the CRaTER detectors occurred during the period 2009-12-31 to 2010-01-07 followed by the onset of gradual declines. During this period the Thule neutron monitor count rate, a proxy for moderate-to-high-energy GCR proton flux at the top of the Earth’s atmosphere, peaked during the period 2009-10-01 to 2009-12-31 before undergoing a sharp decrease through the remainder of the analysis period. Assuming the peak in the Thule neutron monitor count rate marks the deepest point of solar cycle 23 and the beginning of an unabated increase in solar activity with the onset of cycle 24, these CRaTER measurements may represent the maximum GCR dose rates directly measured since the inception of space-based measurements more than 50 years ago.

  17. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes. Published by the
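    The interobserver-agreement statistic reported above, Cohen's κ, corrects raw agreement for agreement expected by chance. A minimal sketch with hypothetical observation codes (the category labels below are illustrative, not the study's coding scheme):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical codes of the same events:
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[c] / n * c2[c] / n for c in set(c1) | set(c2))  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical event codes from two observers of the same procedure:
a = ["door", "noise", "door", "talk", "door", "noise"]
b = ["door", "noise", "talk", "talk", "door", "noise"]
print(round(cohens_kappa(a, b), 3))  # 0.75
```

    Values above roughly 0.7 are conventionally read as good agreement, which is the benchmark the abstract's 0.70-1 range is implicitly compared against.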

  18. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  19. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  20. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  1. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  2. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  3. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  4. Hurricane! An Event-Based Science Module. Student Edition. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  5. Hurricane!: An Event-Based Science Module. Teacher's Guide. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about problems with hurricanes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning,…

  6. Fraud! An Event-Based Science Module. Student Edition. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  7. Fraud! An Event-Based Science Module. Teacher's Guide. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  8. Qualitative Event-based Diagnosis with Possible Conflicts Applied to Spacecraft Power Distribution Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based diagnosis enables efficient and safe operation of engineered systems. In this paper, we describe two algorithms based on a qualitative event-based fault...

  9. Pymote: High Level Python Library for Event-Based Simulation and Evaluation of Distributed Algorithms

    National Research Council Canada - National Science Library

    Arbula, Damir; Lenac, Kristijan

    2013-01-01

    .... Simulation is a fundamental part of distributed algorithm design and evaluation process. In this paper, we present a library for event-based simulation and evaluation of distributed algorithms...

  10. Prospective memory while driving: comparison of time- and event-based intentions.

    Science.gov (United States)

    Trawley, Steven L; Stephens, Amanda N; Rendell, Peter G; Groeger, John A

    2017-06-01

    Prospective memories can divert attentional resources from ongoing activities. However, it is unclear whether these effects and the theoretical accounts that seek to explain them will generalise to a complex real-world task such as driving. Twenty-four participants drove two simulated routes while maintaining a fixed headway with a lead vehicle. Drivers were given either event-based (e.g. arriving at a filling station) or time-based errands (e.g. on-board clock shows 3:30). In contrast to the predominant view in the literature which suggests time-based tasks are more demanding, drivers given event-based errands showed greater difficulty in mirroring lead vehicle speed changes compared to the time-based group. Results suggest that common everyday secondary tasks, such as scouting the roadside for a bank, may have a detrimental impact on driving performance. The additional finding that this cost was only evident with the event-based task highlights a potential area of both theoretical and practical interest. Practitioner Summary: Drivers were given either time- or event-based errands whilst engaged in a simulated drive. We examined the effect of errands on an ongoing vehicle follow task. In contrast to previous non-driving studies, event-based errands are more disruptive. Common everyday errands may have a detrimental impact on driving performance.

  11. Efficiency of Event-Based Sampling According to Error Energy Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2010-03-01

    Full Text Available The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.

  12. Efficiency of event-based sampling according to error energy criterion.

    Science.gov (United States)

    Miskowicz, Marek

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.
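    The two event-based schemes compared in this abstract can be contrasted numerically. The sketch below implements send-on-delta (sample when the signal has moved by δ since the last sample) and the error-energy criterion (sample when the accumulated energy of the sampling error reaches ψ) on a sine signal; the thresholds are illustrative, not taken from the paper:

```python
import math

def send_on_delta(x, delta):
    """Trigger a sample whenever the signal has changed by >= delta
    since the last sample; return the triggered sample indices."""
    idx = [0]
    for i, v in enumerate(x):
        if abs(v - x[idx[-1]]) >= delta:
            idx.append(i)
    return idx

def error_energy(x, dt, psi):
    """Trigger a sample when the accumulated energy of the sampling
    error, the integral of (x(t) - x_last)^2 dt, reaches psi."""
    idx, acc = [0], 0.0
    for i, v in enumerate(x):
        acc += (v - x[idx[-1]]) ** 2 * dt
        if acc >= psi:
            idx.append(i)
            acc = 0.0
    return idx

dt = 1e-3
x = [math.sin(2 * math.pi * i * dt) for i in range(2000)]  # two sine periods
n_sod = len(send_on_delta(x, 0.1))      # illustrative threshold delta
n_ee = len(error_energy(x, dt, 1e-4))   # illustrative threshold psi
print(n_sod, n_ee)
```

    Both schemes take far fewer samples than the 2000 periodic ones; tuning δ and ψ so the two schemes match in total error would reproduce the kind of effectiveness comparison the paper carries out analytically.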

  13. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
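    The core idea of event-based fault signatures can be sketched as follows: each fault predicts a qualitative deviation for each measurement, and a diagnoser intersects the candidate faults consistent with the measurement-deviation events as they arrive. This is a simplified single-diagnoser illustration, not the paper's distributed algorithm, and the fault and measurement names are hypothetical:

```python
# Each fault is modeled by a qualitative signature: the deviation
# (+ increase, - decrease, 0 no change) each measurement shows once
# the fault manifests.  Names below are hypothetical two-tank examples.
SIGNATURES = {
    "tank1_leak":   {"p1": "-", "p2": "-"},
    "pipe12_block": {"p1": "+", "p2": "-"},
    "tank2_leak":   {"p1": "0", "p2": "-"},
}

def diagnose(events):
    """Intersect the fault candidates consistent with each observed
    (measurement, qualitative deviation) event, in order of arrival."""
    candidates = set(SIGNATURES)
    for meas, dev in events:
        candidates = {f for f in candidates if SIGNATURES[f][meas] == dev}
    return candidates

print(diagnose([("p2", "-")]))               # all three faults still possible
print(diagnose([("p2", "-"), ("p1", "+")]))  # isolated to pipe12_block
```

    The diagnosability analysis mentioned in the abstract amounts to checking that, for every fault pair, some shared measurement eventually produces a distinguishing event of this kind.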

  14. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    Science.gov (United States)

    O'Shea, Jesse

    2017-05-01

    Internet access and usage has changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates. These systems circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet, innovations and the functionality of these systems can change rapidly. To update the current state of knowledge on event-based Internet biosurveillance systems by identifying all systems, including current functionality, with hopes to aid decision makers with whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through PubMed, Scopus, and Google Scholar databases, while also including grey literature and other publication types. 50 event-based Internet systems were identified, including an extraction of 15 attributes for each system, described in 99 articles. Each system uses different innovative technology and data sources to gather data, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, cataloguing all event-based Internet biosurveillance systems. By doing so, future researchers will be able to use this review as a library for referencing systems, with hopes of learning, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an

  15. Cyclone Codes

    OpenAIRE

    Schindelhauer, Christian; Jakoby, Andreas; Köhler, Sven

    2016-01-01

    We introduce Cyclone codes which are rateless erasure resilient codes. They combine Pair codes with Luby Transform (LT) codes by computing a code symbol from a random set of data symbols using bitwise XOR and cyclic shift operations. The number of data symbols is chosen according to the Robust Soliton distribution. XOR and cyclic shift operations establish a unitary commutative ring if data symbols have a length of $p-1$ bits, for some prime number $p$. We consider the graph given by code sym...
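    The encoding step described above, an XOR of cyclically shifted (p − 1)-bit data symbols, can be sketched as follows. For brevity a fixed degree stands in for the Robust Soliton draw, and the decoding check simply re-applies the same shifted sources to show the XOR structure cancels:

```python
import random

P = 7          # a prime; data symbols are P - 1 = 6 bits long
W = P - 1

def rot(sym, k):
    """Cyclic left shift of a W-bit symbol by k positions."""
    k %= W
    return ((sym << k) | (sym >> (W - k))) & ((1 << W) - 1)

def encode(data, degree, rng):
    """One Cyclone-style code symbol: the XOR of `degree` randomly chosen
    data symbols, each under a random cyclic shift.  (In the actual scheme
    the degree is drawn from the Robust Soliton distribution; a fixed
    degree is used here for brevity.)"""
    picks = rng.sample(range(len(data)), degree)
    shifts = [rng.randrange(W) for _ in picks]
    out = 0
    for i, k in zip(picks, shifts):
        out ^= rot(data[i], k)
    return picks, shifts, out

rng = random.Random(1)
data = [rng.randrange(1 << W) for _ in range(8)]
picks, shifts, sym = encode(data, 3, rng)

# Consistency check: XOR-ing the same shifted sources back yields zero
check = sym
for i, k in zip(picks, shifts):
    check ^= rot(data[i], k)
print(check)  # 0
```

    The choice of symbol length p − 1 for prime p is what makes XOR together with cyclic shift behave as multiplication in a unitary commutative ring, which the full decoder exploits.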

  16. Event-Based Proof of the Mutual Exclusion Property of Peterson’s Algorithm

    Directory of Open Access Journals (Sweden)

    Ivanov Ievgen

    2015-12-01

    Full Text Available Proving properties of distributed algorithms is still a highly challenging problem and various approaches that have been proposed to tackle it [1] can be roughly divided into state-based and event-based proofs. Informally speaking, state-based approaches define the behavior of a distributed algorithm as a set of sequences of memory states during its executions, while event-based approaches treat the behaviors by means of events which are produced by the executions of an algorithm. Of course, combined approaches are also possible.
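    A complementary, fully mechanical way to check the mutual exclusion property is to enumerate every interleaving of Peterson's algorithm for two processes. The sketch below (a brute-force state-space exploration, not the event-based proof of the paper) treats each labelled statement as one atomic step and verifies that no reachable state has both processes in the critical section:

```python
# Per-process program counter meanings:
# 0: set own flag   1: give away turn   2: busy-wait   3: critical section   4: done

def step(state, p):
    """Execute one atomic step of process p; a waiting process whose
    guard holds simply stays in place."""
    flags, turn, pcs = state
    q = 1 - p
    pc = pcs[p]
    flags, pcs = list(flags), list(pcs)
    if pc == 0:
        flags[p] = True
    elif pc == 1:
        turn = q
    elif pc == 2:
        if flags[q] and turn == q:
            return (tuple(flags), turn, tuple(pcs))   # still waiting
    # pc == 3 falls through: leave the critical section (flag reset
    # omitted since each process enters only once in this model)
    pcs[p] = pc + 1
    return (tuple(flags), turn, tuple(pcs))

def explore():
    """Exhaustively explore all interleavings; return True iff both
    processes are never simultaneously at pc == 3."""
    init = ((False, False), 0, (0, 0))
    seen, frontier, safe = {init}, [init], True
    while frontier:
        st = frontier.pop()
        if st[2] == (3, 3):          # both in the critical section
            safe = False
        for p in (0, 1):
            if st[2][p] < 4:
                nxt = step(st, p)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return safe

print(explore())  # True: mutual exclusion holds in every interleaving
```

    This is the state-based view the abstract contrasts with event-based proofs: correctness is phrased over sets of reachable memory states rather than over the events an execution produces.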

  17. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally we consider also some relationships between coding partitions and varieties of codes.
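    Unique decipherability itself, the baseline condition this record weakens, is decidable for finite codes by the classical Sardinas-Patterson procedure: iterate sets of dangling suffixes and check whether any of them ever contains a full codeword. A minimal sketch:

```python
def is_uniquely_decipherable(code):
    """Sardinas-Patterson test: a finite code is UD iff no iterated set
    of dangling suffixes ever contains a full codeword."""
    code = set(code)

    def dangling(a, b):
        # suffixes left over when a word of `a` is a proper prefix of one of `b`
        return {y[len(x):] for x in a for y in b if x != y and y.startswith(x)}

    current = dangling(code, code)
    seen = set()
    while current:
        if current & code:
            return False        # a dangling suffix is itself a codeword
        key = frozenset(current)
        if key in seen:
            return True         # the sets cycle: no ambiguity can arise
        seen.add(key)
        current = dangling(code, current) | dangling(current, code)
    return True

print(is_uniquely_decipherable(["0", "10", "110"]))  # prefix code -> True
print(is_uniquely_decipherable(["a", "ab", "ba"]))   # "aba" is ambiguous -> False
```

    Codes rejected by this test are exactly the ones for which the coding-partition machinery of the paper becomes interesting, since unique decipherability may still hold between the classes of a partition.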

  18. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Documentation and Evaluation of Coding Class ("Dokumentation og evaluering af Coding Class").

  19. code {poems}

    Directory of Open Access Journals (Sweden)

    Ishac Bertran

    2012-08-01

    Full Text Available "Exploring the potential of code to communicate at the level of poetry," the code {poems} project solicited submissions from code writers in response to the notion of a poem, written in a software language which is semantically valid. These selections reveal the inner workings, constitutive elements, and styles of both a particular software and its authors.

  20. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    Science.gov (United States)

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  2. Event-based prospective memory in depression: The impact of cue focality

    NARCIS (Netherlands)

    Altgassen, A.M.; Kliegel, M.; Martin, M.

    2009-01-01

    This study is the first to compare event-based prospective memory performance in individuals with depression and healthy controls. The degree to which self-initiated processing is required to perform the prospective memory task was varied. Twenty-eight individuals with depression and 32 healthy

  3. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...

  4. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM), or at a specific time (i.e., time-based PM), while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  5. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.
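    The "timer variability" referred to above is conventionally separated from motor implementation noise with the Wing-Kristofferson decomposition, which recovers central (clock) and motor variance from the variance and lag-1 autocovariance of the inter-response intervals. A sketch on synthetic tapping data (all parameter values are illustrative):

```python
import random

def wing_kristofferson(intervals):
    """Wing-Kristofferson decomposition of inter-response-interval
    variance into central timer and motor components, using
      var(I)     = s_clock^2 + 2 * s_motor^2
      autocov(1) = -s_motor^2
    Estimates are clamped at zero, as is standard practice."""
    n = len(intervals)
    m = sum(intervals) / n
    var = sum((x - m) ** 2 for x in intervals) / (n - 1)
    acov1 = sum((intervals[i] - m) * (intervals[i + 1] - m)
                for i in range(n - 1)) / (n - 1)
    motor_var = max(-acov1, 0.0)
    clock_var = max(var - 2 * motor_var, 0.0)
    return clock_var, motor_var

# Synthetic taps: interval i = clock_i + motor_(i+1) - motor_i (ms)
rng = random.Random(42)
clock = [rng.gauss(500, 10) for _ in range(4000)]   # central timer, sd 10 ms
motor = [rng.gauss(0, 5) for _ in range(4001)]      # motor delays, sd 5 ms
intervals = [c + motor[i + 1] - motor[i] for i, c in enumerate(clock)]
cv, mv = wing_kristofferson(intervals)
print(round(cv), round(mv))  # roughly 100 and 25 (the generating variances)
```

    The negative lag-1 autocovariance exploited here is itself the signature of event-based timing; emergently timed tasks like circle drawing typically do not show it, which is one way the two timing modes are dissociated.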

  6. Solar Energetic Particles (SEP) and Galactic Cosmic Rays (GCR) as tracers of solar wind conditions near Saturn: Event lists and applications

    Science.gov (United States)

    Roussos, E.; Jackman, C. M.; Thomsen, M. F.; Kurth, W. S.; Badman, S. V.; Paranicas, C.; Kollmann, P.; Krupp, N.; Bučík, R.; Mitchell, D. G.; Krimigis, S. M.; Hamilton, D. C.; Radioti, A.

    2018-01-01

The lack of an upstream solar wind monitor poses a major challenge to any study that investigates the influence of the solar wind on the configuration and the dynamics of Saturn's magnetosphere. Here we show how Cassini MIMI/LEMMS observations of Solar Energetic Particle (SEP) and Galactic Cosmic Ray (GCR) transients, which are both linked to energetic processes in the heliosphere such as Interplanetary Coronal Mass Ejections (ICMEs) and Corotating Interaction Regions (CIRs), can be used to trace enhanced solar wind conditions at Saturn's distance. SEP protons can be easily distinguished from magnetospheric ions, particularly in the MeV energy range. Many SEPs are also accompanied by strong GCR Forbush decreases. GCRs are detectable as a low count-rate noise signal in a large number of LEMMS channels. As SEPs and GCRs can easily penetrate into the outer and middle magnetosphere, they can be monitored continuously, even when Cassini is not situated in the solar wind. A survey of the MIMI/LEMMS dataset between 2004 and 2016 resulted in the identification of 46 SEP events. Most events last more than two weeks and have their lowest occurrence rate around the extended solar minimum between 2008 and 2010, suggesting that they are associated with ICMEs rather than CIRs, which are the main source of activity during the declining phase and the minimum of the solar cycle. We also list 17 time periods ( > 50 days each) where GCRs show a clear solar periodicity ( ∼ 13 or 26 days). The 13-day period that derives from two CIRs per solar rotation dominates over the 26-day period in only one of the 17 cases catalogued. This interval belongs to the second half of 2008, when expansions of Saturn's electron radiation belts were previously reported to show a similar periodicity. That observation not only links the variability of Saturn's electron belts to solar wind processes, but also indicates that the source of the observed periodicity in GCRs may be local. In this case GCR…

  7. Benchmark problem for International Atomic Energy Agency (IAEA) coordinated research program (CRP) on gas-cooled reactor (GCR) afterheat Removal

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Shoji; Shiina, Yasuaki; Inagaki, Yoshiyuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Hishida, Makoto; Sudo, Yukio

    1997-12-31

In the IAEA CRP on 'Heat Transport and Afterheat Removal for GCRs under Accident Conditions', experimental data from JAERI's cooling panel test apparatus were selected as benchmark problems to verify the validity of computational codes for the design and evaluation of the heat transfer performance and temperature distribution of components in the cooling panel system of the HTGR. The test apparatus was composed of a pressure vessel (P.V) 1 m in diameter and 3 m in height, containing heaters with a maximum heating rate of 100 kW simulating decay heat, cooling panels surrounding the P.V, and the reactor cavity occupied by air at atmospheric pressure. Seven experimental data sets were established as benchmark problems to evaluate the effect of natural convection of superheated gas on the temperature distribution of the P.V and the heat transfer performance of both the water and the air cooling panel systems. The analytical code THANPACST2 was applied to two benchmark problems to verify the validity of the analytical methods and models proposed. Under conditions of helium gas pressure of 0.73 MPa and temperature of 210°C in the P.V of the water cooling panel system, temperatures of the P.V were well estimated within errors of -14% to +27% compared with the experimental data. The analyses indicated that the heat transferred to the cooling panel was 11.4% less than the experimental value and that the heat transferred by thermal radiation was 74.4% of the total heat input. (author)

  8. Sharing code.

    Science.gov (United States)

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  9. Analog Coding.

    Science.gov (United States)

(CODING, ANALOG SYSTEMS), INFORMATION THEORY, DATA TRANSMISSION SYSTEMS, TRANSMITTER RECEIVERS, WHITE NOISE, PROBABILITY, ERRORS, PROBABILITY DENSITY FUNCTIONS, DIFFERENTIAL EQUATIONS, SET THEORY, COMPUTER PROGRAMS

  10. Decentralized Event-Based Communication Strategy on Leader-Follower Consensus Control

    Directory of Open Access Journals (Sweden)

    Duosi Xie

    2016-01-01

Full Text Available This paper addresses the leader-follower consensus problem of networked systems by using a decentralized event-based control strategy. The event-based control strategy makes the controllers of agents update at aperiodic event instants. Two decentralized event functions are designed to generate these event instants. In particular, the second event function uses only its own information and the neighbors' states at their latest event instants. By using this event function, no continuous communication among followers is required. As the followers communicate only at these discrete event instants, this strategy is able to save communication and reduce channel occupation. It is analytically shown that the leader-follower networked system is able to reach consensus by utilizing the proposed control strategy. Simulation examples illustrate the effectiveness of the proposed control strategy.
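
The thresholded-broadcast idea described in this abstract can be sketched in a minimal simulation: each follower integrates a consensus law computed from the last-broadcast states, and broadcasts its own state only when it drifts past a local threshold. Everything below (the chain graph, gains, threshold, and leader value) is an illustrative assumption, not the paper's actual design.

```python
import numpy as np

def leader_follower_consensus(steps=20000, dt=0.01, delta=0.001):
    """Event-based leader-follower consensus sketch: followers broadcast
    their state only when it drifts more than `delta` from the value they
    last broadcast, so no continuous communication is needed."""
    leader = 1.0                       # static leader state (assumed)
    x = np.array([0.0, 0.5, -0.5])     # follower states (illustrative)
    x_hat = x.copy()                   # last-broadcast values
    # follower neighbor sets: a simple chain 0-1-2 (assumed topology)
    neighbors = {0: [1], 1: [0, 2], 2: [1]}
    broadcasts = 0
    for _ in range(steps):
        u = np.zeros(3)
        for i in range(3):
            for j in neighbors[i]:
                u[i] += x_hat[j] - x_hat[i]
        u[0] += leader - x_hat[0]      # follower 0 is pinned to the leader
        x += dt * u                    # controllers use only broadcast data
        # decentralized event function: broadcast on local threshold crossing
        for i in range(3):
            if abs(x[i] - x_hat[i]) > delta:
                x_hat[i] = x[i]
                broadcasts += 1
    return x, broadcasts
```

With these assumed parameters the followers settle near the leader's value while broadcasting far less often than a periodic scheme would transmit, which is the communication saving the abstract describes.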

  11. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.
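
The communication saving claimed here can be illustrated with a toy comparison between a send-on-delta event trigger and a periodic transmission schedule on a known signal. The signal, threshold, and rates below are illustrative assumptions, not the robot experiments of the paper.

```python
import math

def compare_strategies(delta=0.1, dt=0.01, periodic_every=10):
    """Compare send-on-delta event triggering with periodic transmission
    for reporting x(t) = sin(t) over one full cycle."""
    n = int(2 * math.pi / dt)
    last_sent = 0.0
    n_event, n_periodic, max_err = 0, 0, 0.0
    for k in range(n):
        x = math.sin(k * dt)
        if abs(x - last_sent) > delta:      # event-based: send on threshold
            last_sent = x
            n_event += 1
        if k % periodic_every == 0:         # periodic baseline
            n_periodic += 1
        # error between the true state and the last value the controller saw
        max_err = max(max_err, abs(x - last_sent))
    return n_event, n_periodic, max_err
```

Under these assumptions the event-based scheme transmits fewer messages than the periodic one while the reporting error stays bounded by the threshold (plus one integration step).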

  12. An event-based neurobiological recognition system with orientation detector for objects in multiple orientations

    Directory of Open Access Journals (Sweden)

    Hanyu Wang

    2016-11-01

Full Text Available A new multiple-orientation event-based neurobiological recognition system is proposed in this paper by integrating recognition and tracking functions; it is used for asynchronous address-event representation (AER) image sensors. The system has been enriched to recognize objects in multiple orientations with training samples moving in only a single orientation. The system extracts multi-scale and multi-orientation line features inspired by models of the primate visual cortex. An orientation detector based on a modified Gaussian blob tracking algorithm is introduced for object tracking and orientation detection. The orientation detector and the feature extraction block work simultaneously, without any increase in categorization time. An address lookup table (address LUT) is also presented to adjust the feature maps by address mapping and reordering, and they are categorized in the trained spiking neural network. The recognition system is evaluated with the MNIST dataset, which has played an important role in the development of computer vision, and the accuracy is increased owing to the use of both ON and OFF events. AER data acquired by a DVS, such as moving digits, pokers, and vehicles, are also tested on the system. The experimental results show that the proposed system can realize event-based multi-orientation recognition. The work presented in this paper makes a number of contributions to event-based vision processing for multi-orientation object recognition. It develops a new tracking-recognition architecture for the feedforward categorization system and an address reordering approach to classify multi-orientation objects using event-based data. It provides a new way to recognize objects in multiple orientations with samples in only a single orientation.

  13. Event-based media processing and analysis: A survey of the literature

    OpenAIRE

    Tzelepis, Christos; Ma, Zhigang; MEZARIS, Vasileios; Ionescu, Bogdan; Kompatsiaris, Ioannis; Boato, Giulia; Sebe, Nicu; Yan, Shuicheng

    2016-01-01

    Research on event-based processing and analysis of media is receiving an increasing attention from the scientific community due to its relevance for an abundance of applications, from consumer video management and video surveillance to lifelogging and social media. Events have the ability to semantically encode relationships of different informational modalities, such as visual-audio-text, time, involved agents and objects, with the spatio-temporal component of events being a key feature for ...

  14. The Link Between Alcohol Use and Aggression Toward Sexual Minorities: An Event-Based Analysis

    OpenAIRE

    Parrott, Dominic J.; Gallagher, Kathryn E.; Vincent, Wilson; Bakeman, Roger

    2010-01-01

    The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men’s alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men between the ages of 18–30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexua...

  15. Valenced Cues and Contexts Have Different Effects on Event-Based Prospective Memory

    OpenAIRE

    Peter Graf; Martin Yu

    2015-01-01

    This study examined the separate influence and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when ...

  16. Detection of vulnerable relays and sensitive controllers under cascading events based on performance indices

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Hu, Yanting

    2014-01-01

A detection strategy based on performance indices is proposed to identify the vulnerable relays and sensitive controllers under overloading situations during cascading events. Based on the impedance margin sensitivity, diverse performance indices are proposed to help improve this detection. A study case of voltage-instability-induced cascaded blackout, built in a real-time digital simulator (RTDS), is used to demonstrate the proposed strategy. The simulation results indicate that this strategy can effectively detect the vulnerable relays and sensitive controllers under overloading situations.

  17. Personalized Event-Based Surveillance and Alerting Support for the Assessment of Risk

    OpenAIRE

    Stewar, Avaré; Lage, Ricardo; Diaz-Aviles, Ernesto; Dolog, Peter

    2011-01-01

    In a typical Event-Based Surveillance setting, a stream of web documents is continuously monitored for disease reporting. A structured representation of the disease reporting events is extracted from the raw text, and the events are then aggregated to produce signals, which are intended to represent early warnings against potential public health threats. To public health officials, these warnings represent an overwhelming list of "one-size-fits-all" information for risk assessment. To reduce ...

  18. Divergence coding for convolutional codes

    Directory of Open Access Journals (Sweden)

    Valery Zolotarev

    2017-01-01

Full Text Available In this paper we propose a new coding/decoding scheme based on the divergence principle. The new divergent multithreshold decoder (MTD) for convolutional self-orthogonal codes contains two threshold elements; the second threshold element decodes a code whose code distance is one greater than that of the first threshold element. The error-correcting capability of the new MTD modification is higher than that of the traditional MTD. Simulation results show that the performance of the divergent schemes brings their region of effective operation approximately 0.5 dB closer to channel capacity. Note that if a sufficiently effective Viterbi decoder is used instead of the first threshold element, the divergence principle can achieve even more. Index Terms — error-correcting coding, convolutional code, decoder, multithreshold decoder, Viterbi algorithm.

  19. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    Science.gov (United States)

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence the timing mechanisms for repetitive IWFEs in different ways. Sets of IWFEs were analyzed by the windowed (lag one) autocorrelation-wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.

  20. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. This paper investigates event-based sampling according to the integral criterion. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined, and related work on adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic and the asymptotic integral sampling effectiveness are derived, and simulation results verifying the analytical formulas are reported. The effectiveness of integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness is exemplified for common signals that model the state evolution of dynamic systems in time.
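
The difference between the send-on-delta (level-crossing) criterion and the integral (send-on-area) criterion can be sketched as follows. The test signal and thresholds are illustrative assumptions; the paper's analytical formulas are not reproduced here.

```python
import math

def sample_counts(dt=0.001, horizon=2 * math.pi, delta=0.1, theta=0.01):
    """Count samples produced by send-on-delta versus integral
    (send-on-area) event-based sampling of x(t) = sin(t)."""
    n = int(horizon / dt)
    last_sod = last_int = 0.0   # last reported values for each scheme
    area = 0.0                  # accumulated error area (integral criterion)
    n_sod = n_int = 0
    for k in range(1, n):
        x = math.sin(k * dt)
        # send-on-delta: sample when the level deviation exceeds delta
        if abs(x - last_sod) > delta:
            last_sod = x
            n_sod += 1
        # integral criterion: sample when the accumulated area of the
        # deviation since the last sample exceeds theta, then reset
        area += abs(x - last_int) * dt
        if area > theta:
            last_int = x
            area = 0.0
            n_int += 1
    return n_sod, n_int
```

For the sine test signal, the send-on-delta count is set by the signal's total variation divided by the threshold, whereas the integral criterion's count depends on how fast the error area accumulates; both report far less often than periodic sampling at the same time step.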

  1. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech…; alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market’s emptying out of possibilities for free…

  2. openQ*D simulation code for QCD+QED arXiv

    CERN Document Server

    Campos, Isabel; Hansen, Martin; Marinković, Marina Krstić; Patella, Agostino; Ramos, Alberto; Tantalo, Nazario

The openQ*D code for the simulation of QCD+QED with C$^\star$ boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, the multilevel integrator, the 4th order OMF integrator, the SSE/AVX intrinsics, etc. The photon field is treated as fully dynamical and C$^\star$ boundary conditions can be chosen in the spatial directions. We discuss the main features of openQ*D, and we show basic test results and performance analysis. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.

  3. The Effects of Age and Cue-Action Reminders on Event-Based Prospective Memory Performance in Preschoolers

    Science.gov (United States)

    Kliegel, Matthias; Jager, Theodor

    2007-01-01

    The present study investigated event-based prospective memory in five age groups of preschoolers (i.e., 2-, 3-, 4-, 5-, and 6-year-olds). Applying a laboratory-controlled prospective memory procedure, the data showed that event-based prospective memory performance improves across the preschool years, at least between 3 and 6 years of age. However,…

  4. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  5. Coding labour

    National Research Council Canada - National Science Library

    McCosker, Anthony; Milne, Esther

    2014-01-01

    ... software. Code encompasses the laws that regulate human affairs and the operation of capital, behavioural mores and accepted ways of acting, but it also defines the building blocks of life as DNA...

  6. A differential deficit in time- versus event-based prospective memory in Parkinson's disease.

    Science.gov (United States)

    Raskin, Sarah A; Woods, Steven Paul; Poquette, Amelia J; McTaggart, April B; Sethna, Jim; Williams, Rebecca C; Tröster, Alexander I

    2011-03-01

    The aim of the current study was to clarify the nature and extent of impairment in time- versus event-based prospective memory in Parkinson's disease (PD). Prospective memory is thought to involve cognitive processes that are mediated by prefrontal systems and are executive in nature. Given that individuals with PD frequently show executive dysfunction, it is important to determine whether these individuals may have deficits in prospective memory that could impact daily functions, such as taking medications. Although it has been reported that individuals with PD evidence impairment in prospective memory, it is still unclear whether they show a greater deficit for time- versus event-based cues. Fifty-four individuals with PD and 34 demographically similar healthy adults were administered a standardized measure of prospective memory that allows for a direct comparison of time-based and event-based cues. In addition, participants were administered a series of standardized measures of retrospective memory and executive functions. Individuals with PD demonstrated impaired prospective memory performance compared to the healthy adults, with a greater impairment demonstrated for the time-based tasks. Time-based prospective memory performance was moderately correlated with measures of executive functioning, but only the Stroop Neuropsychological Screening Test emerged as a unique predictor in a linear regression. Findings are interpreted within the context of McDaniel and Einstein's (2000) multiprocess theory to suggest that individuals with PD experience particular difficulty executing a future intention when the cue to execute the prescribed intention requires higher levels of executive control. (c) 2011 APA, all rights reserved

  7. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not, All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  8. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting the performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
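
The effect of bank size on vector efficiency under the constant-event-time assumption can be illustrated with a toy scheduler. The per-particle event counts, vector width, and scheduling rule below are illustrative assumptions, not taken from the OpenMC runs or the paper's model.

```python
def vector_efficiency(bank_size, width=8):
    """Toy event-based transport model: each 'vector iteration' applies one
    event to up to `width` live particles; efficiency is the fraction of
    vector lanes doing useful work.  Each particle needs a fixed,
    deterministic number of events (a stand-in for its history length)."""
    remaining = [(i % 15) + 1 for i in range(bank_size)]  # events per particle
    total_events = sum(remaining)
    iterations = 0
    while any(r > 0 for r in remaining):
        # fill the vector with the first `width` particles still alive
        live = [i for i, r in enumerate(remaining) if r > 0][:width]
        for i in live:          # one event per selected particle
            remaining[i] -= 1
        iterations += 1
    return total_events / (iterations * width)
```

With these assumptions, a small bank drains below the vector width early and wastes lanes in the tail, while a bank many times the vector width keeps the lanes full for most iterations, mirroring the paper's observation that the bank must be much larger than the vector width for high efficiency.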

  9. Event-Based Control for Average Consensus of Wireless Sensor Networks with Stochastic Communication Noises

    Directory of Open Access Journals (Sweden)

    Chuan Ji

    2013-01-01

Full Text Available This paper focuses on the average consensus problem for wireless sensor networks (WSNs) with fixed and Markovian-switching, undirected and connected network topologies in a noisy environment. An event-based protocol is applied to each sensor node to reach consensus. An event-triggering strategy is designed based on a Lyapunov function. Under the event-trigger condition, some sufficient conditions for average consensus in mean square are obtained. Finally, some numerical simulations are given to illustrate the effectiveness of the results derived in this paper.

  10. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
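
One classic example of coding speech directly as a waveform, not drawn from this abstract but standard in digital telephony (ITU-T G.711), is μ-law companding: amplitudes are compressed logarithmically before uniform quantization, so quiet speech suffers far less quantization error than it would under direct uniform coding. The sketch below uses the standard μ = 255, 8-bit convention.

```python
import math

MU = 255.0  # standard North American companding constant

def mu_compress(x):
    """Map x in [-1, 1] to y in [-1, 1], expanding small amplitudes."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_expand(y):
    """Inverse of mu_compress."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

def quantize(v, levels=256):
    """Uniform quantizer on [-1, 1] with the given number of levels."""
    step = 2.0 / (levels - 1)
    return round((v + 1.0) / step) * step - 1.0

def mu_law_codec(x):
    """8-bit mu-law encode/decode round trip for one sample."""
    return mu_expand(quantize(mu_compress(x)))
```

For a quiet sample such as x = 0.01, the companded round trip reconstructs the value with much smaller error than direct 8-bit uniform quantization of the waveform, which is exactly the perceptual trade-off waveform coders of this family exploit.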

  11. Event-based prospective memory in mildly and severely autistic children.

    Science.gov (United States)

    Sheppard, Daniel P; Kvavilashvili, Lia; Ryder, Nuala

    2016-01-01

There is a growing body of research into the development of prospective memory (PM) in typically developing children, but research is limited in autistic children (Aut) and rarely includes children with more severe symptoms. This study is the first to specifically compare event-based PM in severely autistic children to that in mildly autistic and typically developing children. Fourteen mildly autistic children and 14 severely autistic children, aged 5-13 years, were matched for educational attainment with 26 typically developing children aged 5-6 years. Three PM tasks and a retrospective memory task were administered. Results showed that severely autistic children performed less well than typically developing children on two PM tasks, but mildly autistic children did not differ from either group. No group differences were found on the most motivating (a toy reward) task. The findings suggest naturalistic tasks and motivation are important factors in PM success in severely autistic children and highlight the need to consider the heterogeneity of autism and symptom severity in relation to performance on event-based PM tasks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. The link between alcohol use and aggression toward sexual minorities: an event-based analysis.

    Science.gov (United States)

    Parrott, Dominic J; Gallagher, Kathryn E; Vincent, Wilson; Bakeman, Roger

    2010-09-01

The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men's alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men between the ages of 18 and 30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexual minorities. Results indicated that aggression toward sexual minorities was twice as likely on days when drinking was reported as on nondrinking days, with over 80% of alcohol-related aggressive acts perpetrated within the group context. Patterns of alcohol use (i.e., number of drinking days, mean drinks per drinking day, number of heavy drinking days) were not associated with perpetration after controlling for demographic variables and pertinent risk factors. Results suggest that it is the acute effects of alcohol, and not men's patterns of alcohol consumption, that facilitate aggression toward sexual minorities. More importantly, these data are the first to support an event-based link between alcohol use and aggression toward sexual minorities (or any minority group), and provide the impetus for future research to examine risk factors and mechanisms for intoxicated aggression toward sexual minorities and other stigmatized groups.

  13. Valenced cues and contexts have different effects on event-based prospective memory.

    Science.gov (United States)

    Graf, Peter; Yu, Martin

    2015-01-01

    This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  14. Valenced cues and contexts have different effects on event-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Peter Graf

    This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  15. Social importance enhances prospective memory: evidence from an event-based task.

    Science.gov (United States)

    Walter, Stefan; Meier, Beat

    2017-07-01

    Prospective memory performance can be enhanced by task importance, for example by promising a reward. Typically, this comes at costs in the ongoing task. However, previous research has suggested that social importance (e.g., providing a social motive) can enhance prospective memory performance without additional monitoring costs in activity-based and time-based tasks. The aim of the present study was to investigate the influence of social importance in an event-based task. We compared four conditions: social importance, promising a reward, both social importance and promising a reward, and standard prospective memory instructions (control condition). The results showed enhanced prospective memory performance for all importance conditions compared to the control condition. Although ongoing task performance was slowed in all conditions with a prospective memory task when compared to a baseline condition with no prospective memory task, additional costs occurred only when both the social importance and reward were present simultaneously. Alone, neither social importance nor promising a reward produced an additional slowing when compared to the cost in the standard (control) condition. Thus, social importance and reward can enhance event-based prospective memory at no additional cost.

  16. Event-based Plausibility Immediately Influences On-line Language Comprehension

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222

  17. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  18. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming packets.
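    The key observation can be made concrete with the classic butterfly-network example (not part of this record; the function names below are illustrative): an intermediate node sends the XOR of two packets through a shared bottleneck link, and each sink recovers the bit it did not receive directly by XORing once more.

```python
# Butterfly-network sketch: two sources send bits b1 and b2 to two sinks
# over a shared bottleneck link. Plain forwarding can pass only one bit
# at a time through the bottleneck; coding sends b1 XOR b2 instead,
# letting each sink recover both bits.

def bottleneck_coded(b1: int, b2: int) -> int:
    """The intermediate node transmits the XOR of its two inputs."""
    return b1 ^ b2

def sink_decode(direct_bit: int, coded_bit: int) -> int:
    """Each sink XORs the bit it received directly with the coded bit."""
    return direct_bit ^ coded_bit

b1, b2 = 1, 0
coded = bottleneck_coded(b1, b2)
# Sink 1 hears b1 directly and recovers b2; sink 2 does the reverse.
assert sink_decode(b1, coded) == b2
assert sink_decode(b2, coded) == b1
```

    With forwarding alone, the bottleneck must carry b1 and b2 in separate uses of the link; the XOR combines them into one transmission.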

  19. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS), at the Institut for Skole og Læring, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period November 2016 to May 2017...

  20. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah and P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp. 604-621. Permanent link: http://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  1. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes – The Sipser–Spielman Construction. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1.

  2. Event-based prospective memory in children with sickle cell disease: effect of cue distinctiveness.

    Science.gov (United States)

    McCauley, Stephen R; Pedroza, Claudia

    2010-01-01

    Event-based prospective memory (EB-PM) is the formation of an intention and remembering to perform it in response to a specific event. Currently, EB-PM performance in children with sickle cell disease (SCD) is unknown. In this study, we designed a computer-based task of EB-PM; No-Stroke, Silent-Infarct, and Overt-Stroke groups performed significantly below the demographically similar control group without SCD. Cue distinctiveness was varied to determine if EB-PM could be improved. All groups, with the exception of the Overt-Stroke group, performed significantly better with a perceptually distinctive cue. Overall, these results suggest that EB-PM can be improved significantly in many children with SCD.

  3. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    Science.gov (United States)

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  4. Contributions of cerebellar event-based temporal processing and preparatory function to speech perception.

    Science.gov (United States)

    Schwartze, Michael; Kotz, Sonja A

    2016-10-01

    The role of the cerebellum in the anatomical and functional architecture of the brain is a matter of ongoing debate. We propose that cerebellar temporal processing contributes to speech perception on a number of accounts: temporally precise cerebellar encoding and rapid transmission of an event-based representation of the temporal structure of the speech signal serves to prepare areas in the cerebral cortex for the subsequent perceptual integration of sensory information. As speech dynamically evolves in time this fundamental preparatory function may extend its scope to the predictive allocation of attention in time and supports the fine-tuning of temporally specific models of the environment. In this framework, an oscillatory account considering a range of frequencies may best serve the linking of the temporal and speech processing systems. Lastly, the concerted action of these processes may not only advance predictive adaptation to basic auditory dynamics but optimize the perceptual integration of speech. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Stabilization of Networked Distributed Systems with Partial and Event-Based Couplings

    Directory of Open Access Journals (Sweden)

    Sufang Zhang

    2015-01-01

    The stabilization problem of networked distributed systems with partial and event-based couplings is investigated. The channels, which are used to transmit different levels of information about the agents, are considered. A channel matrix is introduced to indicate the work state of the channels. An event condition is designed for each channel to govern the sampling instants of that channel. Since the event conditions are given separately for different channels, the sampling instants of the channels are mutually independent. To stabilize the system, state feedback controllers are implemented. The control signals also suffer from the two communication constraints. Sufficient conditions in terms of linear matrix inequalities are proposed to ensure the stabilization of the controlled system. Finally, a numerical example is given to demonstrate the advantage of our results.

  6. Polar Codes

    Science.gov (United States)

    2014-12-01

    added by the decoder is K/ρ + Td. By the last assumption, Td and Te are both ≤ K/ρ, so the total latency added is between 2K/ρ and 4K/ρ. For example...better resolution near the decision point. Reference [12] showed that in decoding a (1024, 512) polar code, using 6-bit LLRs resulted in performance

  7. Convolutional-Code-Specific CRC Code Design

    OpenAIRE

    Lou, Chung-Yu; Daneshrad, Babak; Wesel, Richard D.

    2015-01-01

    Cyclic redundancy check (CRC) codes check if a codeword is correctly received. This paper presents an algorithm to design CRC codes that are optimized for the code-specific error behavior of a specified feedforward convolutional code. The algorithm utilizes two distinct approaches to computing undetected error probability of a CRC code used with a specific convolutional code. The first approach enumerates the error patterns of the convolutional code and tests if each of them is detectable. Th...
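    The detectability test in the first approach reduces to polynomial division over GF(2): an error pattern goes undetected exactly when the CRC generator polynomial divides it. A minimal sketch of that test (the toy generator below is illustrative, not one of the paper's optimized CRCs):

```python
# An error pattern e(x) is undetected by a CRC exactly when the CRC
# generator polynomial g(x) divides e(x) over GF(2). Polynomials are
# represented as integers whose set bits are the coefficients.

def gf2_mod(dividend: int, divisor: int) -> int:
    """Remainder of GF(2) polynomial division (carry-less arithmetic)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

def detectable(error_pattern: int, generator: int) -> bool:
    """True if the CRC detects this error pattern (nonzero remainder)."""
    return gf2_mod(error_pattern, generator) != 0

g = 0b1011          # x^3 + x + 1, a toy 3-bit CRC generator
assert not detectable(g << 4, g)   # multiples of g(x) slip through
assert detectable(0b1, g)          # a single-bit error is caught
```

    Enumerating a convolutional code's low-weight error events and running each through such a test is the essence of the paper's first approach.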

  8. Improvement of hydrological flood forecasting through an event based output correction method

    Science.gov (United States)

    Klotz, Daniel; Nachtnebel, Hans Peter

    2014-05-01

    This contribution presents an output correction method for hydrological models. A conceptualisation of the method is presented and tested in an alpine basin in Salzburg, Austria. The aim is to develop a method which is not prone to the drawbacks of autoregressive models. Output correction methods are an attractive option for improving hydrological predictions. They are complementary to the main modelling process and do not interfere with the modelling process itself. In general, output correction models estimate the future error of a prediction and use the estimate to improve the given prediction. Different estimation techniques are available depending on the utilized information and the estimation procedure itself. Autoregressive error models are widely used for such corrections. Autoregressive models with exogenous inputs (ARX) allow the use of additional information for the error modelling, e.g. measurements from upstream basins or predicted input signals. Autoregressive models do, however, exhibit deficiencies, since the errors of hydrological models generally do not behave in an autoregressive manner. The decay of the error usually differs from an autoregressive function, and furthermore the residuals exhibit different patterns under different circumstances. As an example, one might consider different error-propagation behaviours under high- and low-flow situations or snowmelt-driven conditions. This contribution presents a conceptualisation of an event-based correction model and focuses on flood events only. The correction model uses information about the history of the residuals and exogenous variables to give an error estimate. The structure and parameters of the correction models can be adapted to given event classes. An event class is a set of flood events that exhibit a similar pattern for the residuals or the hydrological conditions. In total, four different event classes have been identified in this study. Each of them represents a different
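    The event-class idea can be sketched as follows; the classes, thresholds and decay factors below are invented for illustration and are not the study's calibrated values:

```python
# Minimal sketch of event-based output correction: pick an error model
# per event class instead of one global autoregressive fit. The classes,
# thresholds, and decay factors here are purely illustrative.

def classify_event(peak_flow: float, snowmelt: bool) -> str:
    """Assign a flood event to a class; real classes come from the study."""
    if snowmelt:
        return "snowmelt"
    return "high_flow" if peak_flow > 100.0 else "low_flow"

# Per-class decay factor applied to the last observed residual.
DECAY = {"snowmelt": 0.9, "high_flow": 0.7, "low_flow": 0.5}

def corrected_forecast(model_output: float, last_residual: float,
                       peak_flow: float, snowmelt: bool) -> float:
    """Add a class-specific estimate of the future error to the forecast."""
    cls = classify_event(peak_flow, snowmelt)
    return model_output + DECAY[cls] * last_residual

# A high-flow event where the model recently under-predicted by 12 m3/s:
assert corrected_forecast(150.0, 12.0, peak_flow=150.0,
                          snowmelt=False) == 150.0 + 0.7 * 12.0
```

    The point of the structure is that the decay of the correction differs between classes, which a single autoregressive model cannot express.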

  9. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  10. Concatenated codes with convolutional inner codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Zyablov, Viktor

    1988-01-01

    The minimum distance of concatenated codes with Reed-Solomon outer codes and convolutional inner codes is studied. For suitable combinations of parameters the minimum distance can be lower-bounded by the product of the minimum distances of the inner and outer codes. For a randomized ensemble of concatenated codes a lower bound of the Gilbert-Varshamov type is proved...
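    The product lower bound on the minimum distance can be checked on a toy concatenation; repetition codes stand in here for the Reed-Solomon outer and convolutional inner codes of the paper:

```python
# Toy check of the product bound: concatenating an outer code of minimum
# distance d_outer with an inner code of minimum distance d_inner yields
# a code of minimum distance at least d_outer * d_inner.

def rep_encode(bits, n):
    """n-fold repetition of each symbol (a code with minimum distance n)."""
    return [b for b in bits for _ in range(n)]

def min_distance(codewords):
    """Minimum Hamming distance over all pairs of distinct codewords."""
    return min(sum(a != b for a, b in zip(u, v))
               for u in codewords for v in codewords if u != v)

# Outer: 2-fold repetition of one bit (d_outer = 2);
# inner: 3-fold repetition applied per outer symbol (d_inner = 3).
outer = [rep_encode([m], 2) for m in (0, 1)]
concat = [rep_encode(cw, 3) for cw in outer]
assert min_distance(concat) >= 2 * 3
```

    The bound holds with equality here; for the Reed-Solomon/convolutional pairs in the paper it is a lower bound for suitable parameter choices.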

  11. Ant colony optimization and event-based dynamic task scheduling and staffing for software projects

    Science.gov (United States)

    Ellappan, Vijayan; Ashwini, J.

    2017-11-01

    In software development organizations, from medium- to very large-scale projects, project planning is a highly complex and challenging task, even when treated as a manual process. Software project planning must address both task scheduling and human resource allocation (also called staffing), because most of the resources in software projects are people. We propose a machine learning approach that finds scheduling solutions by learning from existing plans, together with an event-based scheduler that updates the task schedule produced by the learning algorithm in response to events such as the start of the project, the instants at which resources become free from completed tasks, and the times at which employees join or leave the project within the software development plan. Updating the schedule through the event-based scheduler makes the planning process dynamic. The approach uses system components to model the interrelated flows of tasks, errors and personnel throughout different development phases and is calibrated to industrial data. It extends previous software project management research by combining a survey-based process with a novel model, integrating it with a knowledge-based system for risk assessment and cost estimation, and using a decision-modelling platform.

  12. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, only individual sensors are usually located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from security and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging the actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
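    The commutation-saving mechanism can be sketched as a simple deadband (send-on-delta) rule; the setpoint, deadband and temperature values below are illustrative assumptions, not the paper's tuned controller:

```python
# Event-based relay sketch: the actuator is switched only when the
# measured variable leaves a deadband around the setpoint, so relay
# commutations happen on events rather than at every sampling instant.

def event_based_relay(samples, setpoint, deadband):
    """Return the actuator state after each sample; switch only on events."""
    state, history = False, []
    for temp in samples:
        if temp < setpoint - deadband:
            state = True              # too cold: heating on
        elif temp > setpoint + deadband:
            state = False             # warm enough: heating off
        # inside the deadband: keep the previous state (no commutation)
        history.append(state)
    return history

states = event_based_relay([18.0, 19.5, 20.4, 21.2, 20.8],
                           setpoint=20.0, deadband=1.0)
assert states == [True, True, True, False, False]  # only 2 switchings
```

    A periodic on/off controller would re-evaluate (and potentially toggle) the relay at every sample; the deadband turns small fluctuations into non-events.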

  13. Hydrologic Modeling in the Kenai River Watershed using Event Based Calibration

    Science.gov (United States)

    Wells, B.; Toniolo, H. A.; Stuefer, S. L.

    2015-12-01

    Understanding hydrologic changes is key for preparing for possible future scenarios. On the Kenai Peninsula in Alaska the yearly salmon runs provide a valuable stimulus to the economy. They are the focus of a large commercial fishing fleet, but also a prime tourist attraction. Modeling of anadromous waters provides a tool that assists in the prediction of future salmon run size. Beaver Creek, in Kenai, Alaska, is a lowland stream that has been modeled using the Army Corps of Engineers event-based modeling package HEC-HMS. With the use of historic precipitation and discharge data, the model was calibrated to observed discharge values. The hydrologic parameters were measured in the field or calculated, while soil parameters were estimated and adjusted during the calibration. With the calibrated parameters for HEC-HMS, discharge estimates can be used by other researchers studying the area and help guide communities and officials to make better-educated decisions regarding the changing hydrology in the area and the economic drivers tied to it.

  14. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    Science.gov (United States)

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little is known about the extent to which ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that initial acts of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, the results showed that after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of the ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggested that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component.

  15. Utilization of time varying event-based customer interruption cost load shedding schemes

    Energy Technology Data Exchange (ETDEWEB)

    Wangdee, Wijarn; Billinton, Roy [University of Saskatchewan, Saskatoon (Canada). Power System Research Group, Department of Electrical Engineering

    2005-12-01

    Load curtailments occurring under emergency conditions can have significant monetary impacts on the system customers. Customer satisfaction is becoming increasingly important in the new deregulated electric utility environment, and the customers in some jurisdictions are beginning to receive monetary compensation for power supply failures. Minimizing the customer interruption costs associated with a load curtailment event is an important factor in maintaining customer satisfaction. Customer interruption costs depend on many factors such as the customer types interrupted, the actual load demand at the time of the outage, the duration of the outage, the time of day and the day in which the outage occurs. This paper focuses on incorporating these interruption cost factors in a load shedding strategy. The load shedding algorithm was developed using an approximate event-based customer interruption cost evaluation technique to identify and determine the priority of the distribution feeders on a given bus during an emergency. The developed algorithm incorporates a time dependent feeder cost priority index (FCP). The optimum load shedding set determined using the FCP is a feeder or group of feeders that meet a capacity deficiency, and result in the lowest customer interruption cost for the specified emergency situation. This paper illustrates the algorithm development for a load shedding scheme and demonstrates the utilization of the technique on a sample load bus. (author)
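    The selection step described in the record, finding the feeder set that meets a capacity deficiency at the lowest customer interruption cost, can be sketched as follows; the feeder data and cost figures are invented, and the paper's FCP index additionally varies with time of day and outage duration:

```python
# Illustrative feeder-selection sketch: choose the feeder set that
# covers the capacity deficiency at the lowest total interruption cost.
# Feeder data and costs are invented for this example.

from itertools import combinations

# (feeder name, load in MW, interruption cost in $ for this time period)
feeders = [("F1", 5.0, 40000.0), ("F2", 3.0, 9000.0),
           ("F3", 4.0, 12000.0), ("F4", 2.0, 15000.0)]

def cheapest_shedding_set(feeders, deficiency_mw):
    """Exhaustively find the feeder set meeting the deficiency at least cost."""
    best = None
    for r in range(1, len(feeders) + 1):
        for combo in combinations(feeders, r):
            if sum(f[1] for f in combo) >= deficiency_mw:
                cost = sum(f[2] for f in combo)
                if best is None or cost < best[1]:
                    best = ([f[0] for f in combo], cost)
    return best

# A 6 MW deficiency: shedding F2 + F3 (7 MW) costs less than using F1.
assert cheapest_shedding_set(feeders, 6.0) == (["F2", "F3"], 21000.0)
```

    With time-varying costs, the same search would be re-run with the cost column evaluated for the time of day and expected outage duration of the emergency.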

  16. Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.; Chung, James S.; Walters, Ronald A.; Arthur, Ray; Bruckner-Lea, Cindy J.; Calapristi, Augustin J.; Dowling, Glenn; Hartley, David M.; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K.; McKenzie, Taylor K.; Nelson, Noele P.; Olsen, Jennifer; Pancerella, Carmen M.; Quitugua, Teresa N.; Reed, Jeremy T.; Thomas, Carla S.

    2012-03-28

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to EBB useful in an operational environment.

  17. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Social media is valuable in propagating information during disasters for its timely and available characteristics nowadays, and assists in making decisions when tagged with locations. Considering the ambiguity and inaccuracy in some social data, additional authoritative data are needed for important verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, but not from the perspective of the disaster, making it difficult to dynamically analyze the disaster. All of the disaster-related data around the affected locations need to be retrieved. To solve these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeded as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using a natural language process (NLP) and used in the semantic similarity match of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed here within the GEGIS and shows that the system would be effective when typhoons occur.

  18. Event-Based Impulsive Control of Continuous-Time Dynamic Systems and Its Application to Synchronization of Memristive Neural Networks.

    Science.gov (United States)

    Zhu, Wei; Wang, Dandan; Liu, Lu; Feng, Gang

    2017-08-18

    This paper investigates exponential stabilization of continuous-time dynamic systems (CDSs) via event-based impulsive control (EIC) approaches, where the impulsive instants are determined by certain state-dependent triggering condition. The global exponential stability criteria via EIC are derived for nonlinear and linear CDSs, respectively. It is also shown that there is no Zeno-behavior for the concerned closed loop control system. In addition, the developed event-based impulsive scheme is applied to the synchronization problem of master and slave memristive neural networks. Furthermore, a self-triggered impulsive control scheme is developed to avoid continuous communication between the master system and slave system. Finally, two numerical simulation examples are presented to illustrate the effectiveness of the proposed event-based impulsive controllers.
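    A minimal sketch of a state-dependent triggering rule of this kind, on an invented scalar plant (the gains, threshold and step size are illustrative assumptions, not the paper's design):

```python
# Event-based impulsive control sketch: an impulsive update x -> x + k*x
# is applied only when the deviation e = x_last - x since the previous
# impulse grows past a fraction sigma of the current state.

def simulate(x0=4.0, a=0.2, k=-0.6, sigma=0.5, dt=0.01, steps=2000):
    """Unstable scalar plant dx/dt = a*x, stabilized by triggered impulses."""
    x, x_last, impulses = x0, x0, 0
    for _ in range(steps):
        x += a * x * dt                        # free (unstable) flow
        if abs(x_last - x) >= sigma * abs(x):  # state-dependent event condition
            x += k * x                         # impulsive update at event time
            x_last = x                         # reset the deviation reference
            impulses += 1
    return abs(x), impulses

final, impulses = simulate()
assert final < 4.0          # trajectory contracted despite the unstable flow
assert 0 < impulses < 2000  # impulses fire only at events, not every step
```

    Between events the plant drifts freely; each triggered impulse shrinks the state, and because the trigger is state-dependent the inter-event times stay bounded away from zero (no Zeno behavior in this discretized sketch).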

  19. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Directory of Open Access Journals (Sweden)

    I. Andrés-Doménech

    2010-10-01

    Full Text Available Since the Water Framework Directive (WFD) was passed in the year 2000, the conservation of water bodies in the EU must be understood in a completely different way. Regarding combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be accepted or rejected on their intrinsic features alone; they must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSO impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise CSO frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with synthetic design storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to an hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining the results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of the analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  20. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach.

    Science.gov (United States)

    Zhang, Hongxia; Tang, Weihai; Liu, Xiping

    2017-01-01

    Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task's duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. Two factors were then orthogonally manipulated between-subjects, using a multinomial processing tree model to separate the effects of different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance, while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components showed a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components, with discrimination being the mediator between task duration and EBPM performance.
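
    The multinomial-processing-tree idea can be sketched with a toy two-parameter model: a PM response is correct only if the cue is detected (prospective component P) and the intended action is retrieved (retrospective component R). This is not the authors' model, and the counts below are invented for illustration; the parameters are recovered by grid-search maximum likelihood.

```python
# Toy two-parameter multinomial processing tree for EBPM, fit by grid MLE.
# Model and data are illustrative only, not from the study.

import math
from itertools import product

def loglik(P, R, n_correct, n_trials):
    p = P * R                              # tree: detect cue, then retrieve action
    p = min(max(p, 1e-9), 1 - 1e-9)
    return n_correct * math.log(p) + (n_trials - n_correct) * math.log(1 - p)

def fit(n_pm_correct, n_pm, n_retro_correct, n_retro):
    """Joint MLE: PM trials constrain P*R, a retrospective check constrains R."""
    grid = [i / 100 for i in range(1, 100)]
    return max(product(grid, grid), key=lambda pr:
               loglik(pr[0], pr[1], n_pm_correct, n_pm)
               + n_retro_correct * math.log(pr[1])
               + (n_retro - n_retro_correct) * math.log(1 - pr[1]))

# hypothetical data: 28/40 correct PM responses, 36/40 on a retrospective check
P_hat, R_hat = fit(28, 40, 36, 40)
print(f"P = {P_hat:.2f}, R = {R_hat:.2f}")
```

    The retrospective check pins down R near 0.9, so the PM hit rate of 0.7 is attributed mostly to the prospective component, which is exactly the kind of decomposition an MPT analysis provides.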

  2. Agreement between event-based and trend-based glaucoma progression analyses.

    Science.gov (United States)

    Rao, H L; Kumbar, T; Kumar, A U; Babu, J G; Senthil, S; Garudadri, C S

    2013-07-01

    To evaluate the agreement between event- and trend-based analyses in determining visual field (VF) progression in glaucoma. VFs of 175 glaucoma eyes with ≥5 VFs were analyzed by the proprietary software of the VF analyzer to determine progression. Agreement (κ) between trend-based analysis of the VF index (VFI) and event-based analysis (glaucoma progression analysis, GPA) was evaluated. For eyes progressing by both event- and trend-based methods, time to progression by the two methods was calculated. Median number of VFs per eye was 7 and median follow-up 7.5 years. GPA classified 101 eyes (57.7%) as stable, 30 eyes (17.1%) as possible and 44 eyes (25.2%) as likely progression. Trend-based analysis classified 122 eyes (69.7%) as stable (slope >-1% per year, or any slope magnitude with P>0.05) and 53 eyes (30.3%) as progressing. Agreement (κ) between the sensitive criteria of GPA (possible and likely progression clubbed together) and trend-based analysis was 0.48, and between the specific criteria of GPA (possible clubbed with no progression) and trend-based analysis was 0.50. In eyes progressing by the sensitive criteria of both methods (42 eyes), median time to progression by GPA (4.9 years) was similar (P=0.30) to that by the trend-based method (5.0 years). This was also similar in eyes progressing by the specific criteria of both methods (25 eyes; 5.6 years versus 5.9 years, P=0.23). Agreement between event- and trend-based progression analysis was moderate. GPA seemed to detect progression earlier than trend-based analysis, but the difference was not statistically significant.
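
    The agreement statistic used in such comparisons, Cohen's κ, is easy to compute from a cross-classification table. The 2x2 counts below are hypothetical and are not the study's data; they only illustrate the calculation.

```python
# Cohen's kappa for two progression classifications (stable vs progressing).
# Counts are hypothetical, for illustration only.

def cohens_kappa(table):
    """table[i][j] = number of eyes rated category i by method A, j by method B."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n      # observed agreement
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(len(table)))                      # chance agreement
    return (po - pe) / (1 - pe)

#                 trend: stable  progressing
table = [[100, 20],    # event: stable
         [ 25, 30]]    # event: progressing
print(f"kappa = {cohens_kappa(table):.2f}")
```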

  3. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable, high-efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for flexible storage of acquisition data to enable sophisticated post-processing of the information. Analogously to what is done in heavy-ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code, yielding positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy-ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to translate well into other fields of research.
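
    The MLE-plus-metaheuristic combination can be sketched on a toy decay curve: counts are modeled as Poisson with mean A*exp(-t/tau) + B, and the Poisson log-likelihood is maximized by simulated annealing. This is a generic illustration under invented parameters, not the paper's algorithm or data.

```python
# Poisson MLE of a decay curve A*exp(-t/tau) + B via simulated annealing.
# Synthetic data and illustrative parameters only.

import math, random

random.seed(1)
TRUE_A, TRUE_TAU, TRUE_B = 200.0, 5.0, 10.0
ts = [0.5 * i for i in range(40)]
# synthetic "measured" counts (Poisson noise via a normal approximation)
counts = [max(0, round(random.gauss(mu, math.sqrt(mu))))
          for mu in (TRUE_A * math.exp(-t / TRUE_TAU) + TRUE_B for t in ts)]

def loglik(A, tau, B):
    ll = 0.0
    for t, k in zip(ts, counts):
        mu = A * math.exp(-t / tau) + B
        if mu <= 0:
            return -1e18
        ll += k * math.log(mu) - mu      # Poisson log-likelihood, up to a constant
    return ll

def anneal(steps=20000):
    x = [100.0, 2.0, 5.0]                # initial (A, tau, B) guess
    best, best_ll = x[:], loglik(*x)
    cur_ll, T = best_ll, 50.0
    for _ in range(steps):
        T *= 0.9995                      # geometric cooling schedule
        cand = [max(1e-3, v + random.gauss(0, 0.05 * (abs(v) + 1))) for v in x]
        ll = loglik(*cand)
        # accept improvements always, worse moves with Metropolis probability
        if ll > cur_ll or random.random() < math.exp((ll - cur_ll) / T):
            x, cur_ll = cand, ll
            if ll > best_ll:
                best, best_ll = cand[:], ll
    return best

A, tau, B = anneal()
print(f"A={A:.0f} tau={tau:.2f} B={B:.1f}")
```

    The annealing stage replaces gradient-based optimization, which is the appeal of metaheuristics when the likelihood surface has multiple peaks (overlapping decay components, for instance).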

  4. Study of Event-based Sampling Techniques and Their Influence on Greenhouse Climate Control with Wireless Sensors Network

    OpenAIRE

    Pawlowski, Andrzej; Guzman, Jose L.; Rodriguez, Francisco; Berenguel, Manuel; Sanchez, Jose; Dormido, Sebastian

    2010-01-01

    This paper presents a study of event-based sampling techniques and their application to the greenhouse climate control problem. It was possible to obtain important information about data transmission and control performance for all techniques. In conclusion, it was deduced …

  5. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
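
    The basic principle the book covers can be shown in a few lines: a convolutional encoder is a shift register whose taps (generator polynomials) produce parity bits. The sketch below is the standard textbook rate-1/2, constraint-length-3 encoder with generators (7, 5) in octal; it is a generic example, not taken from the book.

```python
# Minimal rate-1/2 convolutional encoder, constraint length 3,
# generator polynomials (7, 5) octal: g0 = 1+D+D^2, g1 = 1+D^2.

G = [0b111, 0b101]            # generator taps over the 3-bit register

def conv_encode(bits):
    """Encode a bit list; two output bits per input bit, zero-terminated."""
    state = 0
    out = []
    for b in bits + [0, 0]:   # flush with K-1 = 2 zero tail bits
        reg = (b << 2) | state              # newest bit enters at the MSB
        for g in G:
            out.append(bin(reg & g).count("1") % 2)  # parity of tapped bits
        state = reg >> 1
    return out

print(conv_encode([1, 0, 1, 1]))
```

    Input 1011 yields the classic codeword 11 10 00 01 01 11; a Viterbi decoder would invert this mapping by searching the same trellis.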

  6. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    Science.gov (United States)

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and to performance assessment. The EBAT process involves several linked design steps. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 knowledge, skills, and attitudes (KSAs) were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately.
It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) …

  7. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function that depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and it is therefore not limited to events occurring in similar regions and with similar focal mechanisms to those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
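
    The matched-filter core of such a detector reduces to sliding cross-correlation of a known waveform against a continuous record. The toy below uses a synthetic wavelet standing in for the strain Green's tensor and a single station (no stacking); it illustrates the detection-function idea only, not the authors' implementation.

```python
# Toy matched-filter detection: correlate a known waveform with a noisy
# continuous trace and flag where the correlation exceeds the noise level.
# Data are synthetic; template and thresholds are illustrative.

import math, random

random.seed(0)
template = [math.sin(2 * math.pi * i / 20) * math.exp(-i / 30) for i in range(60)]

n = 2000
trace = [random.gauss(0, 0.1) for _ in range(n)]   # background noise
t0 = 700                                           # hidden event onset
for i, s in enumerate(template):
    trace[t0 + i] += s                             # buried non-impulsive event

def detection_function(trace, template):
    """Sliding cross-correlation (the time-reversal / adjoint picture)."""
    m = len(template)
    return [sum(trace[i + j] * template[j] for j in range(m))
            for i in range(len(trace) - m)]

d = detection_function(trace, template)
noise = (sum(v * v for v in d) / len(d)) ** 0.5    # RMS of the detection function
peak = max(range(len(d)), key=lambda i: d[i])
print(f"onset = {peak}, SNR = {d[peak] / noise:.1f}")
```

    In the full method the single template is replaced by synthetic Green's tensor responses per station, and the per-station correlations are stacked before thresholding.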

  8. Modelling human exposure to space radiation with different shielding: the FLUKA code coupled with anthropomorphic phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F [Department of Nuclear and Theoretical Physics, University of Pavia (Italy); Alloni, D [Department of Nuclear and Theoretical Physics, University of Pavia (Italy); Battistoni, G [INFN - National Institute of Nuclear Physics, (Italy); Cerutti, F [INFN - National Institute of Nuclear Physics (Italy)] (and others)

    2006-05-15

    Astronauts' exposure to the various components of the space radiation field is of great concern for long-term missions, especially those in deep space such as a possible voyage to Mars. Simulations based on radiation transport/interaction codes coupled with anthropomorphic model phantoms can be of great help for risk evaluation and shielding optimisation, which is therefore a crucial issue. The FLUKA Monte Carlo code can be coupled with two types of anthropomorphic phantom (a mathematical model and a 'voxel' model) to calculate organ-averaged absorbed dose, dose equivalent and 'biological' dose under different shielding conditions. Herein the 'biological dose' is represented by the average number of 'Complex Lesions' (CLs) per cell in a given organ. CLs are clustered DNA breaks previously calculated by means of event-by-event track structure simulations at the nm level and integrated on-line into FLUKA, which adopts a condensed-history approach; such lesions have been shown to play a fundamental role in chromosome aberration induction, which in turn can be correlated with carcinogenesis. Examples of calculation results will be presented relative to Galactic Cosmic Rays, as well as to the August 1972 Solar Particle Event. The contributions from primary ions and secondary particles will be shown separately, thus allowing quantification of the role played by nuclear reactions occurring in the shield and in the human body itself. As expected, the SPE doses decrease dramatically with increasing Al shielding thickness; nuclear reaction products, essentially due to target fragmentation, are of minor importance. A 10 g/cm² Al shelter proved sufficient to respect the 30-day limits for deterministic effects recommended for missions in Low Earth Orbit. In contrast to the results obtained for SPEs, the calculated GCR doses are almost independent of the Al shield thickness, and the GCR doses to internal

  9. Event-based sampling for reducing communication load in realtime human motion analysis by wireless inertial sensor networks

    Directory of Open Access Journals (Sweden)

    Laidig Daniel

    2016-09-01

    Full Text Available We examine the usefulness of event-based sampling approaches for reducing communication in inertial-sensor-based analysis of human motion. To this end we consider realtime measurement of the knee joint angle during walking, employing a recently developed sensor fusion algorithm. We simulate the effects of different event-based sampling methods on a large set of experimental data with ground truth obtained from an external motion capture system. This results in a reduced wireless communication load at the cost of a slightly increased error in the calculated angles. The proposed methods are compared in terms of the best balance between these two aspects. We show that the transmitted data can be reduced by 66% while maintaining the same level of accuracy.
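
    One common event-based scheme of the kind compared in such studies is send-on-delta: a sample is transmitted only when the signal has moved by more than a threshold since the last transmission. The sketch below applies it to a synthetic knee-angle signal; the signal model and delta value are assumptions, and the paper's methods and data are not reproduced here.

```python
# Send-on-delta sampling sketch: transmit only when the signal has changed
# by more than `delta` since the last transmitted sample. Synthetic signal.

import math

# synthetic knee angle during walking: 1 Hz gait cycle, 100 Hz sampling
signal = [30 + 30 * math.sin(2 * math.pi * t / 100) for t in range(1000)]

def send_on_delta(samples, delta):
    """Return transmitted indices plus the receiver-side zero-order-hold estimate."""
    sent, last = [0], samples[0]
    estimate = []
    for i, s in enumerate(samples):
        if abs(s - last) > delta:
            sent.append(i)
            last = s
        estimate.append(last)          # receiver holds the last received value
    return sent, estimate

sent, est = send_on_delta(signal, delta=5.0)
reduction = 100 * (1 - len(sent) / len(signal))
max_err = max(abs(a - b) for a, b in zip(signal, est))
print(f"reduction = {reduction:.0f}%, max error = {max_err:.2f} deg")
```

    The trade-off reported in the paper appears directly: a larger delta cuts more transmissions but raises the worst-case reconstruction error, which here stays bounded by delta.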

  10. Event-based prospective memory among veterans: The role of posttraumatic stress disorder symptom severity in executing intentions.

    Science.gov (United States)

    McFarland, Craig P; Clark, Justin B; Lee, Lewina O; Grande, Laura J; Marx, Brian P; Vasterling, Jennifer J

    2016-01-01

    Posttraumatic stress disorder (PTSD) has been linked with neuropsychological deficits in several areas, including attention, learning and memory, and cognitive inhibition. Although memory dysfunction is among the most commonly documented deficits associated with PTSD, our existing knowledge pertains only to retrospective memory. The current study investigated the relationship between PTSD symptom severity and event-based prospective memory (PM). Forty veterans completed a computerized event-based PM task, a self-report measure of PTSD, and measures of retrospective memory. Hierarchical regression analysis revealed that PTSD symptom severity accounted for 16% of the variance in PM performance, F(3, 36) = 3.47, p < .05, beyond the variance explained by retrospective memory. Additionally, each of the three PTSD symptom clusters was related, to varying degrees, to PM performance. Results suggest that elevated PTSD symptoms may be associated with more difficulty completing tasks requiring PM. Further examination of PM in PTSD is warranted, especially in regard to its impact on everyday functioning.

  11. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. We determine the length, dimension, and minimum distance of any affine Grassmann code. Moreover, we show...

  12. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed, including an outer BCH code correcting a few bit errors.

  13. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
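
    The LT-code machinery being redesigned here can be sketched in a few lines: encoded symbols are XORs of randomly chosen source symbols, and a peeling decoder resolves degree-one symbols and substitutes them back. The degree distribution and sizes below are crude illustrations, not the paper's feedback-optimized distributions.

```python
# Toy LT (fountain) encoder and peeling decoder. Degree distribution and
# parameters are illustrative simplifications, not the paper's design.

import random

random.seed(7)
K = 16
source = [random.randrange(256) for _ in range(K)]

def lt_symbol():
    """One encoded symbol: (set of source indices, XOR of those source symbols)."""
    # crude soliton-like distribution: mostly small degrees, a few large
    d = random.choice([1, 1, 2, 2, 3, 3, 4, 8])
    idx = set(random.sample(range(K), d))
    val = 0
    for i in idx:
        val ^= source[i]
    return idx, val

def peel(received):
    """Iteratively resolve symbols with one unknown index and substitute back."""
    decoded = {}
    progress = True
    while progress:
        progress = False
        for idx, val in received:
            live = idx - decoded.keys()
            if len(live) == 1:
                (i,) = live
                v = val
                for j in idx & decoded.keys():
                    v ^= decoded[j]          # strip already-known symbols
                decoded[i] = v
                progress = True
    return decoded

received = [lt_symbol() for _ in range(3 * K)]   # generous overhead for the toy
decoded = peel(received)
print(f"recovered {len(decoded)}/{K} symbols")
```

    Feedback, in the paper's sense, lets the receiver tell the encoder which symbols are still missing so later degree choices can target them, shrinking exactly the overhead this toy pays for with its 3x symbol count.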

  14. Generalized concatenated quantum codes

    Science.gov (United States)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng, Bei

    2009-05-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way of constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  15. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  16. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skills

  17. A study of preservice elementary teachers enrolled in a discrepant-event-based physical science class

    Science.gov (United States)

    Lilly, James Edward

    This research evaluated the POWERFUL IDEAS IN PHYSICAL SCIENCE (PIiPS) curriculum model used to develop a physical science course taken by preservice elementary teachers. The focus was on the evaluation of discrepant events used to induce conceptual change in relation to students' ideas concerning heat, temperature, and specific heat. Both quantitative and qualitative methodologies were used for the analysis. Data were collected during the 1998 Fall semester using two classes of physical science for elementary school teachers. The traditionally taught class served as the control group and the class using the PIiPS curriculum model was the experimental group. The PIiPS curriculum model was evaluated quantitatively for its influence on students' attitude toward science, anxiety toward teaching science, self-efficacy toward teaching science, and content knowledge. An analysis of covariance was performed on the quantitative data to test for significant differences between the means of the posttests for the control and experimental groups while controlling for pretest scores. It was found that there were no significant differences between the means of the control and experimental groups with respect to changes in their attitude toward science, anxiety toward teaching science and self-efficacy toward teaching science. A significant difference between the means of the content examination was found (F(1,28) = 14.202, p = 0.001); however, the result is questionable. The heat and energy module was the target for qualitative scrutiny. Coding for discrepant events was adapted from Appleton's 1996 work on students' responses to discrepant event science lessons. 
The following qualitative questions were posed for the investigation: (1) what were the ideas of the preservice elementary students prior to entering the classroom regarding heat and energy, (2) how effective were the discrepant events as presented in the PIiPS heat and energy module, and (3) how much does the "risk taking

  18. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error-correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP) and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  19. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64-bit on Mac, Linux, and Windows.

  20. Monetary Incentive Effects on Event-Based Prospective Memory Three Months after Traumatic Brain Injury in Children

    OpenAIRE

    McCauley, Stephen R.; Pedroza, Claudia; Chapman, Sandra B.; Cook, Lori G.; Vásquez, Ana C.; Levin, Harvey S.

    2011-01-01

    Information regarding the remediation of event-based prospective memory (EB-PM) impairments following pediatric traumatic brain injury (TBI) is scarce. Addressing this, two levels of monetary incentives were used to improve EB-PM in children ages 7 to 16 years with orthopedic injuries (OI, n = 51), or moderate (n = 25), and severe (n = 39) TBI at approximately three months postinjury. The EB-PM task consisted of the child giving a specific verbal response to a verbal cue from the examiner whi...

  1. Algebraic geometric codes

    Science.gov (United States)

    Shahshahani, M.

    1991-01-01

    The performance characteristics are discussed of certain algebraic geometric codes. Algebraic geometric codes have good minimum distance properties. On many channels they outperform other comparable block codes; therefore, one would expect them eventually to replace some of the block codes used in communications systems. It is suggested that it is unlikely that they will become useful substitutes for the Reed-Solomon codes used by the Deep Space Network in the near future. However, they may be applicable to systems where the signal to noise ratio is sufficiently high so that block codes would be more suitable than convolutional or concatenated codes.

  2. Monomial-like codes

    CERN Document Server

    Martinez-Moro, Edgar; Ozbudak, Ferruh; Szabo, Steve

    2010-01-01

    As a generalization of cyclic codes of length p^s over F_{p^a}, we study n-dimensional cyclic codes of length p^{s_1} × ... × p^{s_n} over F_{p^a} generated by a single "monomial". Namely, we study multi-variable cyclic codes generated by a single monomial in F_{p^a}[x_1, ..., x_n] / ⟨x_1^{p^{s_1}} - 1, ..., x_n^{p^{s_n}} - 1⟩. We call such codes monomial-like codes. We show that these codes arise from the product of certain single-variable codes and we determine their minimum Hamming distance. We determine the dual of monomial-like codes, yielding a parity check matrix. We also present an alternative way of constructing a parity check matrix using the Hasse derivative. We study the weight hierarchy of certain monomial-like codes. We simplify an expression that gives us the weight hierarchy of these codes.

  3. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  4. Radiation transport simulation of the Martian GCR surface flux and dose estimation using spherical geometry in PHITS compared to MSL-RAD measurements

    Science.gov (United States)

    Flores-McLaughlin, John

    2017-08-01

    Planetary bodies and spacecraft are predominantly exposed to isotropic radiation environments that are subject to transport and interaction in various material compositions and geometries. Specifically, the Martian surface radiation environment is composed of galactic cosmic radiation, secondary particles produced by their interaction with the Martian atmosphere, albedo particles from the Martian regolith and occasional solar particle events. Despite this complex physical environment with potentially significant locational and geometric dependencies, computational resources often limit radiation environment calculations to a one-dimensional or slab geometry specification. To better account for Martian geometry, spherical volumes with respective Martian material densities are adopted in this model. This physical description is modeled with the PHITS radiation transport code and compared to a portion of measurements from the Radiation Assessment Detector of the Mars Science Laboratory. Particle spectra measured between 15 November 2015 and 15 January 2016 and PHITS model results calculated for this time period are compared. Results indicate good agreement between the simulated and measured dose rates and the proton, neutron, and gamma spectra. This work was originally presented at the 1st Mars Space Radiation Modeling Workshop held in 2016 in Boulder, CO.

  5. Incentive effects on event-based prospective memory performance in children and adolescents with traumatic brain injury.

    Science.gov (United States)

    McCauley, Stephen R; McDaniel, Mark A; Pedroza, Claudia; Chapman, Sandra B; Levin, Harvey S

    2009-03-01

    Prospective memory (PM) is the formation of an intention and remembering to perform this intention at a future time or in response to specific cues. PM tasks are a ubiquitous part of daily life. Currently, there is a paucity of information regarding PM impairments in children with traumatic brain injury (TBI) and less empirical evidence regarding effective remediation strategies to mitigate these impairments. The present study employed two levels of a motivational enhancement (i.e., a monetary incentive) to determine whether event-based PM could be improved in children with severe TBI. In a crossover design, children with orthopedic injuries and mild or severe TBI were compared on two levels of incentive (dollars vs. pennies) given in response to accurate performance. All three groups performed significantly better under the high- versus low-motivation conditions. However, the severe TBI group's high-motivation condition performance remained significantly below the low-motivation condition performance of the orthopedic injury group. PM scores were positively and significantly related to age-at-test, but there were no age-at-injury or time-postinjury effects. Overall, these results suggest that event-based PM can be significantly improved in children with severe TBI.

  6. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-06-01

    Full Text Available Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution on the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution on the order of milliseconds, which is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.
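
The asynchronous, event-based scheme the abstract describes can be illustrated with a minimal timestamped event queue. The network wiring, delay value, and function names below are invented for illustration; this is not the SpiNNaker framework's API, only a sketch of strict time-ordered event delivery at microsecond resolution.

```python
import heapq

def run_events(spikes, synapses, delay_us=100):
    """Deliver timestamped spike events in strict time order.

    spikes   : list of (time_us, neuron_id) input events
    synapses : dict neuron_id -> list of downstream neuron_ids
    Returns the full ordered log of delivered events.
    """
    queue = list(spikes)
    heapq.heapify(queue)              # earliest timestamp first
    delivered = []
    while queue:
        t, src = heapq.heappop(queue)
        delivered.append((t, src))
        for dst in synapses.get(src, []):
            # each spike re-enters the queue after a fixed synaptic delay
            heapq.heappush(queue, (t + delay_us, dst))
    return delivered

log = run_events([(0, 0), (50, 1)], {0: [2]})   # [(0, 0), (50, 1), (100, 2)]
```

The point of the priority queue is that events are always processed in timestamp order regardless of arrival order, which is what a millisecond-tick simulator cannot guarantee for microsecond-scale inputs.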

  7. Influence of lag time on event-based rainfall-runoff modeling using the data driven approach

    Science.gov (United States)

    Talei, Amin; Chua, Lloyd H. C.

    2012-05-01

    This study investigated the effect of lag time on the performance of data-driven models, specifically the adaptive network-based fuzzy inference system (ANFIS), in event-based rainfall-runoff modeling. Rainfall and runoff data for a catchment in Singapore were chosen for this study. For the purpose of this study, lag time was determined from cross-correlation analysis of the rainfall and runoff time series. Rainfall antecedents were the only inputs of the models and direct runoff was the desired output. An ANFIS model with three sub-models defined based on three different ranges of lag times was developed. The performance of the sub-models was compared with previously developed ANFIS models and the physically-based Storm Water Management Model (SWMM). The ANFIS sub-models gave significantly superior results in terms of the RMSE, r2, CE and the prediction of the peak discharge, compared to other ANFIS models where the lag time was not considered. In addition, the ANFIS sub-models provided results that were comparable with results from SWMM. It is thus concluded that the lag time plays an important role in the selection of events for training and testing of data-driven models in event-based rainfall-runoff modeling.
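
The cross-correlation analysis used to determine lag time can be sketched as follows. The toy rainfall and runoff series are fabricated, and this is only the lag-selection step, not the ANFIS model itself.

```python
def best_lag(rain, flow, max_lag):
    """Lag (in time steps) that maximizes the rain-flow cross-correlation."""
    n = len(rain)

    def corr(x, y):
        # Pearson correlation of two equal-length sequences
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0

    # correlate rainfall against runoff shifted back by k steps
    scores = {k: corr(rain[:n - k], flow[k:]) for k in range(max_lag + 1)}
    return max(scores, key=scores.get)

# toy event: runoff is simply rainfall delayed by three time steps
rain = [0, 5, 12, 7, 2, 0, 0, 0, 0, 0]
flow = [0, 0, 0, 0, 5, 12, 7, 2, 0, 0]
lag = best_lag(rain, flow, max_lag=5)   # 3
```

In practice the lag found this way fixes which rainfall antecedents feed the data-driven model for a given event.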

  8. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data.

    Science.gov (United States)

    Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H

    2011-01-07

    No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. 
Our technical advance is the…

  9. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    Directory of Open Access Journals (Sweden)

    Epstein Richard H

    2011-01-01

    Full Text Available Abstract Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1. Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS data for most scenarios (43 of 45. Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of…

  10. TIPONLINE Code Table

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coded items are entered in the tiponline data entry program. The codes and their explanations are necessary in order to use the data.

  11. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  12. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female) performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance, with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  13. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  14. Monetary incentive effects on event-based prospective memory three months after traumatic brain injury in children.

    Science.gov (United States)

    McCauley, Stephen R; Pedroza, Claudia; Chapman, Sandra B; Cook, Lori G; Vásquez, Ana C; Levin, Harvey S

    2011-07-01

    Information regarding the remediation of event-based prospective memory (EB-PM) impairments following pediatric traumatic brain injury (TBI) is scarce. Addressing this, two levels of monetary incentives were used to improve EB-PM in children ages 7 to 16 years with orthopedic injuries (OI, n = 51), or moderate (n = 25) and severe (n = 39) TBI at approximately 3 months postinjury. The EB-PM task consisted of the child giving a specific verbal response to a verbal cue from the examiner while performing a battery of neuropsychological measures (ongoing task). Significant effects were found for age-at-test, motivation condition, period, and group. Within-group analyses indicated that OI and moderate TBI groups performed significantly better under the high- than under the low-incentive condition, but the severe TBI group demonstrated no significant improvement. These results indicate that EB-PM can be significantly improved at 3 months postinjury in children with moderate, but not severe, TBI.

  15. Flexible readout and integration sensor (FRIS): a bio-inspired, system-on-chip, event-based readout architecture

    Science.gov (United States)

    Lin, Joseph H.; Pouliquen, Philippe O.; Andreou, Andreas G.; Goldberg, Arnold C.; Rizk, Charbel G.

    2012-06-01

    We present a bio-inspired system-on-chip focal plane readout architecture which at the system level, relies on an event based sampling scheme where only pixels within a programmable range of photon flux rates are output. At the pixel level, a one bit oversampled analog-to-digital converter together with a decimator allows for the quantization of signals up to 26 bits. Furthermore, digital non-uniformity correction of both gain and offset errors is applied at the pixel level prior to readout. We report test results for a prototype array fabricated in a standard 90nm CMOS process. Tests performed at room and cryogenic temperatures demonstrate the capability to operate at a temporal noise ratio as low as 1.5, an electron well capacity over 100Ge-, and an ADC LSB down to 1e-.
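
The pixel-level one-bit oversampled ADC plus decimator can be illustrated with a first-order sigma-delta sketch. The oversampling ratio and signal values here are arbitrary; the real FRIS circuit details (26-bit quantization, per-pixel gain/offset correction, event-based readout) are not modeled.

```python
def sigma_delta(signal, osr=64):
    """First-order one-bit sigma-delta modulator with a boxcar decimator.

    Each input level in [0, 1] is turned into `osr` one-bit samples
    whose average recovers the level as a multi-bit value.
    """
    integ, bits = 0.0, []
    for x in signal:
        for _ in range(osr):
            bit = 1 if integ >= 0 else 0   # one-bit quantizer
            integ += x - bit               # integrate the quantization error
            bits.append(bit)
    # decimate: collapse each group of osr one-bit samples into one value
    return [sum(bits[i:i + osr]) / osr for i in range(0, len(bits), osr)]

out = sigma_delta([0.25, 0.75])   # recovers [0.25, 0.75]
```

Longer decimation (a higher oversampling ratio) buys more effective bits, which is how a one-bit converter can reach a high-resolution digital output.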

  16. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  17. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from…

  18. The Procions' code; Le code Procions

    Energy Technology Data Exchange (ETDEWEB)

    Deck, D.; Samba, G.

    1994-12-19

    This paper presents a new code to simulate plasmas generated by inertial confinement. This multi-species kinetic code makes no angular approximation for the ions and works in planar and spherical geometry. First, the physical model, based on the Fokker-Planck equation, is presented. Then, the numerical scheme used to solve the Fokker-Planck operator in the Rosenbluth form is introduced. Finally, several numerical tests are presented. (TEC). 17 refs., 27 figs.

  19. Prediction of event-based stormwater runoff quantity and quality by ANNs developed using PMI-based input selection

    Science.gov (United States)

    He, Jianxun; Valeo, Caterina; Chu, Angus; Neumann, Norman F.

    2011-03-01

    Event-based stormwater runoff quantity and quality modeling remains a challenge since the processes of rainfall-induced pollutant discharge are not completely understood. The complexity of physically-based models often limits the use of quality models in practice. Artificial neural networks (ANNs) are a data-driven modeling approach that can avoid the necessity of fully understanding complex physical processes. In this paper, a feed-forward multi-layer perceptron (MLP) network, a popular type of ANN, was applied to predict stormwater runoff quantity and quality including turbidity, specific conductance, water temperature, pH, and dissolved oxygen (DO) in storm events. A recently proposed input selection algorithm based on partial mutual information (PMI), which identifies input variables in a stepwise manner, was employed to select input variable sets for the development of ANNs. The ANNs developed via this approach could produce satisfactory prediction of event-based stormwater runoff quantity and quality. In particular, this approach demonstrated a superior performance over the approach involving ANNs fed by inputs selected using partial correlation and all potential inputs in flow modeling. This result suggests the applicability of PMI in developing ANN models. In addition, the ANN for flow outperformed conventional multiple linear regression (MLR) and multiple nonlinear regression (MNLR) models. For an ANN development of turbidity (multiplied by flow rate) and specific conductance, significant improvement was achieved by including a previous 3-week total rainfall amount into their input variable sets. This antecedent rainfall variable is considered a factor in the availability of land surface pollutants for wash-off. A sensitivity analysis demonstrated the potential role of this rainfall variable in modeling particulate solids and dissolved matter in stormwater runoff.
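
The stepwise input-selection idea can be sketched as follows. This toy version greedily picks the candidate most correlated with the current residual, which is only a linear surrogate: the paper's PMI algorithm uses partial mutual information and so also captures nonlinear dependence. All data and names below are fabricated.

```python
def _corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def _residual(y, x):
    """y with its least-squares fit on x (plus intercept) removed."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxx = sum((a - mx) ** 2 for a in x)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sxx if sxx else 0.0
    return [c - (my + b * (a - mx)) for a, c in zip(x, y)]

def stepwise_select(cands, y, k=2):
    """Pick k inputs one at a time, each time choosing the candidate
    most related to what the already-chosen inputs leave unexplained."""
    chosen, resid = [], list(y)
    while len(chosen) < k:
        best = max(
            cands,
            key=lambda n: -1.0 if n in chosen else abs(_corr(cands[n], resid)),
        )
        chosen.append(best)
        resid = _residual(resid, cands[best])
    return chosen

# y depends on a and b; c is an irrelevant constant input
a = [1, 2, 3, 4, 5, 6]
b = [3, 1, 4, 1, 5, 9]
c = [2, 2, 2, 2, 2, 2]
y = [2 * ai + bi for ai, bi in zip(a, b)]
chosen = stepwise_select({"a": a, "b": b, "c": c}, y, k=2)
```

The stepwise structure (select, strip out the explained part, repeat) is the same shape as the PMI procedure; only the dependence measure differs.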

  20. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  1. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
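
The write-input/run/read-output pattern that DLLExternalCode implements as a DLL can be sketched in Python. The file names, JSON format, and the tiny doubling "external code" below are stand-ins for illustration, not the actual GoldSim interface or its instructions-file syntax.

```python
import json
import os
import subprocess
import sys
import tempfile

def run_external(inputs, write_input, command, read_output):
    """Write an input file for an external application, run it,
    and read its outputs back -- the generic coupling pattern."""
    workdir = tempfile.mkdtemp()
    in_path = os.path.join(workdir, "model.in")
    out_path = os.path.join(workdir, "model.out")
    write_input(in_path, inputs)                       # 1. create input file
    subprocess.run(command(in_path, out_path), check=True)  # 2. run the code
    return read_output(out_path)                       # 3. read results back

def write_json(path, data):
    with open(path, "w") as f:
        json.dump(data, f)

def read_json(path):
    with open(path) as f:
        return json.load(f)

# stand-in "external code": a small script that doubles each input value
script = (
    "import json, sys\n"
    "d = json.load(open(sys.argv[1]))\n"
    "with open(sys.argv[2], 'w') as f:\n"
    "    json.dump({k: 2 * v for k, v in d.items()}, f)\n"
)

result = run_external(
    {"flow": 3.0, "head": 10.0},
    write_input=write_json,
    command=lambda i, o: [sys.executable, "-c", script, i, o],
    read_output=read_json,
)
```

Keeping the three steps behind one function mirrors how the DLL presents a single call to the top-level modeling tool while hiding the external code's file formats.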

  2. Noisy Network Coding

    CERN Document Server

    Lim, Sung Hoon; Gamal, Abbas El; Chung, Sae-Young

    2010-01-01

    A noisy network coding scheme for sending multiple sources over a general noisy network is presented. For multi-source multicast networks, the scheme naturally extends both network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to general discrete memoryless and Gaussian networks. The scheme also recovers as special cases the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves message repetition coding, relay signal compression, and simultaneous decoding. Unlike previous compress-forward schemes, where independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner-Ziv binning as in previous compress-forward sch...

  3. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Full Text Available Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service-Curve Number (SCS-CN method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5, ground-based soil moisture (SSMinsitu, remotely sensed surface soil moisture (SSM, and soil water index (SWI provided by the advanced scatterometer (ASCAT. To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI and ground based soil moisture data (SSMinsitu. In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5 by 60% and 66%, respectively. Moreover, the proposed IWC framework with…
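
The SCS-CN method at the core of this framework computes direct runoff from event rainfall and a curve number. A minimal sketch with the standard handbook constants (S in mm, initial abstraction Ia = 0.2 S); the rainfall depth and CN values below are illustrative, and the paper's IWC proxies would enter through the choice of CN (or S) for the event:

```python
def scs_cn_runoff(p_mm, cn):
    """SCS Curve Number direct runoff (mm) for event rainfall p_mm.

    A wetter initial condition is expressed as a higher CN,
    i.e. less storage and more runoff for the same rainfall.
    """
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0                # all rainfall absorbed before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

q_dry = scs_cn_runoff(50.0, 60)   # drier antecedent condition, ~1.4 mm
q_wet = scs_cn_runoff(50.0, 85)   # wetter antecedent condition, ~19.6 mm
```

The spread between the two results for identical rainfall is exactly why the initial wetness condition dominates event-based runoff prediction error.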

  4. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  5. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded......We welcome Tanya Stivers’s discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when....... Instead we propose that the promise of coding-based research lies in its ability to open up new qualitative questions....

  6. Overview of Code Verification

    Science.gov (United States)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  7. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  8. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  9. Generating code adapted for interlinking legacy scalar code and extended vector code

    Science.gov (United States)

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  10. Decoding of Cyclic Codes,

    Science.gov (United States)

    (*INFORMATION THEORY, *DECODING), (*DATA TRANSMISSION SYSTEMS, DECODING), STATISTICAL ANALYSIS, STOCHASTIC PROCESSES, CODING, WHITE NOISE, NUMBER THEORY, CORRECTIONS, BINARY ARITHMETIC, SHIFT REGISTERS, CONTROL SYSTEMS, USSR

  11. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  12. Diameter Perfect Lee Codes

    CERN Document Server

    Horak, Peter

    2011-01-01

    Lee codes have been intensively studied for more than 40 years. Interest in these codes has been triggered by the Golomb-Welch conjecture on the existence of perfect error-correcting Lee codes. In this paper we deal with the existence and enumeration of diameter perfect Lee codes. As main results we determine all q for which there exists a linear diameter-4 perfect Lee code of word length n over Z_{q}, and prove that for each n ≥ 3 there are uncountably many diameter-4 perfect Lee codes of word length n over Z. This is in strict contrast with perfect error-correcting Lee codes of word length n over Z, as there is a unique such code for n=3, and it is conjectured that this is always the case when 2n+1 is a prime. Diameter perfect Lee codes will be constructed by an algebraic construction that is based on a group homomorphism. This will allow us to design an efficient algorithm for their decoding.
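
For reference, the Lee metric underlying these codes measures distance coordinate-wise, taking the shorter way around the cycle Z_q in each position. A minimal sketch with made-up words (the paper's homomorphism-based construction itself is not reproduced here):

```python
def lee_distance(u, v, q):
    """Lee distance between words u and v over Z_q:
    each coordinate contributes min(|a-b|, q-|a-b|)."""
    return sum(min((a - b) % q, (b - a) % q) for a, b in zip(u, v))

d = lee_distance([0, 1, 5], [5, 1, 0], 7)   # 2 + 0 + 2 = 4
```

A diameter-d perfect code then tiles the space by sets of pairwise Lee distance at most d, rather than by Lee balls as in the error-correcting case.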

  13. Expander chunked codes

    Science.gov (United States)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91% to 97% of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
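
The within-chunk RLNC step can be sketched over GF(2), where a coded packet is simply the XOR of a random subset of the chunk's input packets. The chunk contents and sizes below are illustrative; practical chunked codes typically use larger fields such as GF(2^8), and the expander-graph chunk construction of the paper is not modeled.

```python
import random

def encode_chunk(chunk, n_coded, seed=0):
    """Random linear coding within one chunk over GF(2).

    chunk   : list of equal-length byte strings (the input packets)
    n_coded : number of coded packets to emit
    Returns (coefficient_vector, coded_packet) pairs; the coefficients
    travel with the packet so receivers can solve for the inputs.
    """
    rng = random.Random(seed)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in chunk]   # GF(2) coefficients
        pkt = bytes(len(chunk[0]))                    # all-zero packet
        for c, p in zip(coeffs, chunk):
            if c:
                # addition in GF(2) is bytewise XOR
                pkt = bytes(a ^ b for a, b in zip(pkt, p))
        coded.append((coeffs, pkt))
    return coded

chunk = [bytes([1, 0]), bytes([0, 2]), bytes([3, 3])]
coded = encode_chunk(chunk, n_coded=4, seed=1)
```

Because coding is confined to one small chunk, both encoding and Gaussian-elimination decoding stay cheap, which is the computational advantage chunked codes trade rate for.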

  14. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes. QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  15. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which is dependent not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and then the estimator gains are obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof is given to show that the designed triggering condition is free of the Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator. - Highlights: • An event-triggered estimator is designed for complex networks with time-varying delays. • A novel event generator function is proposed to reduce the communication burden. • The comparison principle is utilized to derive the sufficient conditions. • The designed triggering condition is shown to be free of the Zeno behavior.
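The event-generator idea above (transmit only when a function of the measurement output and a predefined positive constant exceeds a threshold) can be sketched as follows. The scalar signal, the specific triggering condition, and the values of sigma and delta are illustrative assumptions, not the paper's design:

```python
import math

def event_triggered_stream(measurements, sigma=0.5, delta=0.05):
    """Forward a measurement to the estimator only when the event generator
    fires; otherwise the estimator reuses the last transmitted value.
    Generator (illustrative): (y(k) - y_last)^2 > sigma * y(k)^2 + delta,
    i.e. a condition depending on the output and a positive constant."""
    sent, held = [], []
    last = None
    for y in measurements:
        err = (y - last) ** 2 if last is not None else float("inf")
        if err > sigma * y * y + delta:
            last = y          # event fires: transmit and update
            sent.append(y)
        held.append(last)     # value the estimator actually sees
    return sent, held

# A slowly varying signal triggers far fewer transmissions than samples.
ys = [math.sin(0.05 * k) for k in range(200)]
sent, held = event_triggered_stream(ys)
```

By construction, whenever no event fires the held value stays within the triggering bound of the true measurement, which is the property the boundedness analysis builds on.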

  16. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-01

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

  18. On $\sigma$-LCD codes

    OpenAIRE

    Carlet, Claude; Mesnager, Sihem; Tang, Chunming; Qi, Yanfeng

    2017-01-01

Linear complementary pairs (LCP) of codes play an important role in armoring implementations against side-channel attacks and fault injection attacks. One of the most common ways to construct LCP of codes is to use Euclidean linear complementary dual (LCD) codes. In this paper, we first introduce the concept of linear codes with $\sigma$ complementary dual ($\sigma$-LCD), which includes known Euclidean LCD codes, Hermitian LCD codes, and Galois LCD codes. As Euclidean LCD codes, $\sigma$-LCD ...
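For the Euclidean special case mentioned above, the LCD property has a classical mechanical test (Massey's criterion): a code with generator matrix G is Euclidean LCD if and only if G Gᵀ is nonsingular over the field. A small GF(2) sketch, with example matrices that are ours:

```python
def gf2_invertible(M):
    """Gaussian-elimination test for invertibility of a square 0/1 matrix over GF(2)."""
    M = [row[:] for row in M]
    n = len(M)
    for col in range(n):
        piv = next((i for i in range(col, n) if M[i][col]), None)
        if piv is None:
            return False          # no pivot in this column: singular
        M[col], M[piv] = M[piv], M[col]
        for i in range(n):
            if i != col and M[i][col]:
                M[i] = [a ^ b for a, b in zip(M[i], M[col])]
    return True

def is_euclidean_lcd(G):
    """Massey's criterion: the binary code generated by G is Euclidean LCD
    iff G G^T is nonsingular over GF(2)."""
    GGt = [[sum(x * y for x, y in zip(r1, r2)) % 2 for r2 in G] for r1 in G]
    return gf2_invertible(GGt)

G_lcd = [[1, 1, 0], [0, 1, 1]]   # G G^T = [[0, 1], [1, 0]] -> invertible -> LCD
G_not = [[1, 0, 0], [0, 1, 1]]   # G G^T = [[1, 0], [0, 0]] -> singular -> not LCD
```

Invertibility of G Gᵀ is exactly the condition that the code meets its dual only in the zero codeword, i.e. C ∩ C⊥ = {0}.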

  19. Sexual frequency and planning among at-risk men who have sex with men in the United States: implications for event-based intermittent pre-exposure prophylaxis.

    Science.gov (United States)

    Volk, Jonathan E; Liu, Albert; Vittinghoff, Eric; Irvin, Risha; Kroboth, Elizabeth; Krakower, Douglas; Mimiaga, Matthew J; Mayer, Kenneth H; Sullivan, Patrick S; Buchbinder, Susan P

    2012-09-01

    Intermittent dosing of pre-exposure prophylaxis (iPrEP) has potential to decrease costs, improve adherence, and minimize toxicity. Practical event-based dosing of iPrEP requires men who have sex with men (MSM) to be sexually active on fewer than 3 days each week and plan for sexual activity. MSM who may be most suitable for event-based dosing were older, more educated, more frequently used sexual networking websites, and more often reported that their last sexual encounter was not with a committed partner. A substantial proportion of these MSM endorse high-risk sexual activity, and event-based iPrEP may best target this population.

  20. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  1. Error Correcting Codes

    Indian Academy of Sciences (India)

    be fixed to define codes over such domains). New decoding schemes that take advantage of such connections can be devised. These may soon show up in a technique called code division multiple access (CDMA) which is proposed as a basis for digital cellular communication. CDMA provides a facility for many users to ...

  2. Codes of Conduct

    Science.gov (United States)

    Million, June

    2004-01-01

Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling codes of conduct in students based on work quality, respect, safety, and courtesy. She suggests that communicating the code…

  3. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March 1997 pp 33-47. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047 ...

  4. Code Generation = A* + BURS

    NARCIS (Netherlands)

    Nymeyer, Albert; Katoen, Joost P.; Westra, Ymte; Alblas, H.; Gyimóthy, Tibor

    1996-01-01

    A system called BURS that is based on term rewrite systems and a search algorithm A* are combined to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm uses a cost heuristic that is derived

  5. Dress Codes for Teachers?

    Science.gov (United States)

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  6. Informal control code logic

    NARCIS (Netherlands)

    Bergstra, J.A.

    2010-01-01

    General definitions as well as rules of reasoning regarding control code production, distribution, deployment, and usage are described. The role of testing, trust, confidence and risk analysis is considered. A rationale for control code testing is sought and found for the case of safety critical

  7. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.

  8. Nuremberg code turns 60

    OpenAIRE

    Thieren, Michel; Mauron, Alexandre

    2007-01-01

    This month marks sixty years since the Nuremberg code – the basic text of modern medical ethics – was issued. The principles in this code were articulated in the context of the Nuremberg trials in 1947. We would like to use this anniversary to examine its ability to address the ethical challenges of our time.

  9. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 1. Error Correcting Codes The Hamming Codes. Priti Shankar. Series Article Volume 2 Issue 1 January ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  10. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study …

  11. Quantum Synchronizable Codes From Quadratic Residue Codes and Their Supercodes

    OpenAIRE

    Xie, Yixuan; Yuan, Jinhong; Fujiwara, Yuichiro

    2014-01-01

    Quantum synchronizable codes are quantum error-correcting codes designed to correct the effects of both quantum noise and block synchronization errors. While it is known that quantum synchronizable codes can be constructed from cyclic codes that satisfy special properties, only a few classes of cyclic codes have been proved to give promising quantum synchronizable codes. In this paper, using quadratic residue codes and their supercodes, we give a simple construction for quantum synchronizable...

  12. Benefits and limitations of data assimilation for discharge forecasting using an event-based rainfall–runoff model

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2013-03-01

Mediterranean catchments in southern France are threatened by potentially devastating fast floods which are difficult to anticipate. In order to improve the skill of rainfall-runoff models in predicting such flash floods, hydrologists use data assimilation techniques to provide real-time updates of the model using observational data. This approach seeks to reduce the uncertainties present in different components of the hydrological model (forcing, parameters or state variables) in order to minimize the error in simulated discharges. This article presents a data assimilation procedure, the best linear unbiased estimator (BLUE), used with the goal of improving the peak discharge predictions generated by an event-based hydrological model, the Soil Conservation Service lag and route (SCS-LR) model. For a given prediction date, selected model inputs are corrected by assimilating discharge data observed at the basin outlet. This study is conducted on the Lez Mediterranean basin in southern France. The key objectives of this article are (i) to select the parameter(s) which allow for the most efficient and reliable correction of the simulated discharges, (ii) to demonstrate the impact of the correction of the initial condition upon simulated discharges, and (iii) to identify and understand conditions in which this technique fails to improve the forecast skill. The correction of the initial moisture deficit of the soil reservoir proves to be the most efficient control parameter for adjusting the peak discharge. Using data assimilation, this correction leads to an average of 12% improvement in the flood peak magnitude forecast in 75% of cases. The investigation of the other 25% of cases points out a number of precautions for the appropriate use of this data assimilation procedure.
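The BLUE correction step itself is compact. Below is a minimal NumPy sketch of the analysis equation x_a = x_b + K(y - H x_b) with gain K = B Hᵀ(H B Hᵀ + R)⁻¹; all numbers are made-up scalars for illustration, not values from the Lez study:

```python
import numpy as np

def blue_update(x_b, B, y, H, R):
    """BLUE analysis step:
    x_a = x_b + K (y - H x_b),  K = B H^T (H B H^T + R)^(-1)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b), K

# Illustrative scalar example (hypothetical numbers):
x_b = np.array([100.0])   # background initial moisture deficit (mm)
B = np.array([[400.0]])   # background error covariance
H = np.array([[-0.5]])    # linearized map from deficit to simulated discharge error
R = np.array([[25.0]])    # observation error covariance
y = np.array([-40.0])     # observed quantity at the basin outlet
x_a, K = blue_update(x_b, B, y, H, R)
```

The gain K weights the innovation y - H x_b by the relative confidence in the background versus the observation, which is how the assimilation "corrects" the initial moisture deficit.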

  13. Canonical event based Bayesian model averaging for post-processing of multi-model ensemble precipitation forecasts

    Science.gov (United States)

    Li, Wentao; Duan, Qingyun

    2017-04-01

Precipitation forecasts from numerical weather models usually contain biases in terms of mean and spread, and need to be post-processed before applying them as input to hydrological models. The Bayesian Model Averaging (BMA) method is a widely used method for post-processing forecasts from multiple models. Traditionally, BMA is applied directly to time series of forecasts for a specific lead time. In this work, we propose to apply BMA based on "canonical events", which are precipitation events with specific lead times and durations, to fully extract information from raw forecasts. For example, canonical events can be designed as the daily precipitation for day 1 to day 5, and the aggregation or average of total precipitation from day 6 to day 10, because forecasts beyond 5 days still have some skill but are not as reliable as the first five days. Moreover, BMA parameters are traditionally calibrated using a moving window containing the forecast-observation pairs before a given forecast date, which cannot ensure similar meteorological conditions when a long training period is applied. In this work, the training dataset is chosen from the historical hindcast archive of forecast-observation pairs in a pre-specified time window surrounding a given forecast date. After all canonical events of different lead times and durations are calibrated for BMA models, ensemble members are generated from the calibrated probability forecasts using the Schaake shuffle to preserve the temporal dependency of forecasts for different lead times. This canonical event based BMA makes use of forecasts at different lead times more adequately and can generate continuous calibrated forecast time series for further application in hydrological modeling.
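The two preprocessing ingredients described above (canonical-event construction and a seasonal training window) can be sketched in a few lines; the 10-day layout and the idea of a window around the forecast date follow the abstract's example, while the function names and window size are ours:

```python
def canonical_events(daily_forecast):
    """Map a 10-day daily forecast into canonical events: days 1-5 are
    kept as daily values, days 6-10 are aggregated into one total
    (the example event design described in the abstract)."""
    assert len(daily_forecast) == 10
    return list(daily_forecast[:5]) + [sum(daily_forecast[5:])]

def training_dates(hindcast_doys, target_doy, window=15):
    """Select hindcast dates whose day-of-year lies within +/- `window`
    days of the forecast date (circular over a 365-day year), so the
    training sample shares a similar meteorological season."""
    def circ_dist(a, b):
        return min(abs(a - b), 365 - abs(a - b))
    return [d for d in hindcast_doys if circ_dist(d, target_doy) <= window]

fcst = [2.0, 0.0, 5.5, 1.2, 0.0, 3.0, 0.4, 0.0, 1.1, 2.5]
events = canonical_events(fcst)                     # 6 canonical events
train = training_dates(list(range(1, 366)), 10)     # seasonal window around day 10
```

After such calibration per event, the Schaake shuffle would reorder the generated ensemble members to restore temporal dependence across lead times; that step is not shown here.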

  14. Pyramid image codes

    Science.gov (United States)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
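A generic pyramid decomposition can be sketched in a few lines. Note this is plain 2x2 block averaging on a square lattice, for illustration of the layered, coarse-to-fine structure only; it is not the HOP transform, which uses a hexagonal lattice and seven oriented kernels:

```python
import numpy as np

def lowpass_pyramid(img, levels):
    """Build a simple image pyramid: each layer halves resolution by
    2x2 block averaging, so coarser layers hold fewer, larger-scale
    features (fewer samples devoted to large features, as in the text)."""
    layers = [img]
    for _ in range(levels - 1):
        a = layers[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2   # crop to even dims
        a = a[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        layers.append(a)
    return layers

img = np.arange(64.0).reshape(8, 8)
pyr = lowpass_pyramid(img, 3)     # shapes (8, 8), (4, 4), (2, 2)
```

Such a structure gives the scale-separated representation that makes coarse-to-fine search and progressive transmission natural.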

  15. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  16. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  17. Quantum coding theorems

    Science.gov (United States)

    Holevo, A. S.

    1998-12-01

Contents: I. Introduction. II. General considerations: §1. Quantum communication channel; §2. Entropy bound and channel capacity; §3. Formulation of the quantum coding theorem, weak converse. III. Proof of the direct statement of the coding theorem: §1. Channels with pure signal states; §2. Reliability function; §3. Quantum binary channel; §4. Case of arbitrary states with bounded entropy. IV. c-q channels with input constraints: §1. Coding theorem; §2. Gauss channel with one degree of freedom; §3. Classical signal on quantum background noise. Bibliography.

  18. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    Science.gov (United States)

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  19. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    Science.gov (United States)

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  20. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.

  1. VT ZIP Code Areas

    Data.gov (United States)

Vermont Center for Geographic Information — A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  2. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  3. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  4. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as waste. However, if the common receiver (e.g., base station) is capable of storing the collision slots and using them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access, in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on a graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded …
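On the graph representation mentioned in this abstract, the successive interference cancellation recovery process reduces to iterative "peeling" of singleton slots, exactly as in erasure decoding of codes on graphs. A hedged sketch (user count, repetition degree, and all names are illustrative choices, not from the paper):

```python
import random

def sic_decode(slots):
    """Iterative interference cancellation as peeling: repeatedly find a
    slot with exactly one not-yet-decoded user, decode that user, and
    cancel its replicas from every slot it transmitted in."""
    decoded = set()
    progress = True
    while progress:
        progress = False
        for users in slots:
            pending = users - decoded       # interference left in this slot
            if len(pending) == 1:           # singleton: decodable
                decoded |= pending
                progress = True
    return decoded

rng = random.Random(1)
num_users, num_slots, repeats = 20, 40, 2
slots = [set() for _ in range(num_slots)]
for u in range(num_users):                  # each user sends `repeats` replicas
    for s in rng.sample(range(num_slots), repeats):
        slots[s].add(u)
resolved = sic_decode(slots)
```

Under a collision model only the initially singleton slots would be useful; the peeling loop shows how storing collided slots and cancelling decoded replicas recovers additional users from them.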

  5. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package (a) can be built and tested as an entity and (b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
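Levelization as defined above is a topological property of the "uses" graph and is easy to check mechanically. A sketch using Python's standard graphlib; the package names are hypothetical, not the actual EAP packages:

```python
from graphlib import TopologicalSorter, CycleError

# package -> set of lower-level packages it uses (hypothetical names)
uses = {
    "app":     {"physics", "mesh"},
    "physics": {"mesh", "utils"},
    "mesh":    {"utils"},
    "utils":   set(),
}

def levelize(uses):
    """Assign each package a level: 1 + the maximum level among its
    dependencies. Raises CycleError if the 'uses' relation is not a
    directed acyclic graph, i.e. if the set cannot be levelized."""
    order = TopologicalSorter(uses).static_order()   # dependencies first
    level = {}
    for pkg in order:
        level[pkg] = 1 + max((level[d] for d in uses[pkg]), default=0)
    return level
```

A passing run of `levelize` is exactly the "directed, acyclic, uses only lower levels" condition; the per-package levels also show which packages can be built and tested in parallel.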

  6. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal-axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive-train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth-averaged bin plots. It is concluded that agreement between the FAST Code and test results is good.

  7. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each … as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity.

  8. Code de conduite

    International Development Research Centre (IDRC) Digital Library (Canada)

    irocca

his or her point of view, in a spirit of openness and respect. OUR CODE OF CONDUCT. IDRC is committed to conduct that meets the highest ethical standards in all its activities. The Code of Conduct reflects our mission, our employment philosophy, and the outcome of discussions …

  9. Open Coding Descriptions

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon PhD

    2016-12-01

Open coding is a big source of descriptions that must be managed and controlled when doing GT research. The goal of generating a GT is to generate an emergent set of concepts and their properties that fit and work with relevancy, to be integrated into a theory. To achieve this goal, the researcher begins his research with open coding, that is, coding all his data in every possible way. The consequence of this open coding is a multitude of descriptions for possible concepts that often do not fit in the emerging theory. In this case the researcher ends up with many irrelevant descriptions for concepts that do not apply. Dwelling on descriptions for inapplicable concepts ruins the GT as it starts; it is hard to stop, confusion easily sets in, and switching the study to a QDA is a simple rescue. Rigorous focusing on emerging concepts is vital before being lost in open coding descriptions, no matter how interesting a description may become. Once a core is possible, selective coding can start, which will help control against being lost in multiple descriptions.

  10. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However it remains challenging to measure and identify the severity information of ADEs. The objective of the study is to develop and evaluate a semantic web based approach for building a knowledge base of severe ADEs based on the FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into the CTCAE in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4, and 2,196 pairs in Grade 3.

  11. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
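For small parameters, the minimum Hamming distance of a linear code can be checked by brute force over $GF(3)$. A sketch, where the example generator matrix is ours for illustration, not one of the 22 new codes:

```python
from itertools import product

def min_distance(G, q=3):
    """Brute-force minimum Hamming distance of the linear code generated
    by G over GF(q): the smallest weight of a nonzero codeword."""
    k, n = len(G), len(G[0])
    best = n
    for msg in product(range(q), repeat=k):
        if not any(msg):
            continue                          # skip the zero codeword
        cw = [sum(m * g for m, g in zip(msg, col)) % q for col in zip(*G)]
        best = min(best, sum(1 for c in cw if c))
    return best

# A small [4,2] ternary example; it meets the Singleton bound
# d = n - k + 1 = 3, so it is optimal (MDS) for its parameters.
G = [[1, 0, 1, 1],
     [0, 1, 1, 2]]
d = min_distance(G)   # 3
```

This exhaustive check costs q^k codeword evaluations, which is why constructing (rather than searching for) codes with good minimum distance matters as k grows.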

  12. Code blue: seizures.

    Science.gov (United States)

    Hoerth, Matthew T; Drazkowski, Joseph F; Noe, Katherine H; Sirven, Joseph I

    2011-06-01

Eyewitnesses frequently perceive seizures as life threatening. If an event occurs on the hospital premises, a "code blue" can be called, which consumes considerable resources. The purpose of this study was to determine the frequency and characteristics of code blue calls for seizures and seizure mimickers. A retrospective review of a code blue log from 2001 through 2008 identified 50 seizure-like events, representing 5.3% of all codes. Twenty-eight (56%) occurred in inpatients; the other 22 (44%) events involved visitors or employees on the hospital premises. Eighty-six percent of the events were epileptic seizures. Seizure mimickers, particularly psychogenic nonepileptic seizures, were more common in the nonhospitalized group. Only five (17.9%) inpatients had a known diagnosis of epilepsy, compared with 17 (77.3%) of the nonhospitalized patients. This retrospective survey provides insights into how code blues are called on hospitalized versus nonhospitalized patients for seizure-like events.

  13. Error coding simulations

    Science.gov (United States)

    Noble, Viveca K.

    1993-11-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
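
The CRC-16 error-detection step mentioned above can be sketched as follows. This uses the CCITT polynomial 0x1021 with XMODEM-style parameters (zero initial value, no bit reflection), which is an assumption here rather than the exact CCSDS configuration:

```python
# Sketch of 16-bit CRC error detection with the CCITT generator
# polynomial x^16 + x^12 + x^5 + 1 (0x1021). The init value and bit
# ordering follow the common XMODEM variant, an assumption here.

def crc16_ccitt(data: bytes, crc: int = 0x0000) -> int:
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            # Shift left; on carry-out of bit 15, XOR in the polynomial.
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

# A single corrupted byte changes the checksum, so the receiver detects it.
msg = b"123456789"
assert crc16_ccitt(msg) != crc16_ccitt(b"123456788")
print(hex(crc16_ccitt(msg)))  # → 0x31c3 (the standard XMODEM check value)
```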

  14. Twisted Reed-Solomon Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Puchinger, Sven; Rosenkilde ne Nielsen, Johan

    2017-01-01

    We present a new general construction of MDS codes over a finite field Fq. We describe two explicit subclasses which contain new MDS codes of length at least q/2 for all values of q ≥ 11. Moreover, we show that most of the new codes are not equivalent to a Reed-Solomon code.
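
To make the MDS property concrete, the sketch below builds a classical (untwisted) Reed-Solomon code over a small prime field and verifies by brute force that it attains the Singleton bound d = n - k + 1. The parameters are illustrative and unrelated to the paper's constructions:

```python
from itertools import product

# Hedged sketch: a classical Reed-Solomon code over GF(13), included only
# to make the MDS property concrete. A codeword is the evaluation of a
# degree < k message polynomial at n distinct field points; since such a
# polynomial has at most k-1 roots, every nonzero codeword has weight at
# least n-k+1, which is exactly the Singleton bound.
p, n, k = 13, 6, 3               # illustrative parameters
points = list(range(1, n + 1))   # evaluation points alpha_i = 1..6

def encode(msg):
    """Evaluate the message polynomial sum(c_i x^i) at the points, mod p."""
    return [sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p
            for x in points]

# Minimum weight over all nonzero codewords = minimum distance.
d = min(sum(c != 0 for c in encode(m))
        for m in product(range(p), repeat=k) if any(m))
print(d)  # → 4, matching the Singleton bound n - k + 1
```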

  15. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks.

  16. Manufacturer Identification Code (MID) - ACE

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  17. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  18. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource/Information: What information is available, and how can it be useful? Resource/Platform: What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties, and the transmission capacity of devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  19. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  20. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  1. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    Full Text Available The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated and harmonized with the Medical Association of Slovenia and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.

  2. Physical Layer Network Coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Yomo, Hironori; Popovski, Petar

    2013-01-01

    Physical layer network coding (PLNC) has the potential to improve throughput of multi-hop networks. However, most of the works are focused on the simple, three-node model with two-way relaying, not taking into account the fact that there can be other neighboring nodes that can cause/receive inter...

  3. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication-including voice-will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networksOffering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  4. Securing mobile code.

    Energy Technology Data Exchange (ETDEWEB)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development, with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called ...

  5. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    Science.gov (United States)

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance. The formal modeling results demonstrated that adults differed significantly from the 7-year-olds and 10-year-olds on both the prospective component and the retrospective component of the task. The 7-year-olds and 10-year-olds differed only in the ability to recognize prospective memory target events. The prospective memory task imposed a cost to ongoing activities in all three age groups. PMID:20053020

  6. Time-based and event-based prospective memory in autism spectrum disorder: the roles of executive function and theory of mind, and time-estimation.

    Science.gov (United States)

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-07-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but undiminished event-based, prospective memory among children with ASD. In the ASD group, time-based prospective memory performance was associated significantly with diminished theory of mind, but not with diminished cognitive flexibility. There was no evidence that time-estimation ability contributed to time-based prospective memory impairment in ASD.

  7. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On the 100 labels, the Event-based Text-mining of Health Electronic Records system achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated the Event-based Text-mining of Health Electronic Records system's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
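
The precision and recall figures reported above follow the usual set-based definitions. The term sets below are invented for illustration:

```python
# Minimal sketch of the evaluation metrics used above: precision and
# recall computed from extracted vs. reference Adverse Event term sets.
# The terms themselves are invented for illustration.
extracted = {"nausea", "headache", "rash", "dizziness"}   # system output
reference = {"nausea", "headache", "rash", "vomiting"}    # gold standard

true_pos = len(extracted & reference)          # terms found correctly
precision = true_pos / len(extracted)          # fraction of output that is correct
recall = true_pos / len(reference)             # fraction of gold terms recovered

print(precision, recall)  # → 0.75 0.75
```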

  8. Ptolemy Coding Style

    Science.gov (United States)

    2014-09-05

    because this would combine Ptolemy II with the GPL'd code and thus encumber Ptolemy II with the GPL. Another GNU license is the GNU Library General... permission on the source.eecs.berkeley.edu repositories, then use your local repository. bash-3.2$ svn co svn+ssh://source.eecs.berkeley.edu/chess

  9. Error Correcting Codes

    Indian Academy of Sciences (India)

    The images, which came from Galileo's flyby of the moon on June 26-27, 1996, are reported to be 20 times better than those obtained from the Voyager. Priti Shankar .... a systematic way. Thus was born a brand new field, which has since been ..... mathematically oriented, compact book on coding, containing a few topics not ...

  10. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  11. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  12. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  13. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 10. Error Correcting Codes How Numbers Protect Themselves. Priti Shankar. Series Article Volume 1 ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  14. Video Coding for ESL.

    Science.gov (United States)

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  15. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...

  16. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Computer Science and Automation, IISc. Their research addresses various aspects of algebraic and combinatorial coding theory. Low Density Parity Check ..... lustrating how the variable Xd is decoded. As mentioned earlier, this algorithm runs iteratively. To start with, in the first iteration, only bits in the first level of the ...

  17. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives...

  18. Student Dress Codes.

    Science.gov (United States)

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  19. Dress Codes and Uniforms.

    Science.gov (United States)

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  20. Dress Codes. Legal Brief.

    Science.gov (United States)

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  1. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    titled 'A Mathematical Theory of Communication' in the Bell Systems Technical Journal in 1948. The paper set up a ... 'existential' result but not a 'constructive' one. The construction of such a code evolved from the work ... several papers on hyperbolic geometry. He shifted to the Department of Pure Mathematics at Calcutta.

  2. Cracking the Codes

    Science.gov (United States)

    Heathcote, Dorothy

    1978-01-01

    Prescribes an attitude that teachers can take to help students "crack the code" of a dramatic work, combining a flexible teaching strategy, the suspension of beliefs or preconceived notions about the work, focusing on the drama's text, and choosing a reading strategy appropriate to the dramatic work. (RL)

  3. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  4. Coded SQUID arrays

    NARCIS (Netherlands)

    Podt, M.; Weenink, J.; Weenink, J.; Flokstra, Jakob; Rogalla, Horst

    2001-01-01

    We report on a superconducting quantum interference device (SQUID) system to read out large arrays of cryogenic detectors. In order to reduce the number of SQUIDs required for an array of these detectors, we used code-division multiplexing. This simplifies the electronics because of a significantly

  5. Reed-Solomon convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Schmale, W

    2005-01-01

    In this paper we will introduce a specific class of cyclic convolutional codes. The construction is based on Reed-Solomon block codes. The algebraic parameters as well as the distance of these codes are determined. This shows that some of these codes are optimal or near optimal.

  6. Causation, constructors and codes.

    Science.gov (United States)

    Hofmeyr, Jan-Hendrik S

    2017-09-13

    Relational biology relies heavily on the enriched understanding of causal entailment that Robert Rosen's formalisation of Aristotle's four causes has made possible, although to date efficient causes and the rehabilitation of final cause have been its main focus. Formal cause has been paid rather scant attention, but, as this paper demonstrates, is crucial to our understanding of many types of processes, not necessarily biological. The graph-theoretic relational diagram of a mapping has played a key role in relational biology, and the first part of the paper is devoted to developing an explicit representation of formal cause in the diagram and how it acts in combination with efficient cause to form a mapping. I then use these representations to show how Von Neumann's universal constructor can be cast into a relational diagram in a way that avoids the logical paradox that Rosen detected in his own representation of the constructor in terms of sets and mappings. One aspect that was absent from both Von Neumann's and Rosen's treatments was the necessity of a code to translate the description (the formal cause) of the automaton to be constructed into the construction process itself. A formal definition of codes in general, and organic codes in particular, allows the relational diagram to be extended so as to capture this translation of formal cause into process. The extended relational diagram is used to exemplify causal entailment in a diverse range of processes, such as enzyme action, construction of automata, communication through the Morse code, and ribosomal polypeptide synthesis through the genetic code. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. Also we use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.
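
The simplex property mentioned above can be checked directly for small parameters: the binary simplex code of dimension k has length 2^k - 1, and every nonzero codeword has weight 2^(k-1). The sketch below verifies this for k = 3 (the idempotent machinery of the paper is not reproduced here):

```python
from itertools import product

# Sketch illustrating the binary simplex code: its generator matrix has
# all nonzero binary k-vectors as columns, giving length n = 2^k - 1,
# and every nonzero codeword turns out to have the same weight 2^(k-1).
k = 3
n = 2**k - 1
cols = [v for v in product((0, 1), repeat=k) if any(v)]  # generator columns

weights = set()
for msg in product((0, 1), repeat=k):
    if not any(msg):
        continue                                          # skip zero codeword
    cw = [sum(m * c for m, c in zip(msg, col)) % 2 for col in cols]
    weights.add(sum(cw))

print(n, weights)  # → 7 {4}: all nonzero codewords have weight 2^(k-1) = 4
```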

  8. Coding Theory and Projective Spaces

    Science.gov (United States)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.
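
The subspace distance underlying codes in the projective space, d(U, V) = dim U + dim V - 2 dim(U ∩ V), can be computed via Gaussian elimination over GF(2), using dim(U ∩ V) = dim U + dim V - dim(U + V). The bitmask representation and the example subspaces below are illustrative choices, not taken from the work itself:

```python
# Hedged sketch: the subspace distance for codes in the projective space
# over GF(2). Each vector of F_2^n is a bitmask int; a subspace is given
# by a spanning set of such vectors.

def rank_gf2(vectors):
    """Rank of a set of GF(2) row vectors, by Gaussian elimination."""
    pivots = {}                       # leading-bit position -> basis vector
    for v in vectors:
        while v:
            p = v.bit_length() - 1
            if p not in pivots:
                pivots[p] = v         # new pivot: v is independent
                break
            v ^= pivots[p]            # eliminate the leading bit
    return len(pivots)

def subspace_distance(U, V):
    dU, dV = rank_gf2(U), rank_gf2(V)
    d_sum = rank_gf2(U + V)           # dim(U + V) from the union of spans
    d_int = dU + dV - d_sum           # dim(U ∩ V) by the dimension formula
    return dU + dV - 2 * d_int

# Two 2-dimensional subspaces of F_2^4 meeting in a line have distance 2.
U = [0b1000, 0b0100]                  # span{e1, e2}
V = [0b1000, 0b0010]                  # span{e1, e3}
print(subspace_distance(U, V))  # → 2
```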

  9. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  10. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, how to determine the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  11. Coding vs non-coding: Translatability of short ORFs found in putative non-coding transcripts.

    Science.gov (United States)

    Kageyama, Yuji; Kondo, Takefumi; Hashimoto, Yoshiko

    2011-11-01

    Genome analysis has identified a number of putative non-protein-coding transcripts that do not contain ORFs longer than 100 codons. Although evidence strongly suggests that non-coding RNAs are important in a variety of biological phenomena, the discovery of small peptide-coding mRNAs confirms that some transcripts that have been assumed to be non-coding actually have coding potential. Their abundance and importance in biological phenomena makes the sorting of non-coding RNAs from small peptide-coding mRNAs a key issue in functional genomics. However, validating the coding potential of small peptide-coding RNAs is complicated, because their ORF sequences are usually too short for computational analysis. In this review, we discuss computational and experimental methods for validating the translatability of these non-coding RNAs. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
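
The 100-codon criterion discussed above can be expressed as a simple ORF scan. The sequence, function name, and return format below are invented for illustration:

```python
# Illustrative sketch of the "short ORF" criterion: scan a transcript for
# open reading frames (ATG ... in-frame stop) and report those at or below
# a codon-length cutoff. The example sequence is invented.
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, max_codons=100):
    """Return (start_index, codon_count) for ORFs up to max_codons long."""
    orfs = []
    for i in range(len(seq) - 2):
        if seq[i:i+3] != "ATG":
            continue
        for j in range(i + 3, len(seq) - 2, 3):   # walk in-frame triplets
            if seq[j:j+3] in STOPS:
                codons = (j - i) // 3             # codons before the stop
                if codons <= max_codons:
                    orfs.append((i, codons))
                break                             # stop at first stop codon
    return orfs

# One complete short ORF: ATG at index 2, two codons before the TGA stop.
print(find_orfs("CCATGAAATGATAG"))  # → [(2, 2)]
```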

  12. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely, however, that national views of good governance reflect different political cultures and institutional heritages. Fourteen national codes of conduct are analyzed. The findings suggest that public values converge and that they match model codes from the United Nations and the European Council as well as conceptions of good governance from other international organizations. While values converge, they are balanced and communicated differently, and seem to some extent to be translated into the national cultures. The set of global public values derived from this analysis include public interest, regime dignity...

  13. Synthetic histone code.

    Science.gov (United States)

    Fischle, Wolfgang; Mootz, Henning D; Schwarzer, Dirk

    2015-10-01

    Chromatin is the universal template of genetic information in all eukaryotic cells. This complex of DNA and histone proteins not only packages and organizes genomes but also regulates gene expression. A multitude of posttranslational histone modifications and their combinations are thought to constitute a code for directing distinct structural and functional states of chromatin. Methods of protein chemistry, including protein semisynthesis, amber suppression technology, and cysteine bioconjugation, have enabled the generation of so-called designer chromatin containing histones in defined and homogeneous modification states. Several of these approaches have matured from proof-of-concept studies into efficient tools and technologies for studying the biochemistry of chromatin regulation and for interrogating the histone code. We summarize pioneering experiments and recent developments in this exciting field of chemical biology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
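    The Fourier-domain identity underlying this speedup can be illustrated with a short NumPy sketch. This is not the ADMM solver described in the record, only a check that the sum of M circular convolutions collapses to pointwise products in the frequency domain (dimensions and data here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 4                          # signal length, dictionary size
D = rng.standard_normal((M, N))       # dictionary filters, zero-padded to length N
X = rng.standard_normal((M, N))       # coefficient maps

def circconv(d, x):
    """Circular convolution, direct O(N^2) evaluation."""
    n = len(d)
    return np.array([sum(d[k] * x[(i - k) % n] for k in range(n))
                     for i in range(n)])

# Direct reconstruction: sum over m of d_m (circularly convolved with) x_m
direct = sum(circconv(D[m], X[m]) for m in range(M))

# Same reconstruction via per-row FFTs, pointwise products and one inverse FFT
via_fft = np.real(np.fft.ifft(np.sum(np.fft.fft(D, axis=1) *
                                     np.fft.fft(X, axis=1), axis=0)))

assert np.allclose(direct, via_fft)
```

    The FFT route touches each filter and coefficient map only once per transform, which is where the O(MN log N) cost in the abstract comes from.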

  15. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder and a Graphical User Interface.

  16. Hydra Code Release

    OpenAIRE

    Couchman, H. M. P.; Pearce, F. R.; Thomas, P. A.

    1996-01-01

    Comment: A new version of the AP3M-SPH code, Hydra, is now available as a tar file from the following sites: http://coho.astro.uwo.ca/pub/hydra/hydra.html , http://star-www.maps.susx.ac.uk/~pat/hydra/hydra.html . The release now also contains a cosmological initial conditions generator, documentation, an installation guide and installation tests. A LaTeX version of the documentation is included here.

  17. Adaptive Hybrid Picture Coding.

    Science.gov (United States)

    1983-02-05

    process, namely displacement or motion detection and estimation. DISPLACEMENT AND MOTION: Simply stated, motion is defined to be a time series of spatial...regressive model in that the prediction is made with respect to a time series. That is, future values of a time series are to be predicted on...B8 - 90. Robbins, John D., and Netravali, Arun N., "Interframe Television Coding Using Movement Compensation," International Conference on

  18. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  19. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  20. On Some Ternary LCD Codes

    OpenAIRE

    Darkunde, Nitin S.; Patil, Arunkumar R.

    2018-01-01

    The main aim of this paper is to study $LCD$ codes. Linear codes with complementary dual ($LCD$) are those codes whose intersection with their dual code is $\{0\}$. In this paper we give an alternative proof of Massey's theorem \cite{8}, which is one of the most important characterizations of $LCD$ codes. Let $LCD[n,k]_3$ denote the maximum of possible values of $d$ among $[n,k,d]$ ternary $LCD$ codes. In \cite{4}, the authors have given an upper bound on $LCD[n,k]_2$ and extended th...
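    Massey's characterization referred to above states that a code with generator matrix G is LCD exactly when G·Gᵀ is nonsingular over the underlying field. A minimal sketch of that criterion for tiny hand-picked ternary generator matrices (illustrative only; the matrices are not from the paper):

```python
def det_int(M):
    """Integer determinant by Laplace expansion (fine for tiny matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det_int([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def is_lcd(G, p=3):
    """Massey's criterion: the code is LCD iff G.G^T is nonsingular over GF(p)."""
    k, n = len(G), len(G[0])
    GGt = [[sum(G[i][t] * G[j][t] for t in range(n)) % p for j in range(k)]
           for i in range(k)]
    return det_int(GGt) % p != 0

# An identity-like ternary generator gives G.G^T = I, so the code is LCD ...
print(is_lcd([[1, 0, 0], [0, 1, 0]]))   # True
# ... while here G.G^T is the all-twos matrix mod 3, singular, so it is not
print(is_lcd([[1, 0, 1], [0, 1, 2]]))   # False
```

    For the second matrix, every inner product of rows reduces to 2 mod 3, so det(G·Gᵀ) = 2·2 − 2·2 = 0 and the code intersects its dual nontrivially.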

  1. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code can provide better packet loss protection with lower computational complexity than the tTN code.
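    The erasure-protection idea behind tornado-type codes reduces, in its simplest case, to packets combined by XOR, so that a lost packet is recoverable from the survivors plus a parity packet. A minimal single-parity sketch (not the tTN/cTN construction itself):

```python
from functools import reduce

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Three data packets plus one XOR parity packet
packets = [b"abcd", b"efgh", b"ijkl"]
parity = reduce(xor_bytes, packets)

# Erase one packet in transit; any single erasure is recoverable,
# because XORing the survivors with the parity cancels them out
received = [packets[0], None, packets[2]]
survivors = [p for p in received if p is not None]
recovered = reduce(xor_bytes, survivors + [parity])

assert recovered == b"efgh"
```

    Tornado codes extend this by XORing many sparse, layered subsets of packets, which is the multi-level structure the cTN construction above seeks to simplify.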

  2. Allele coding in genomic evaluation

    DEFF Research Database (Denmark)

    Strandén, Ismo; Christensen, Ole Fredslund

    2011-01-01

    this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. \\paragraph*{Results:} Theoretical derivations showed that parameter...... coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being the best. \\paragraph*{Conclusions:} Different allele coding methods lead to the same inference in the marker-based and equivalent models when a fixed...

  3. Polynomial weights and code constructions.

    Science.gov (United States)

    Massey, J. L.; Costello, D. J., Jr.; Justesen, J.

    1973-01-01

    Study of certain polynomials with the 'weight-retaining' property that any linear combination of these polynomials with coefficients in a general finite field has Hamming weight at least as great as that of the minimum-degree polynomial included. This fundamental property is used in applications to Reed-Muller codes, a new class of 'repeated-root' binary cyclic codes, two new classes of binary convolutional codes derived from binary cyclic codes, and two new classes of binary convolutional codes derived from Reed-Solomon codes.

  4. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and were not explored before.

  5. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.......Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes....

  6. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Høholdt, Tom

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.......Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes....

  7. New Code Matched Interleaver for Turbo Codes with Short Frames

    Directory of Open Access Journals (Sweden)

    LAZAR, G. A.

    2010-02-01

    Turbo codes are a parallel concatenation of two or more convolutional codes separated by interleavers, so their performance is influenced not just by the constituent encoders but also by the interleaver. For short-frame turbo codes, the selection of a proper interleaver becomes critical. This paper presents a new algorithm for obtaining a code matched interleaver leading to a very high minimum distance and improved performance.

  8. Code Flows : Visualizing Structural Evolution of Source Code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    2008-01-01

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual

  9. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    P. C. Catherine. K. M. S Soyjaudah. Department of Electrical and Electronics Engineering ... in the 1960's, Gallager in his PhD thesis worked on low-density parity-check (LDPC) codes (Gallager 1963). ..... In any case however, it is hoped that the ideas behind TG codes will help in the development of future intelligent coding ...

  10. Code flows : Visualizing structural evolution of source code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual

  11. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    This work proposes a blend of the two technologies, yielding a code that we nicknamed Turbo-Gallager or TG Code. The code has additional “intelligence” compared to its parents. It detects and corrects the so-called “undetected errors” and recovers from individual decoder failure by making use of a network of decoders.

  12. Time-based and event-based prospective memory in autism spectrum disorder: The roles of executive function and theory of mind, and time estimation

    OpenAIRE

    Williams, D.; Boucher, J.; Lind, S. E.; Jarrold, C.

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but und...

  13. Code Generation with Templates

    CERN Document Server

    Arnoldus, Jeroen; Serebrenik, A

    2012-01-01

    Templates are used to generate all kinds of text, including computer code. Over the last decade, the use of templates has gained a lot of popularity due to the increase of dynamic web applications. Templates are a tool for programmers, and implementations of template engines are most often based on practical experience rather than on a theoretical background. This book reveals the mathematical background of templates and shows interesting findings for improving the practical use of templates. First, a framework to determine the necessary computational power for the template metalanguage is presen

  14. Cinder begin creative coding

    CERN Document Server

    Rijnieks, Krisjanis

    2013-01-01

    Presented in an easy-to-follow, tutorial-style format, this book will lead you step-by-step through the multi-faceted uses of Cinder. "Cinder: Begin Creative Coding" is for people who already have experience in programming. It can serve as a transition from a previous background in Processing, Java in general, JavaScript, openFrameworks, C++ in general or ActionScript to the framework covered in this book, namely Cinder. If you like quick and easy to follow tutorials that will let you see progress in less than an hour - this book is for you. If you are searching for a book that will explain al

  15. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use, with very little effort.

  16. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate...... (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  17. Authorship Attribution of Source Code

    Science.gov (United States)

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  18. Coded nanoscale self-assembly

    Indian Academy of Sciences (India)

    the number of starting particles. Figure 6. Coded self-assembly results in specific shapes. When the constituent particles are coded to combine only according to certain defined rules, it always manages to generate the same shape. The simplest case of linear coding with the multiseed option is presented here. in place the resultant ...

  19. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the

  20. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy...

  1. Coding Issues in Grounded Theory

    Science.gov (United States)

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  2. Product Codes for Optical Communication

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    2002-01-01

    Many optical communication systems might benefit from forward error correction. We present a hard-decision decoding algorithm for the "Block Turbo Codes", suitable for optical communication, which makes this coding scheme an alternative to Reed-Solomon codes....

  3. Time-Varying Space-Only Codes for Coded MIMO

    CERN Document Server

    Duyck, Dieter; Takawira, Fambirai; Boutros, Joseph J; Moeneclaey, Marc

    2012-01-01

    Multiple antenna (MIMO) devices are widely used to increase reliability and information bit rate. Optimal error rate performance (full diversity and large coding gain), for unknown channel state information at the transmitter and for maximal rate, can be achieved by approximately universal space-time codes, but this comes at the price of large detection complexity, infeasible for most practical systems. We propose a new coded modulation paradigm: an error-correction outer code with a space-only but time-varying precoder (as inner code). We refer to the latter as an Ergodic Mutual Information (EMI) code. The EMI code achieves the maximal multiplexing gain, and full diversity is proved in terms of the outage probability. Contrary to most of the literature, our work is not based on the elegant but difficult classical algebraic MIMO theory. Instead, the relation between MIMO and parallel channels is exploited. The theoretical proof of full diversity is corroborated by means of numerical simulations for many MIMO scenarios, in te...

  4. Genetic code for sine

    Science.gov (United States)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using a square-root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square-root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
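    Bhaskara I's approximation mentioned in this record, sin x ≈ 16x(π − x)/(5π² − 4x(π − x)) on [0, π], is easy to check numerically. A sketch (the paper's own square-root "genetic code" method is not reproduced here):

```python
import math

def bhaskara_sin(x):
    """Bhaskara I's rational approximation to sin(x), valid on [0, pi]."""
    return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))

# The formula is exact at 0, pi/2 and pi (up to rounding); its maximum
# absolute error over [0, pi] is roughly 0.0016.
max_err = max(abs(bhaskara_sin(i * math.pi / 1000) - math.sin(i * math.pi / 1000))
              for i in range(1001))

assert abs(bhaskara_sin(math.pi / 2) - 1) < 1e-9
assert max_err < 0.002
```

    A remarkable accuracy for a single rational expression with small integer coefficients, which is why the formula is still used as a fast sine approximation.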

  5. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  6. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  7. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  8. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being...... oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...

  9. Influence of intra-event-based flood regime on sediment flow behavior from a typical agro-catchment of the Chinese Loess Plateau

    Science.gov (United States)

    Zhang, Le-Tao; Li, Zhan-Bin; Wang, He; Xiao, Jun-Bo

    2016-07-01

    The pluvial erosion process is significantly affected by tempo-spatial patterns of flood flows. However, despite their importance, only a few studies have investigated the sediment flow behavior that is driven by different flood regimes. The study aims to investigate the effect of intra-event-based flood regimes on the dynamics of sediment exports at Tuanshangou catchment, a typical agricultural catchment (unmanaged) in the hilly loess region on the Chinese Loess Plateau. Measurements of 193 flood events and 158 sediment-producing events were collected from Tuanshangou station between 1961 and 1969. The combined methods of hierarchical clustering approach, discriminant analysis and One-Way ANOVA were used to classify the flood events in terms of their event-based flood characteristics, including flood duration, peak discharge, and event flood runoff depth. The 193 flood events were classified into five regimes, and the mean statistical features of each regime significantly differed. Regime A includes flood events with the shortest duration (76 min), minimum flood crest (0.045 m³ s⁻¹), least runoff depth (0.2 mm), and highest frequency. Regime B includes flood events with a medium duration (274 min), medium flood crest (0.206 m³ s⁻¹), and minor runoff depth (0.7 mm). Regime C includes flood events with the longest duration (822 min), medium flood crest (0.236 m³ s⁻¹), and medium runoff depth (1.7 mm). Regime D includes flood events with a medium duration (239 min), large flood crest (4.21 m³ s⁻¹), and large runoff depth (10 mm). Regime E includes flood events with a medium duration (304 min), maximum flood crest (8.62 m³ s⁻¹), and largest runoff depth (25.9 mm). The sediment yield by different flood regimes is ranked as follows: Regime E > Regime D > Regime B > Regime C > Regime A. In terms of event-based average and maximum suspended sediment concentration, these regimes are ordered as follows: Regime E > Regime D > Regime C > Regime B > Regime A. Regimes D and E

  10. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely the finite field Fourier transform and the Euclidean algorithm f...

  11. Commentary: Is the glass half empty? Code blue training in the modern era.

    Science.gov (United States)

    Yang, Julius; Howell, Michael D

    2011-06-01

    Skilled management of cardiopulmonary resuscitation, or responding to a "code blue," is widely considered an important training objective during internal medicine residency. Gaining proficiency in managing a code blue typically depends on event-based experiential learning. In this issue of Academic Medicine, Mickelsen and colleagues report their use of schedule-based stochastic simulation estimates matched with observed code blue data to model the number of annual opportunities a first-year resident has to participate in code blue events. Their data offer compelling evidence that trainees in 2008 had much less opportunity (83% less) to participate in code blue events than did their predecessors in 2002. Mickelsen and coinvestigators speculate that this reduction could be attributable to quality improvement initiatives that may have reduced the total number of code blue situations, as well as to duty hours restrictions that reduced the residents' overall availability to participate. The authors of this commentary discuss the general influence of secular trends on educational needs, and they describe possible strategies to compensate for less "in-the-field" exposure by maximizing the "learning yield per event" and using simulation training methods. Finally, the authors consider the question of whether code blue training remains an appropriate goal for general medicine trainees in the face of evolving trends in health care systems.

  12. Peripheral coding of taste

    Science.gov (United States)

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  13. Code des baux 2018

    CERN Document Server

    Vial-Pedroletti, Béatrice; Kendérian, Fabien; Chavance, Emmanuelle; Coutan-Lapalus, Christelle

    2017-01-01

    The Code des baux 2018 offers extremely practical, reliable content, up to date as of 1 August 2017. This 16th edition notably incorporates: the decree of 27 July 2017 on the evolution of certain rents in the context of a new lease or a lease renewal, issued in application of Article 18 of Law No. 89-462 of 6 July 1989; the law of 27 January 2017 on equality and citizenship; the law of 9 December 2016 on transparency, the fight against corruption, and the modernization of economic life; and the law of 18 November 2016 on the modernization of justice in the 21st century.

  14. [Neural codes for perception].

    Science.gov (United States)

    Romo, R; Salinas, E; Hernández, A; Zainos, A; Lemus, L; de Lafuente, V; Luna, R

    This article describes experiments designed to reveal the neural codes associated with the perception and processing of tactile information. The results of these experiments have shown the neural activity correlated with tactile perception. The neurones of the primary somatosensory cortex (S1) represent the physical attributes of tactile stimuli, and we found that these representations correlate with tactile perception. By means of intracortical microstimulation we demonstrated a causal relationship between S1 activity and tactile perception. The motor areas of the frontal lobe link sensory and motor representations while decisions are being made. S1 generates neural representations of the somatosensory stimuli which seem to be sufficient for tactile perception. These neural representations are subsequently processed by areas central to S1 and appear useful in perception, memory and decision making.

  15. Code-labelling

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    The code-labelling exercise is an attempt to apply natural language education techniques to the challenge of teaching introductory programming to non-STEM novices in higher education. This paper presents findings from a study exploring the use of natural language teaching techniques in programming education, collected in an Action Research cycle. The results support the use of a structural approach to teaching programming to this target audience; in particular, the translation-grammar method seems to integrate well with programming education. The paper also explores the potential underlying reasons. It seems the exercise invokes an assimilation of students' existing cognitive schemata and supports a deep-learning experience. The exercise is an invitation to other teachers to create further iterations to improve their own teaching. It also seeks to enrich the portfolio of teaching activities.

  16. Transionospheric Propagation Code (TIPC)

    Science.gov (United States)

    Roussel-Dupre, Robert; Kelley, Thomas A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Laboratory to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in FORTRAN 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, delta times of arrival (DTOA) study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of DTOAs vs TECs for a specified pair of receivers.
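    The first-order TEC characterization above rests on the standard dispersive group-delay relation, delay = 40.3 * TEC / (c * f^2). The sketch below illustrates that relation and the frequency-pair DTOA it implies; the frequencies and TEC value are illustrative choices of ours, not parameters from the TIPC code.

```python
C = 299792458.0   # speed of light, m/s
K = 40.3          # standard first-order ionospheric constant, m^3/s^2

def group_delay(tec: float, freq_hz: float) -> float:
    """First-order ionospheric group delay in seconds for a signal at
    freq_hz traversing a total electron content tec (electrons/m^2)."""
    return K * tec / (C * freq_hz ** 2)

def dtoa(tec: float, f1: float, f2: float) -> float:
    """Delta time of arrival between two frequencies: the dispersive
    signature that can be inverted to estimate TEC and remove the
    ionosphere to first order."""
    return group_delay(tec, f1) - group_delay(tec, f2)

# 1 TEC unit = 1e16 electrons/m^2; a hypothetical VHF pair at 50 and 75 MHz
print(dtoa(10 * 1e16, 50e6, 75e6))  # a few microseconds of differential delay
```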

  17. Galois LCD Codes over Finite Fields

    OpenAIRE

    Liu, Xiusheng; Fan, Yun; Liu, Hualu

    2017-01-01

    In this paper, we study complementary dual codes in a more general setting (called Galois LCD codes) using a uniform method. A necessary and sufficient condition for linear codes to be Galois LCD codes is determined, and constacyclic codes that are Galois LCD codes are characterized. Some illustrative examples of constacyclic codes that are Galois LCD MDS codes are provided as well. In particular, we study Hermitian LCD constacyclic codes. Finally, we present a construction of a class of ...
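    For the classical (Euclidean) special case of the LCD property, Massey's criterion gives a simple computational test: a linear code with generator matrix G is LCD iff G G^T is nonsingular over the field. The sketch below checks this over GF(2) for a toy code of our own choosing; the Galois inner products of the paper generalize this and are not implemented here.

```python
import numpy as np

def is_lcd_binary(G: np.ndarray) -> bool:
    """Massey's criterion: a binary linear code with generator matrix G is
    (Euclidean) LCD iff G @ G.T is nonsingular over GF(2)."""
    M = (G @ G.T) % 2
    k = M.shape[0]
    # Gaussian elimination over GF(2) to test invertibility
    for col in range(k):
        pivot = next((r for r in range(col, k) if M[r, col]), None)
        if pivot is None:
            return False          # singular: hull is nontrivial, not LCD
        M[[col, pivot]] = M[[pivot, col]]
        for r in range(k):
            if r != col and M[r, col]:
                M[r] ^= M[col]    # eliminate the column elsewhere
    return True

# A toy [4,2] binary code (our example, not from the paper)
G = np.array([[1, 0, 1, 1],
              [0, 1, 0, 1]])
print(is_lcd_binary(G))  # -> True
```

    A self-orthogonal generator such as [[1,1,0,0],[0,0,1,1]] fails the test, since every row is orthogonal to every row.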

  18. Quantum Quasi-Cyclic LDPC Codes

    OpenAIRE

    Hagiwara, Manabu; Imai, Hideki

    2007-01-01

    In this paper, a construction of a pair of "regular" quasi-cyclic LDPC codes as ingredient codes for a quantum error-correcting code is proposed. That is, we find quantum regular LDPC codes with various weight distributions. Furthermore, our proposed codes admit many variations in length and code rate. These codes are obtained from a discrete mathematical characterization of the model matrices of quasi-cyclic LDPC codes. Our proposed codes achieve a bounded distance decoding (BDD) bound, or known a...
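    A pair of classical parity-check matrices can serve as ingredient codes for a CSS quantum code only if every X-check commutes with every Z-check, i.e. Hx Hz^T = 0 over GF(2). The sketch below checks that standard condition, illustrated with the Steane 7-qubit code's Hamming check matrix rather than the paper's quasi-cyclic matrices.

```python
import numpy as np

def css_compatible(Hx: np.ndarray, Hz: np.ndarray) -> bool:
    """Two binary parity-check matrices define a valid CSS quantum code
    iff all X-checks commute with all Z-checks: Hx @ Hz.T = 0 (mod 2)."""
    return not np.any((Hx @ Hz.T) % 2)

# Steane's 7-qubit code uses the [7,4] Hamming check matrix for both halves,
# which works because the Hamming code contains its own dual.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
print(css_compatible(H, H))  # -> True
```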

  19. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
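    A systematic LDGM code is just G = [I | P] with a sparse parity part P, so encoding is a sparse matrix-vector product over GF(2). A minimal sketch with arbitrary toy parameters (k, m, and row weight are our choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_ldgm(k: int, m: int, row_weight: int) -> np.ndarray:
    """Sparse parity part P of a systematic LDGM generator G = [I | P]:
    each message bit feeds only `row_weight` of the m parity bits."""
    P = np.zeros((k, m), dtype=np.int64)
    for i in range(k):
        P[i, rng.choice(m, size=row_weight, replace=False)] = 1
    return P

def encode(u: np.ndarray, P: np.ndarray) -> np.ndarray:
    """Systematic encoding: codeword = [message | message @ P mod 2]."""
    return np.concatenate([u, (u @ P) % 2])

P = make_ldgm(k=8, m=4, row_weight=2)
u = rng.integers(0, 2, size=8)
c = encode(u, P)
print(c[:8].tolist() == u.tolist())  # systematic part equals the message
```

    The low density of P is what keeps both encoding and message-passing decoding cheap; the concatenated outer code mentioned in the abstract exists to repair the error floor this sparsity induces.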

  20. Benchmarking Tokamak edge modelling codes

    Science.gov (United States)

    Coster, D. P.; Bonnin, X.; Corrigan, G.; Kirnev, G. S.; Matthews, G.; Spence, J.; Contributors to the EFDA-JET work programme

    2005-03-01

    Tokamak edge modelling codes are in widespread use to interpret and understand existing experiments, and to make predictions for future machines. Little direct benchmarking has been done between the codes, and the users of the codes have tended to concentrate on different experimental machines. An important validation step is to compare the codes for identical scenarios. In this paper, two of the major edge codes, SOLPS (B2.5-Eirene) and EDGE2D-NIMBUS are benchmarked against each other. A set of boundary conditions, transport coefficients, etc. for a JET plasma were chosen, and the two codes were run on the same grid. Initially, large differences were seen in the resulting plasmas. These differences were traced to differing physics assumptions with respect to the parallel heat flux limits. Once these were switched off in SOLPS, or implemented and switched on in EDGE2D-NIMBUS, the remaining differences were small.

  1. Event-based prospective memory performance during subacute recovery following moderate to severe traumatic brain injury in children: Effects of monetary incentives.

    Science.gov (United States)

    McCauley, Stephen R; Pedroza, Claudia; Chapman, Sandra B; Cook, Lori G; Hotz, Gillian; Vásquez, Ana C; Levin, Harvey S

    2010-03-01

    There are very few studies investigating remediation of event-based prospective memory (EB-PM) impairments following traumatic brain injury (TBI). To address this, we used 2 levels of motivational enhancement (dollars vs. pennies) to improve EB-PM in children with moderate to severe TBI in the subacute recovery phase. Children with orthopedic injuries (OI; n = 61), moderate (n = 28), or severe (n = 30) TBI were compared. A significant Group x Motivation Condition interaction was found, F(2, 115) = 3.73: the larger incentive significantly improved EB-PM in children with moderate, but not severe, TBI. Other strategies to improve EB-PM in these children at a similar point in recovery remain to be identified and evaluated.

  2. Patterns of cortical thinning in relation to event-based prospective memory performance three months after moderate to severe traumatic brain injury in children.

    Science.gov (United States)

    McCauley, Stephen R; Wilde, Elisabeth A; Merkley, Tricia L; Schnelle, Kathleen P; Bigler, Erin D; Hunter, Jill V; Chu, Zili; Vásquez, Ana C; Levin, Harvey S

    2010-01-01

    While event-based prospective memory (EB-PM) tasks are a familiar part of daily life for children, currently no data exists concerning the relation between EB-PM performance and brain volumetrics after traumatic brain injury (TBI). This study investigated EB-PM in children (7 to 17 years) with moderate to severe TBI or orthopedic injuries. Participants performed an EB-PM task and concurrently underwent neuroimaging at three months postinjury. Surface reconstruction and cortical thickness analysis were performed using FreeSurfer software. Cortical thickness was significantly correlated with EB-PM (adjusting for age). Significant thinning in the left (dorsolateral and inferior prefrontal cortex, anterior and posterior cingulate, temporal lobe, fusiform, and parahippocampal gyri), and right hemispheres (dorsolateral, inferior, and medial prefrontal cortex, cingulate, and temporal lobe) correlated positively and significantly with EB-PM performance; findings are comparable to those of functional neuroimaging and lesion studies of EB-PM.

  3. Low complexity hevc intra coding

    OpenAIRE

    Ruiz Coll, José Damián

    2016-01-01

    Over the last few decades, much research has focused on the development and optimization of video codecs for media distribution to end-users via the Internet, broadcasts or mobile networks, but also for videoconferencing and for the recording on optical disks for media distribution. Most of the video coding standards for delivery are characterized by using a high efficiency hybrid schema, based on inter-prediction coding for temporal picture decorrelation, and intra-prediction coding for spat...

  4. IRIG Serial Time Code Formats

    Science.gov (United States)

    2016-08-01

    and G. It should be noted that this standard reflects the present state of the art in serial time code formatting and is not intended to constrain...separation for visual resolution. The LSB occurs first except for the fractional seconds subword that follows the day-of-year subword. The BCD TOY code...and P6 to complete the BCD time code word. An index marker occurs between the decimal digits in each subword to provide separation for visual

  5. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error correcting BCH is the best choice for the component code in such systems.

  6. Indices for Testing Neural Codes

    OpenAIRE

    Jonathan D. Victor; Nirenberg, Sheila

    2008-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is ...
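    The comparison the abstract describes, transmitted information through a candidate code versus through behavior, reduces to computing mutual information from discrete joint distributions. A minimal sketch with made-up toy tables (the distributions are ours, purely for illustration):

```python
import numpy as np

def mutual_info(joint: np.ndarray) -> float:
    """Mutual information in bits of a discrete joint distribution p(s, r)."""
    p = joint / joint.sum()
    ps = p.sum(axis=1, keepdims=True)      # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)      # response marginal
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz])))

# Stimulus vs. behavioral response: informative (diagonal-heavy) table
p_sr = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
# Stimulus vs. a candidate neural code that carries nothing
p_sc = np.array([[0.25, 0.25],
                 [0.25, 0.25]])

# If I(S; code) < I(S; behavior), the candidate code cannot account for
# the stimulus-response relationship and is ruled out.
print(mutual_info(p_sr) > mutual_info(p_sc))  # -> True
```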

  7. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-04-11

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicking and 4D light field view synthesis.
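    The consensus-ADMM solver the paper proposes is beyond a short sketch, but the core subproblem, minimizing 0.5||x - Dz||^2 + lambda||z||_1 for sparse codes z, can be illustrated with plain (non-convolutional) ISTA. Everything below, including the dictionary and sizes, is a made-up toy, not the paper's method:

```python
import numpy as np

def ista(D: np.ndarray, x: np.ndarray, lam: float, steps: int = 200) -> np.ndarray:
    """Iterative shrinkage-thresholding for min_z 0.5||x - D z||^2 + lam ||z||_1,
    the basic sparse-coding subproblem underlying CSC."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(steps):
        g = D.T @ (D @ z - x)                  # gradient of the quadratic term
        z = z - g / L                          # gradient step
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return z

rng = np.random.default_rng(1)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)                 # unit-norm dictionary atoms
z_true = np.zeros(50)
z_true[[3, 17]] = [1.5, -2.0]                  # a 2-sparse ground truth
x = D @ z_true
z = ista(D, x, lam=0.05)
print(np.count_nonzero(np.abs(z) > 0.1))       # only a few active atoms
```

    The convolutional variant replaces D z with a sum of filter convolutions; the consensus formulation then splits that objective across data so each subproblem stays small.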

  8. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between coding theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combinatorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  9. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  10. The FLUKA code: an overview

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F [University of Pavia and INFN (Italy); Battistoni, G [University of Milan and INFN (Italy); Campanella, M; Carboni, M; Cerutti, F [University of Milan and INFN (Italy); Empl, A [University of Houston, Houston (United States); Fasso, A [SLAC, Stanford (United States); Ferrari, A [CERN, CH-1211 Geneva (Switzerland); Gadioli, E [University of Milan and INFN (Italy); Garzelli, M V [University of Milan and INFN (Italy); Lantz, M [University of Milan and INFN (Italy); Liotta, M [University of Pavia and INFN (Italy); Mairani, A [University of Pavia and INFN (Italy); Mostacci, A [Laboratori Nazionali di Frascati, INFN (Italy); Muraro, S [University of Milan and INFN (Italy); Ottolenghi, A [University of Pavia and INFN (Italy); Pelliccioni, M [Laboratori Nazionali di Frascati, INFN (Italy); Pinsky, L [University of Houston, Houston (United States); Ranft, J [Siegen University, Siegen (Germany); Roesler, S [CERN, CH-1211 Geneva (Switzerland); Sala, P R [University of Milan and INFN (Italy); Scannicchio, D [University of Pavia and INFN (Italy); Trovati, S [University of Pavia and INFN (Italy); Villari, R; Wilson, T [Johnson Space Center, NASA (United States); Zapp, N [Johnson Space Center, NASA (United States); Vlachoudis, V [CERN, CH-1211 Geneva (Switzerland)

    2006-05-15

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  11. The FLUKA Code: an Overview

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U.

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  12. Understanding perception through neural "codes".

    Science.gov (United States)

    Freeman, Walter J

    2011-07-01

    A major challenge for cognitive scientists is to deduce and explain the neural mechanisms of the rapid transposition between stimulus energy and recalled memory-between the specific (sensation) and the generic (perception)-in both material and mental aspects. Researchers are attempting three explanations in terms of neural codes. The microscopic code: cellular neurobiologists correlate stimulus properties with the rates and frequencies of trains of action potentials induced by stimuli and carried by topologically organized axons. The mesoscopic code: cognitive scientists formulate symbolic codes in trains of action potentials from feature-detector neurons of phonemes, lines, odorants, vibrations, faces, etc., that object-detector neurons bind into representations of stimuli. The macroscopic code: neurodynamicists extract neural correlates of stimuli and associated behaviors in spatial patterns of oscillatory fields of dendritic activity, which self-organize and evolve on trajectories through high-dimensional brain state space. This multivariate code is expressed in landscapes of chaotic attractors. Unlike other scientific codes, such as DNA and the periodic table, these neural codes have no alphabet or syntax. They are epistemological metaphors that experimentalists need to measure neural activity and engineers need to model brain functions. My aim is to describe the main properties of the macroscopic code and the grand challenge it poses: how do very large patterns of textured synchronized oscillations form in cortex so quickly? © 2010 IEEE

  13. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  14. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
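    The second lifting stage described above replaces each edge of the intermediate protograph with a circulant permutation. A minimal sketch of that step; the base matrix and shift values are toy choices of ours, not codes from the patent:

```python
import numpy as np

def lift(B: np.ndarray, Z: int, shifts: np.ndarray) -> np.ndarray:
    """Circulant lifting: each 1 in the base (proto)matrix B becomes a
    Z x Z identity cyclically shifted by shifts[i, j]; each 0 becomes
    the Z x Z zero block. The result is the full parity-check matrix."""
    m, n = B.shape
    H = np.zeros((m * Z, n * Z), dtype=np.int64)
    for i in range(m):
        for j in range(n):
            if B[i, j]:
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(
                    np.eye(Z, dtype=np.int64), shifts[i, j], axis=1)
    return H

B = np.array([[1, 1, 1, 0],
              [0, 1, 1, 1]])        # toy protograph
shifts = np.array([[0, 1, 2, 0],
                   [0, 2, 0, 1]])   # arbitrary circulant shifts
H = lift(B, Z=4, shifts=shifts)
print(H.shape)  # -> (8, 16)
```

    Because each nonzero base entry expands to a permutation, the lifted code inherits the protograph's degree distribution: every lifted check row here has weight 3, matching its base row.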

  15. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new presentations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  16. The FLUKA code: an overview

    Science.gov (United States)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fassò, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.; Zapp, N.; Vlachoudis, V.

    2006-05-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  17. Golay and other box codes

    Science.gov (United States)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6 x 4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly, an (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows form a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.
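    The (24,12;8) parameters claimed above can be cross-checked by a different, standard route: the cyclic [23,12,7] Golay code (generator polynomial octal 5343) extended by an overall parity bit. This brute-force verification is our own sanity check, not the box construction of the abstract:

```python
def gf2_mul(a: int, b: int) -> int:
    """Carry-less product of two GF(2) polynomials packed as integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

# Generator polynomial of the cyclic [23,12,7] Golay code (octal 5343):
# g(x) = x^11 + x^9 + x^7 + x^6 + x^5 + x + 1
G = 0b101011100011

min_wt = 24
for m in range(1, 1 << 12):                   # all 4095 nonzero messages
    c = gf2_mul(m, G)                         # a 23-bit Golay codeword
    c = (c << 1) | (bin(c).count("1") & 1)    # extend with overall parity
    min_wt = min(min_wt, bin(c).count("1"))
print(min_wt)  # -> 8, the minimum distance of the extended code
```

    Since deg m(x) <= 11 keeps every product m(x)g(x) below degree 23, the loop enumerates exactly the 2^12 codewords of the cyclic code; the parity bit lifts the odd minimum weight 7 to 8.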

  18. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    ... theory as evaluation codes. Chapter three introduces graph-based codes, such as Tanner codes and graph codes. In chapter four, we compute the dimension of some graph-based codes with a result combining graph-based codes and subfield subcodes; moreover, some codes in chapter four are optimal or best known for their parameters. In chapter five we study some graph codes with Reed-Solomon component codes. The underlying graph is well known and widely used for its good characteristics, which helps us to compute the dimension of the graph codes. We also introduce a combinatorial concept related to the iterative encoding of graph codes with MDS component codes. The last chapter deals with affine Grassmann codes and Grassmann codes. We begin with some previously known codes and prove that they are also Tanner codes of the incidence graph of the point-line partial geometry...

  19. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  20. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. 
Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
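    The centered allele coding discussed above (subtracting 2p per marker so regression coefficients have zero mean) is the basis of one common form of the genomic relationship matrix. A minimal sketch of that VanRaden-style construction, with made-up genotypes; the allele frequencies here are estimated from the sample itself, which is one of several possible choices:

```python
import numpy as np

def grm(M: np.ndarray) -> np.ndarray:
    """Genomic relationship matrix under centered allele coding:
    genotypes coded 0/1/2 have 2*p subtracted per marker, then
    G = Z Z' / (2 * sum p(1-p))  (VanRaden-style scaling)."""
    p = M.mean(axis=0) / 2.0            # estimated allele frequency per marker
    Z = M - 2.0 * p                     # centered allele coding
    return (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))

M = np.array([[0, 1, 2, 1],
              [1, 1, 0, 2],
              [2, 0, 1, 1]], dtype=float)   # 3 individuals x 4 markers (toy)
G = grm(M)
print(np.allclose(G, G.T))  # -> True: a valid relationship matrix is symmetric
```

    Centering within the sample forces each column of Z to sum to zero, so the rows of G sum to zero here, one concrete way the choice of allele coding changes the matrix, even though (as the abstract notes) breeding-value inference is unaffected when a general mean is in the model.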

  1. Civil Code, 11 December 1987.

    Science.gov (United States)

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised in 13 June 1978); Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 5) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 6) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 7) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  2. Error coding simulations in C

    Science.gov (United States)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2^8) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code, and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
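    The innermost (n, n-16) CRC layer appends 16 parity bits computed as the remainder of a polynomial division. As an illustrative sketch (not the simulator's actual C module), the following computes a CRC-16 with the CCITT polynomial commonly associated with CCSDS telemetry framing; the polynomial and initial value are assumptions about the variant intended:

```python
def crc16_ccitt(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
    """Bitwise CRC-16, MSB first (CCITT polynomial x^16 + x^12 + x^5 + 1)."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

frame = b"123456789"
print(hex(crc16_ccitt(frame)))  # 0x29b1, the published check value for this CRC variant
```

    Any single-bit corruption of the frame changes the remainder, which is how the receiver detects residual errors that escape the outer codes.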

  3. Asymmetric Quantum Codes on Toric Surfaces

    DEFF Research Database (Denmark)

    Hansen, Johan P.

    2017-01-01

    Asymmetric quantum error-correcting codes are quantum codes defined over biased quantum channels: qubit-flip and phase-shift errors may have equal or different probabilities. The code construction is the Calderbank-Shor-Steane construction based on two linear codes. We present families of toric surfaces, toric codes and associated asymmetric quantum error-correcting codes.

  4. Moving code - Sharing geoprocessing logic on the Web

    Science.gov (United States)

    Müller, Matthias; Bernard, Lars; Kadner, Daniel

    2013-09-01

    Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.

  5. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on an earlier version developed by a group of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations for the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on timescales corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  6. Code & order in polygonal billiards

    OpenAIRE

    Bobok, Jozef; Troubetzkoy, Serge

    2011-01-01

    Two polygons $P,Q$ are code equivalent if there are billiard orbits $u,v$ which hit the same sequence of sides and such that the projections of the orbits are dense in the boundaries $\partial P, \partial Q$. Our main results show when code equivalent polygons have the same angles, resp. are similar, resp. affinely similar.

  7. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  8. Grassmann codes and Schubert unions

    DEFF Research Database (Denmark)

    Hansen, Johan Peder; Johnsen, Trygve; Ranestad, Kristian

    2009-01-01

    We study subsets of Grassmann varieties over a field, such that these subsets are unions of Schubert cycles with respect to a fixed flag. We study such sets in detail, and give applications to coding theory, in particular to Grassmann codes. For much is known about such Schubert unions with a ...

  9. NETWORK CODING BY BEAM FORMING

    DEFF Research Database (Denmark)

    2013-01-01

    Network coding by beam forming in networks, for example, in single frequency networks, can provide aid in increasing spectral efficiency. When network coding by beam forming and user cooperation are combined, spectral efficiency gains may be achieved. According to certain embodiments, a method...

  10. Code breaking in the pacific

    CERN Document Server

    Donovan, Peter

    2014-01-01

    Covers the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. Describes, explains and analyzes the code breaking techniques developed during the war in the Pacific. Exposes the blunders (in code construction and use) made by the Japanese Navy that led to significant US Naval victories.

  11. Squares of Random Linear Codes

    DEFF Research Database (Denmark)

    Cascudo Pueyo, Ignacio; Cramer, Ronald; Mirandola, Diego

    2015-01-01

    Given a linear code $C$, one can define the $d$-th power of $C$ as the span of all componentwise products of $d$ elements of $C$. A power of $C$ may quickly fill the whole space. Our purpose is to answer the following question: does the square of a code ``typically'' fill the whole space? We give...
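    The square of a code can be computed directly from a generator matrix: over GF(2) the componentwise product is bitwise AND, and by bilinearity it suffices to span the pairwise products of generator rows. A small illustrative sketch (not code from the paper), representing codewords as integer bitmasks:

```python
import itertools
import random

def gf2_rank(vectors):
    """Rank over GF(2); vectors are ints used as bitmasks (XOR-basis reduction)."""
    basis = []
    for v in vectors:
        for b in basis:
            v = min(v, v ^ b)
        if v:
            basis.append(v)
    return len(basis)

def square_dim(generators):
    """Dimension of C^2, the span of componentwise products of codewords of C.
    By bilinearity over GF(2), the pairwise ANDs of generator rows suffice."""
    prods = [g & h for g, h in
             itertools.combinations_with_replacement(generators, 2)]
    return gf2_rank(prods)

random.seed(0)
n, k = 20, 6
G = [random.getrandbits(n) for _ in range(k)]
print("dim C =", gf2_rank(G), " dim C^2 =", square_dim(G), " ambient n =", n)
```

    For random G with k(k+1)/2 well above n, the square typically fills the whole n-dimensional space, which is the phenomenon the paper quantifies.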

  12. Interleaver Design for Turbo Coding

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl; Zyablov, Viktor

    1997-01-01

    By a combination of construction and random search based on a careful analysis of the low weight words and the distance properties of the component codes, it is possible to find interleavers for turbo coding with a high minimum distance. We have designed a block interleaver with permutations...
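    The baseline structure being tuned here is a rectangular block interleaver: symbols are written row-wise and read column-wise, so a burst of adjacent channel errors lands at least `rows` positions apart after deinterleaving. A minimal sketch of that baseline (the paper's designed permutations are more elaborate than this):

```python
def block_interleave(symbols, rows, cols):
    """Write row-by-row into a rows x cols array, read out column-by-column."""
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(symbols, rows, cols):
    # Deinterleaving is interleaving with the dimensions swapped.
    return block_interleave(symbols, cols, rows)

seq = list(range(12))
il = block_interleave(seq, 3, 4)
print(il)  # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
assert block_deinterleave(il, 3, 4) == seq
```

    For turbo codes, the interleaver also shapes the weight spectrum: a permutation that maps low-weight input patterns of one component code away from low-weight patterns of the other raises the minimum distance, which is the property searched for in the paper.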

  13. Flow Analysis of Code Customizations

    DEFF Research Database (Denmark)

    Hessellund, Anders; Sestoft, Peter

    2008-01-01

    Inconsistency between metadata and code customizations is a major concern in modern, configurable enterprise systems. The increasing reliance on metadata, in the form of XML files, and code customizations, in the form of Java files, has led to a hybrid development platform. The expected consisten...

  14. Recommendations for ECG diagnostic coding

    NARCIS (Netherlands)

    Bonner, R.E.; Caceres, C.A.; Cuddy, T.E.; Meijler, F.L.; Milliken, J.A.; Rautaharju, P.M.; Robles de Medina, E.O.; Willems, J.L.; Wolf, H.K.; Working Group 'Diagnostic Codes'

    1978-01-01

    The Oxford dictionary defines code as "a body of laws so related to each other as to avoid inconsistency and overlapping". It is obvious that natural language with its high degree of ambiguity does not qualify as a code in the sense of this definition. Everyday experiences provide ample evidence

  15. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.

  16. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on Earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation of why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  17. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

    Full Text Available The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on Earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation of why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  18. Tristan code and its application

    Science.gov (United States)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications, including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a global geospace general circulation (GGCM) model with predictive capability (for the Space Weather Program) is discussed.

  19. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, shifting processing steps conventionally performed at the video encoder side to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  20. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  1. High Efficiency Video Coding: Coding Tools and Specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  2. Orthogonality of binary codes derived from Reed-Solomon codes

    Science.gov (United States)

    Retter, Charles T.

    1991-07-01

    A simple method is developed for determining the orthogonality of binary codes derived from Reed-Solomon codes and other cyclic codes of length 2^m - 1 over GF(2^m) for m bits. Depending on the spectra of the codes, it is sufficient to test a small number of single-frequency pairs for orthogonality, and a pair of bases may be tested in each case simply by summing the appropriate powers of elements of the dual bases. This simple test can be used to find self-orthogonal codes. For even values of m, the author presents a technique that can be used to choose a basis that produces a self-orthogonal, doubly-even code in certain cases, particularly when m is highly composite. If m is a power of 2, this technique can be used to find self-dual bases for GF(2^m). Although the primary emphasis is on testing for self-orthogonality, the fundamental theorems presented apply also to the orthogonality of two different codes.
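    The underlying criterion is that a binary code is self-orthogonal iff every inner product of generator rows (each row paired with itself included) vanishes mod 2. A small sketch of that generic check, using the [8,4] extended Hamming (Reed-Muller RM(1,3)) code as a well-known self-dual example; this does not implement the paper's dual-basis summation shortcut:

```python
import itertools

def is_self_orthogonal(gen_rows):
    """A binary code is self-orthogonal iff the bitwise AND of any two
    generator rows (including a row with itself) has even Hamming weight."""
    return all(bin(a & b).count("1") % 2 == 0
               for a, b in itertools.combinations_with_replacement(gen_rows, 2))

# Generator of the [8,4,4] extended Hamming code (RM(1,3)), a self-dual code.
G = [0b11111111, 0b11110000, 0b11001100, 0b10101010]
print(is_self_orthogonal(G))              # True
print(is_self_orthogonal([0b110, 0b011]))  # False: cross product has odd weight
```

    The paper's contribution is avoiding this O(k^2 n) pairwise check for Reed-Solomon-derived codes by testing only a few single-frequency pairs via the dual basis.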

  3. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing, among many other areas. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems that will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.

  4. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  5. The Flutter Shutter Code Calculator

    Directory of Open Access Journals (Sweden)

    Yohann Tendero

    2015-08-01

    Full Text Available The goal of the flutter shutter is to make uniform motion blur invertible, by a "fluttering" shutter that opens and closes on a sequence of well chosen sub-intervals of the exposure time interval. In other words, the photon flux is modulated according to a well chosen sequence called the flutter shutter code. This article provides a numerical method that computes optimal flutter shutter codes in terms of mean square error (MSE). We assume that the observed objects follow a known (or learned) random velocity distribution. In this paper, Gaussian and uniform velocity distributions are considered. Snapshots are also optimized taking the velocity distribution into account. For each velocity distribution, the gain of the optimal flutter shutter code with respect to the optimal snapshot in terms of MSE is computed. This symmetric optimization of the flutter shutter and of the snapshot allows to compare on an equal footing both solutions, i.e. camera designs. Optimal flutter shutter codes are demonstrated to improve substantially the MSE compared to classic (patented or not) codes. A numerical method that permits to perform a reverse engineering of any existing (patented or not) flutter shutter codes is also described and an implementation is given. In this case we give the underlying velocity distribution from which a given optimal flutter shutter code comes from. The combination of these two numerical methods furnishes a comprehensive study of the optimization of a flutter shutter that includes a forward and a backward numerical solution.

  6. Surface code implementation of block code state distillation

    Science.gov (United States)

    Fowler, Austin G.; Devitt, Simon J.; Jones, Cody

    2013-01-01

    State distillation is the process of taking a number of imperfect copies of a particular quantum state and producing fewer better copies. Until recently, the lowest overhead method of distilling states produced a single improved |A〉 state given 15 input copies. New block code state distillation methods can produce k improved |A〉 states given 3k + 8 input copies, potentially significantly reducing the overhead associated with state distillation. We construct an explicit surface code implementation of block code state distillation and quantitatively compare the overhead of this approach to the old. We find that, using the best available techniques, for parameters of practical interest, block code state distillation does not always lead to lower overhead, and, when it does, the overhead reduction is typically less than a factor of three. PMID:23736868
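    The raw input-to-output copy counts quoted above can be compared with a few lines of arithmetic. Note that this naive ratio ignores the differing output error rates and surface-code distances that the paper's full analysis accounts for, which is precisely why the apparent advantage of block codes can shrink or vanish:

```python
def inputs_per_output_15to1(rounds: int) -> float:
    """Concatenated 15-to-1 distillation: 15^r input copies per output copy."""
    return 15.0 ** rounds

def inputs_per_output_block(k: int, rounds: int) -> float:
    """Block code distillation: each round turns 3k + 8 inputs into k outputs."""
    return ((3 * k + 8) / k) ** rounds

for k in (2, 8, 32):
    print(f"k={k:2d}, two rounds: 15-to-1 needs "
          f"{inputs_per_output_15to1(2):.0f} copies per output, "
          f"block code needs {inputs_per_output_block(k, 2):.2f}")
```

    For two rounds, 15-to-1 consumes 225 copies per output while the block scheme at k = 32 consumes about 10.6, yet the paper shows that once qubit-hours in the surface code are counted, the realized saving is typically under a factor of three.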

  7. Code-Mixing and Code Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as to determine the factors influencing the occurrence of those forms of code switching and code mixing. This research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapter, the forms of code mixing and code switching in learning activities at Al Mawaddah Boarding School occur among Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in Al Mawaddah Boarding School, regarding the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  8. Lossless Coding with Generalised Criteria

    CERN Document Server

    Charalambous, Charalambos D; Rezaei, Farzad

    2011-01-01

    This paper presents prefix codes which minimize various criteria constructed as a convex combination of maximum codeword length and average codeword length, or maximum redundancy and average redundancy, including a convex combination of the average of an exponential function of the codeword length and the average redundancy. This framework encompasses as special cases several criteria previously investigated in the literature, while relations to universal coding are discussed. The coding algorithm derived is parametric, re-adjusting the initial source probabilities via a weighted probability vector according to a merging rule. The level of desirable merging has implications in applications where the maximum codeword length is bounded.

  9. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  10. Network Coding Fundamentals and Applications

    CERN Document Server

    Medard, Muriel

    2011-01-01

    Network coding is a field of information and coding theory and is a method of attaining maximum information flow in a network. This book is an ideal introduction for the communications and network engineer, working in research and development, who needs an intuitive introduction to network coding and to the increased performance and reliability it offers in many applications.
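    The canonical illustration of attaining max-flow is the butterfly network: two sources share one bottleneck link, and the relay forwards the XOR of the two packets instead of sending them one after the other, so each sink decodes both in a single use of the shared link. A self-contained toy version (illustrative only, not from the book):

```python
# Two sources send bits a and b toward two sinks over the butterfly network.
# Each sink receives one bit directly plus the coded packet from the relay.
def relay(a: int, b: int) -> int:
    """Network-coded packet placed on the bottleneck edge."""
    return a ^ b

def sink1(a_direct: int, coded: int) -> tuple:
    """Sink 1 sees a directly; XOR with the coded packet recovers b."""
    return a_direct, a_direct ^ coded

def sink2(b_direct: int, coded: int) -> tuple:
    """Sink 2 sees b directly; XOR with the coded packet recovers a."""
    return b_direct ^ coded, b_direct

a, b = 1, 0
c = relay(a, b)
assert sink1(a, c) == (a, b) and sink2(b, c) == (a, b)
print("both sinks decode (a, b) =", (a, b))
```

    Plain routing would need two uses of the bottleneck link to serve both sinks; coding achieves the multicast max-flow of 2 bits per network use.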

  11. Writing the Live Coding Book

    DEFF Research Database (Denmark)

    Blackwell, Alan; Cox, Geoff; Lee, Sang Wong

    2016-01-01

    This paper is a speculation on the relationship between coding and writing, and the ways in which technical innovations and capabilities enable us to rethink each in terms of the other. As a case study, we draw on recent experiences of preparing a book on live coding, which integrates a wide range of personal, historical, technical and critical perspectives. This book project has been both experimental and reflective, in a manner that allows us to draw on critical understanding of both code and writing, and point to the potential for new practices in the future.

  12. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction by representing messages by subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an...

  13. i-Review: Sharing Code

    Directory of Open Access Journals (Sweden)

    Jonas Kubilius

    2014-02-01

    Full Text Available Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks a focus on researchers. In comparison, OSF offers a one-stop solution for researchers, but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  14. Neural Decoder for Topological Codes

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.

  15. How crucial is it to account for the antecedent moisture conditions in flood forecasting? Comparison of event-based and continuous approaches on 178 catchments

    Directory of Open Access Journals (Sweden)

    L. Berthet

    2009-06-01

    Full Text Available This paper compares event-based and continuous hydrological modelling approaches for real-time forecasting of river flows. Both approaches are compared using a lumped hydrologic model (whose structure includes a soil moisture accounting (SMA) store and a routing store) on a data set of 178 French catchments. The main focus of this study was to investigate the actual impact of soil moisture initial conditions on the performance of flood forecasting models and the possible compensations with updating techniques. The rainfall-runoff model assimilation technique we used does not impact the SMA component of the model but only its routing part. Tests were made by running the SMA store continuously or on an event basis, everything else being equal. The results show that the continuous approach remains the reference to ensure good forecasting performances. We show, however, that the possibility to assimilate the last observed flow considerably reduces the differences in performance. Last, we present a robust alternative to initialize the SMA store where continuous approaches are impossible because of data availability problems.

  16. Working Memory Load and Reminder Effect on Event-Based Prospective Memory of High- and Low-Achieving Students in Math.

    Science.gov (United States)

    Chen, Youzhen; Lian, Rong; Yang, Lixian; Liu, Jianrong; Meng, Yingfang

    The effects of working memory (WM) demand and reminders on an event-based prospective memory (PM) task were compared between students with low and high achievement in math. WM load (1- and 2-back tasks) was manipulated as a within-subject factor and reminder (with or without reminder) as a between-subject factor. Results showed that high-achieving students outperformed low-achieving students on all PM and n-back tasks. Use of a reminder improved PM performance and thus reduced prospective interference; the performance of ongoing tasks also improved for all students. Both PM and n-back performances in low WM load were better than in high WM load. High WM load had more influence on low-achieving students than on high-achieving students. Results suggest that low-achieving students in math were weak at PM and influenced more by high WM load. Thus, it is important to train these students to set up an obvious reminder for their PM and improve their WM.

  17. Detection of prospective memory deficits in mild cognitive impairment of suspected Alzheimer's disease etiology using a novel event-based prospective memory task.

    LENUS (Irish Health Repository)

    Blanco-Campal, Alberto

    2009-01-01

    We investigated the relative discriminatory efficacy of an event-based prospective memory (PM) task, in which specificity of the instructions and perceptual salience of the PM cue were manipulated, compared with two widely used retrospective memory (RM) tests (Rivermead Paragraph Recall Test and CERAD-Word List Test), when detecting mild cognitive impairment of suspected Alzheimer's disease etiology (MCI-AD) (N = 19) from normal controls (NC) (N = 21). Statistical analyses showed high discriminatory capacity of the PM task for detecting MCI-AD. The Non-Specific-Non-Salient condition proved particularly useful in detecting MCI-AD, possibly reflecting the difficulty of the task, requiring more strategic attentional resources to monitor for the PM cue. With a cutoff score of <4/10, the Non-Specific-Non-Salient condition achieved a sensitivity = 84%, and a specificity = 95%, superior to the most discriminative RM test used (CERAD-Total Learning: sensitivity = 83%; specificity = 76%). Results suggest that PM is an early sign of memory failure in MCI-AD and may be a more pronounced deficit than retrospective failure, probably reflecting the greater self-initiated retrieval demands involved in the PM task used. Limitations include the relatively small sample size, and the use of a convenience sample (i.e. memory clinic attenders and healthy active volunteers), reducing the generalizability of the results, which should be regarded as preliminary. (JINS, 2009, 15, 154-159.).

  18. Electronic Code of Federal Regulations

    Data.gov (United States)

    National Archives and Records Administration — The Electronic Code of Federal Regulations (e-CFR) is the codification of the general and permanent rules published in the Federal Register by the executive...

  19. The Serializability of Network Codes

    CERN Document Server

    Blasiak, Anna

    2010-01-01

    Network coding theory studies the transmission of information in networks whose vertices may perform nontrivial encoding and decoding operations on data as it passes through the network. The main approach to deciding the feasibility of network coding problems aims to reduce the problem to optimization over a polytope of entropic vectors subject to constraints imposed by the network structure. In the case of directed acyclic graphs, these constraints are completely understood, but for general graphs the problem of enumerating them remains open: it is not known how to classify the constraints implied by a property that we call serializability, which refers to the absence of paradoxical circular dependencies in a network code. In this work we initiate the first systematic study of the constraints imposed on a network code by serializability. We find that serializability cannot be detected solely by evaluating the Shannon entropy of edge sets in the graph, but nevertheless, we give a polynomial-time algorithm tha...

  20. Tree Coding of Bilevel Images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1998-01-01

    probabilities to an arithmetic coder. The conditional probabilities are estimated from co-occurrence statistics of past pixels, the statistics are stored in a tree. By organizing the code length calculations properly, a vast number of possible models (trees) reflecting different pixel orderings can...... be investigated within reasonable time prior to generating the code. A number of general-purpose coders are constructed according to this principle. Rissanen's (1989) one-pass algorithm, context, is presented in two modified versions. The baseline is proven to be a universal coder. The faster version, which...... is one order of magnitude slower than JBIG, obtains excellent and highly robust compression performance. A multipass free tree coding scheme produces superior compression results for all test images. A multipass free template coding scheme produces significantly better results than JBIG for difficult...

  1. FLYCHK Collisional-Radiative Code

    Science.gov (United States)

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  2. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  3. The Aesthetics of Code Switching

    National Research Council Canada - National Science Library

    Ali Mohammadi Asiabadi

    2010-01-01

    ... explained. However, the aesthetic aspects of these figures can be shown through several theories that discuss code switching considering the fact that some of the theories are commonly used in literature and literary criticism...

  4. Interlibrary Loan Codes and Guidelines.

    Science.gov (United States)

    RQ, 1980

    1980-01-01

    Presents a model interlibrary loan policy for regional, state, local, and other special groups of libraries; the 1980 national interlibrary loan code; and the 1978 procedural guidelines for international lending. (FM)

  5. Zip Codes - MDC_WCSZipcode

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — The WCSZipcode polygon feature class was created by Miami-Dade Enterprise Technology Department to be used in the WCS batch jobs to assign the actual zip code of...

  6. The Aesthetics of Code Switching

    National Research Council Canada - National Science Library

    Ali Mohammadi Asiabadi

    2010-01-01

    .... However, the aesthetic aspects of these figures can be shown through several theories that discuss code switching considering the fact that some of the theories are commonly used in literature and literary criticism...

  7. Adaptive decoding of convolutional codes

    Directory of Open Access Journals (Sweden)

    K. Hueske

    2007-06-01

    Full Text Available Convolutional codes, which are frequently used as error correction codes in digital transmission systems, are generally decoded using the Viterbi Decoder. On the one hand the Viterbi Decoder is an optimum maximum likelihood decoder, i.e. the most probable transmitted code sequence is obtained. On the other hand the mathematical complexity of the algorithm only depends on the used code, not on the number of transmission errors. To reduce the complexity of the decoding process for good transmission conditions, an alternative syndrome based decoder is presented. The reduction of complexity is realized by two different approaches, the syndrome zero sequence deactivation and the path metric equalization. The two approaches enable an easy adaptation of the decoding complexity for different transmission conditions, which results in a trade-off between decoding complexity and error correction performance.
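For contrast with the syndrome-based decoder described above, the baseline Viterbi decoder can be sketched compactly. The specific code (the classic rate-1/2, constraint-length-3 "(7,5)" code) and all names below are illustrative choices, not taken from the paper:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3 (the classic
    (7,5) octal code, used here purely as an illustration)."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out += [bin(state & g1).count("1") & 1,
                bin(state & g2).count("1") & 1]
    return out

def viterbi_decode(rx, g1=0b111, g2=0b101):
    """Hard-decision Viterbi decoding: keep, for each of the four encoder
    states, the path of minimum Hamming distance to the received sequence."""
    metric, paths = {0: 0}, {0: []}
    for t in range(len(rx) // 2):
        r = rx[2 * t:2 * t + 2]
        new_m, new_p = {}, {}
        for s, m in metric.items():
            for b in (0, 1):
                full = ((s << 1) | b) & 0b111
                exp = [bin(full & g1).count("1") & 1,
                       bin(full & g2).count("1") & 1]
                cost = m + (exp[0] != r[0]) + (exp[1] != r[1])
                ns = full & 0b11          # next state: last two input bits
                if cost < new_m.get(ns, float("inf")):
                    new_m[ns], new_p[ns] = cost, paths[s] + [b]
        metric, paths = new_m, new_p
    return paths[min(metric, key=metric.get)]

bits = [1, 0, 1, 1, 0, 0, 1]
rx = conv_encode(bits)
rx[2] ^= 1                                # inject a single channel error
assert viterbi_decode(rx) == bits         # the error is corrected
```

Note that the trellis work above is the same regardless of how many errors occurred, which is exactly the fixed-complexity property the paper's syndrome-based alternative seeks to relax under good channel conditions.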

  8. Allegheny County Zip Code Boundaries

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset demarcates the zip code boundaries that lie within Allegheny County. These are not clipped to the Allegheny County boundary. If viewing this...

  9. Universal codes of the natural numbers

    OpenAIRE

    Filmus, Yuval

    2013-01-01

    A code of the natural numbers is a uniquely-decodable binary code of the natural numbers with non-decreasing codeword lengths, which satisfies Kraft's inequality tightly. We define a natural partial order on the set of codes, and show how to construct effectively a code better than a given sequence of codes, in a certain precise sense. As an application, we prove that the existence of a scale of codes (a well-ordered set of codes which contains a code better than any given code) is independen...
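A concrete instance of such a code is the Elias gamma code, whose codeword lengths are non-decreasing and whose Kraft sum tends to 1 over all naturals. This particular code is an illustrative choice on my part, not one singled out by the paper:

```python
def elias_gamma(n):
    """Elias gamma codeword for n >= 1: floor(log2 n) zeros followed by the
    binary expansion of n. A classical universal code of the naturals."""
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b

lengths = [len(elias_gamma(n)) for n in range(1, 1 << 10)]
# codeword lengths are non-decreasing ...
assert all(a <= b for a, b in zip(lengths, lengths[1:]))
# ... and the partial Kraft sum never exceeds 1 (it tends to 1 in the limit)
assert sum(2.0 ** -l for l in lengths) <= 1.0
```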

  10. Beam-dynamics codes used at DARHT

    Energy Technology Data Exchange (ETDEWEB)

    Ekdahl, Jr., Carl August [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    Several beam simulation codes are used to help gain a better understanding of beam dynamics in the DARHT LIAs. The most notable of these fall into the following categories: for beam production, the Tricomp Trak orbit tracking code and the LSP particle-in-cell (PIC) code; for beam transport and acceleration, the XTR static envelope and centroid code, the LAMDA time-resolved envelope and centroid code, and the LSP-Slice PIC code; for coasting-beam transport to target, the LAMDA time-resolved envelope code and the LSP-Slice PIC code. These codes are also being used to inform the design of Scorpius.

  11. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  12. UNIX code management and distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for the UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process.

  13. Verification of ONED90 code

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Ki Bog; Zee, Sung Kyun; Lee, Chang Ho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    ONED90 developed by KAERI is a 1-dimensional 2-group diffusion theory code. For nuclear design and reactor simulation, the usage of ONED90 encompasses core follow calculation, load follow calculation, plant power control simulation, xenon oscillation simulation and control rod maneuvering, etc. In order to verify the validity of the ONED90 code, two well-known benchmark problems were solved; ONED90 shows very similar results to the reference solutions. (Author) 11 refs., 5 figs., 13 tabs.

  14. Training course on code implementation.

    Science.gov (United States)

    Allain, A; De Arango, R

    1992-01-01

    The International Baby Food Action Network (IBFAN) is a coalition of over 40 citizen groups in 70 countries. IBFAN monitors the progress worldwide of the implementation of the International Code of Marketing of Breastmilk Substitutes. The Code is intended to regulate the advertising and promotional techniques used to sell infant formula. The 1991 IBFAN report shows that 75 countries have taken some action to implement the International Code. During 1992, the IBFAN Code Documentation Center in Malaysia conducted 2 training courses to help countries draft legislation to implement and monitor compliance with the International Code. In April, government officials from 19 Asian and African countries attended the first course in Malaysia; the second course was conducted in Spanish in Guatemala and attended by officials from 15 Latin American and Caribbean countries. The resource people included representatives from NGOs in Africa, Asia, Latin America, Europe and North America with experience in Code implementation and monitoring at the national level. The main purpose of each course was to train government officials to use the International Code as a starting point for national legislation to protect breastfeeding. Participants reviewed recent information on lactation management, the advantages of breastfeeding, current trends in breastfeeding and the marketing practices of infant formula manufacturers. The participants studied the terminology contained in the International Code and terminology used by infant formula manufacturers to include breastmilk supplements such as follow-on formulas and cereal-based baby foods. Relevant World Health Assembly resolutions such as the one adopted in 1986 on the need to ban free and low-cost supplies to hospitals were examined. The legal aspects of the current Baby Friendly Hospital Initiative (BFHI) and the progress in the 12 BFHI test countries concerning the elimination of supplies were also examined. International Labor

  15. Continuous speech recognition with sparse coding

    CSIR Research Space (South Africa)

    Smit, WJ

    2009-04-01

    Full Text Available Sparse coding is an efficient way of coding information. In a sparse code most of the code elements are zero; very few are active. Sparse codes are intended to correspond to the spike trains with which biological neurons communicate. In this article...

  16. Facilitating Internet-Scale Code Retrieval

    Science.gov (United States)

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  17. Error-erasure decoding of product codes.

    Science.gov (United States)

    Wainberg, S.

    1972-01-01

    Two error-erasure decoding algorithms for product codes that correct all the error-erasure patterns guaranteed correctable by the minimum Hamming distance of the product code are given. The first algorithm works when at least one of the component codes is majority-logic decodable. The second algorithm works for any product code. Both algorithms use the decoders of the component codes.
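The flavor of erasure decoding can be seen on the simplest product code, the product of two single-parity-check codes, where an erasure is filled whenever its row or column contains no other erasure. This is a toy stand-in for the general algorithms in the paper, not a reproduction of them:

```python
import numpy as np

def decode_erasures(word):
    """Iterative erasure filling for the product of two single-parity-check
    codes. `word` is a 2-D int array with -1 marking erasures; in a valid
    codeword every row and every column has even parity."""
    w = word.copy()
    changed = True
    while changed:
        changed = False
        for m in (w, w.T):                 # rows, then columns (views share data)
            for i in range(m.shape[0]):
                erased = np.flatnonzero(m[i] == -1)
                if len(erased) == 1:       # a single unknown is pinned by parity
                    m[i, erased[0]] = m[i][m[i] >= 0].sum() % 2
                    changed = True
    return w

codeword = np.array([[1, 0, 1],
                     [1, 1, 0],
                     [0, 1, 1]])           # all rows and columns have even parity
received = codeword.copy()
received[0, 0] = received[1, 1] = -1       # two erasures
assert np.array_equal(decode_erasures(received), codeword)
```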

  18. Sensorimotor transformation via sparse coding

    Science.gov (United States)

    Takiyama, Ken

    2015-01-01

    Sensorimotor transformation is indispensable to the accurate motion of the human body in daily life. For instance, when we grasp an object, the distance from our hands to an object needs to be calculated by integrating multisensory inputs, and our motor system needs to appropriately activate the arm and hand muscles to minimize the distance. The sensorimotor transformation is implemented in our neural systems, and recent advances in measurement techniques have revealed an important property of neural systems: a small percentage of neurons exhibits extensive activity while a large percentage shows little activity, i.e., sparse coding. However, we do not yet know the functional role of sparse coding in sensorimotor transformation. In this paper, I show that sparse coding enables complete and robust learning in sensorimotor transformation. In general, if a neural network is trained to maximize the performance on training data, the network shows poor performance on test data. Nevertheless, sparse coding renders compatible the performance of the network on both training and test data. Furthermore, sparse coding can reproduce reported neural activities. Thus, I conclude that sparse coding is necessary and a biologically plausible factor in sensorimotor transformation. PMID:25923980
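Generic sparse coding of the kind invoked above can be sketched with the iterative soft-thresholding algorithm (ISTA). The dictionary, penalty, and dimensions below are arbitrary illustrative choices, not the paper's neural-network model:

```python
import numpy as np

def ista(D, x, lam=0.05, n_iter=200):
    """Sparse coding by iterative soft-thresholding (ISTA): approximately
    minimize 0.5 * ||x - D @ a||^2 + lam * ||a||_1 over the code a."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1 / Lipschitz constant
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = a - step * (D.T @ (D @ a - x))   # gradient step on the data term
        a = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # shrink
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
a_true = np.zeros(50)
a_true[[3, 17]] = [1.5, -2.0]              # only two active atoms
a = ista(D, D @ a_true)
# most coefficients end up exactly zero: a sparse code
assert np.mean(np.abs(a) > 1e-3) < 0.5
```

The soft-thresholding step is what produces the "few active, many silent" coefficient pattern that the abstract identifies with sparse neural activity.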

  19. Continuous Non-malleable Codes

    DEFF Research Database (Denmark)

    Faust, Sebastian; Mukherjee, Pratyay; Nielsen, Jesper Buus

    2014-01-01

    Non-malleable codes are a natural relaxation of error correcting/ detecting codes that have useful applications in the context of tamper resilient cryptography. Informally, a code is non-malleable if an adversary trying to tamper with an encoding of a given message can only leave it unchanged......-malleable codes where the adversary only is allowed to tamper a single time with an encoding. We show how to construct continuous non-malleable codes in the common split-state model where an encoding consist of two parts and the tampering can be arbitrary but has to be independent with both parts. Our main...... contributions are outlined below: We propose a new uniqueness requirement of split-state codes which states that it is computationally hard to find two codewords X = (X 0,X 1) and X′ = (X 0,X 1′) such that both codewords are valid, but X 0 is the same in both X and X′. A simple attack shows that uniqueness...

  20. The cosmic code comparison project

    Energy Technology Data Exchange (ETDEWEB)

    Heitmann, Katrin; Fasel, Patricia; Habib, Salman; Warren, Michael S; Ahrens, James; Ankeny, Lee; O' Shea, Brian [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Lukic, Zarija; Ricker, Paul M [Department of Astronomy, University of Illinois, Urbana, IL 61801 (United States); White, Martin [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Armstrong, Ryan [Department of Computer Science, UC Davis, Davis, CA 95616 (United States); Springel, Volker [Max-Planck-Institute for Astrophysics, 85741 Garching (Germany); Stadel, Joachim [Institute of Theoretical Physics, University of Zurich, 8057 Zurich (Switzerland); Trac, Hy [Department of Astrophysical Sciences, Princeton University, NJ 08544 (United States)], E-mail: heitmann@lanl.gov

    2008-10-01

    Current and upcoming cosmological observations allow us to probe structures on smaller and smaller scales, entering highly nonlinear regimes. In order to obtain theoretical predictions in these regimes, large cosmological simulations have to be carried out. The promised high accuracy from observations makes the simulation task very demanding: the simulations have to be at least as accurate as the observations. This requirement can only be fulfilled by carrying out an extensive code verification program. The first step of such a program is the comparison of different cosmology codes including gravitational interactions only. In this paper, we extend a recently carried out code comparison project to include five more simulation codes. We restrict our analysis to a small cosmological volume which allows us to investigate properties of halos. For the matter power spectrum and the mass function, the previous results hold, with the codes agreeing at the 10% level over wide dynamic ranges. We extend our analysis to the comparison of halo profiles and investigate the halo count as a function of local density. We introduce and discuss ParaView as a flexible analysis tool for cosmological simulations, the use of which immensely simplifies the code comparison task.

  1. Codes That Support Smart Growth Development

    Science.gov (United States)

    Provides examples of local zoning codes that support smart growth development, categorized by: unified development code, form-based code, transit-oriented development, design guidelines, street design standards, and zoning overlay.

  2. Convolutional Goppa codes defined on fibrations

    CERN Document Server

    Curto, J I Iglesias; Martín, F J Plaza; Sotelo, G Serrano

    2010-01-01

    We define a new class of Convolutional Codes in terms of fibrations of algebraic varieties, generalizing our previous constructions of Convolutional Goppa Codes. Using this general construction we can give several examples of Maximum Distance Separable (MDS) Convolutional Codes.

  3. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... National Institute of Standards and Technology International Code Council: The Update Process for the International Codes and Standards AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The International Code Council (ICC), promulgator of the International Codes and...

  4. The Influence of the Annual Number of Storms on the Derivation of the Flood Frequency Curve through Event-Based Simulation

    Directory of Open Access Journals (Sweden)

    Alvaro Sordo-Ward

    2016-08-01

    Full Text Available This study addresses the question of how to select the minimum set of storms that should be simulated each year in order to estimate an accurate flood frequency curve for return periods ranging between 1 and 1000 years. The Manzanares basin (Spain) was used as a study case. A continuous 100,000-year hourly rainfall series was generated using the stochastic spatial–temporal model RanSimV3. Individual storms were extracted from the series by applying the exponential method. For each year, the extracted storms were transformed into hydrographs by applying an hourly time-step semi-distributed event-based rainfall–runoff model, and the maximum peak flow per year was determined to generate the reference flood frequency curve. Then, different flood frequency curves were obtained considering the N storms with maximum rainfall depth per year, with 1 ≤ N ≤ total number of storms. Main results show that: (a) the degree of alignment between the calculated flood frequency curves and the reference flood frequency curve depends on the return period considered, increasing the accuracy for higher return periods; (b) for the analyzed case studies, the flood frequency curve for medium and high return periods (50 ≤ return period ≤ 1000 years) can be estimated with a difference lower than 3% (compared to the reference flood frequency curve) by considering the three storms with the maximum total rainfall depth each year; (c) when considering only the greatest storm of the year, for return periods higher than 10 years, the difference for the estimation of the flood frequency curve is lower than 10%; and (d) when considering the three greatest storms each year, for return periods higher than 100 years, the probability of achieving simultaneously a hydrograph with the annual maximum peak flow and the maximum volume is 94%.

  5. Evaluation of Code Blue Implementation Outcomes

    OpenAIRE

    Bengü Özütürk; Nalan Muhammedoğlu; Emel Dal; Berna Çalışkan

    2015-01-01

    Aim: In this study, we aimed to emphasize the importance of Code Blue implementation and to determine deficiencies in this regard. Methods: After obtaining ethics committee approval, data from 225 patients' Code Blue calls between 2012 and January 2014 were retrospectively analyzed. Age and gender of the patients, date and time of the call and the clinics giving Code Blue, the time needed for the Code Blue team to arrive, the rates of false Code Blue calls, reasons for Code...

  6. Quasi-cyclic unit memory convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Paaske, Erik; Ballan, Mark

    1990-01-01

    Unit memory convolutional codes with generator matrices, which are composed of circulant submatrices, are introduced. This structure facilitates the analysis of efficient search for good codes. Equivalences among such codes and some of the basic structural properties are discussed. In particular......, catastrophic encoders and minimal encoders are characterized and dual codes treated. Further, various distance measures are discussed, and a number of good codes, some of which result from efficient computer search and some of which result from known block codes, are presented...
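The circulant structure described above can be illustrated by assembling a small quasi-cyclic generator matrix from circulant blocks and checking the defining shift property. The particular first rows are arbitrary illustrative choices, not codes from the paper:

```python
import numpy as np

def circulant(first_row):
    """Binary circulant matrix whose k-th row is the first row shifted by k."""
    return np.array([np.roll(first_row, k) for k in range(len(first_row))])

# Generator matrix composed of two circulant submatrices.
A = circulant([1, 1, 0, 1, 0])
B = circulant([1, 0, 1, 0, 0])
G = np.hstack([A, B])                       # a (5, 10) quasi-cyclic generator

msg = np.array([1, 0, 1, 1, 0])
cw = msg @ G % 2

# Quasi-cyclic property: shifting each length-5 block of the codeword by one
# position yields the codeword of the cyclically shifted message.
shifted = np.concatenate([np.roll(cw[:5], 1), np.roll(cw[5:], 1)])
assert np.array_equal(shifted, np.roll(msg, 1) @ G % 2)
```

This block symmetry is what makes the search for good codes in the paper tractable: one circulant first row per block determines an entire submatrix.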

  7. On Code Parameters and Coding Vector Representation for Practical RLNC

    DEFF Research Database (Denmark)

    Heide, Janus; Pedersen, Morten Videbæk; Fitzek, Frank

    2011-01-01

    RLNC provides a theoretically efficient method for coding. The drawbacks associated with it are the complexity of the decoding and the overhead resulting from the encoding vector. Increasing the field size and generation size presents a fundamental trade-off between packet-based throughput...... to higher energy consumption. Therefore, the optimal trade-off is system and topology dependent, as it depends on the cost in energy of performing coding operations versus transmitting data. We show that moderate field sizes are the correct choice when trade-offs are considered. The results show that sparse...

  8. A Connection between Network Coding and Convolutional Codes

    OpenAIRE

    Fragouli, C.; Soljanin, E.

    2004-01-01

    The min-cut, max-flow theorem states that a source node can send a commodity through a network to a sink node at the rate determined by the flow of the min-cut separating the source and the sink. Recently it has been shown that by linear re-encoding at nodes in communications networks, the min-cut rate can also be achieved in multicasting to several sinks. In this paper we discuss connections between such coding schemes and convolutional codes. We propose a method to simplify the convolutional...

  9. Combinatorial polarization, code loops, and codes of high level

    Directory of Open Access Journals (Sweden)

    Petr Vojtěchovský

    2004-07-01

    Full Text Available We first find the combinatorial degree of any map f:V→F, where F is a finite field and V is a finite-dimensional vector space over F. We then simplify and generalize a certain construction, due to Chein and Goodaire, that was used in characterizing code loops as finite Moufang loops that possess at most two squares. The construction yields binary codes of high divisibility level with prescribed Hamming weights of intersections of codewords.

  10. Ecoacoustic codes and ecological complexity.

    Science.gov (United States)

    Farina, Almo

    2018-02-01

    A multi-layer communication and sensing network assures the exchange of relevant information between animals and their umwelten, imparting complexity to ecological systems. Individual soniferous species, the acoustic community, and the soundscape are the three main operational levels that comprise this multi-layer network. Acoustic adaptation and acoustic niche are two further important mechanisms that regulate the acoustic performances at the first level, while the acoustic community model explains the complexity of the interspecific acoustic network at the second level. Acoustic habitat and ecoacoustic events are two of the most relevant mechanisms that operate at the third level. The exchange of ecoacoustic information on each of these levels is assured by ecoacoustic codes. At the level of individual soniferous species, a dyadic intraspecific exchange of information is established between an emitter and a receiver. Ecoacoustic codes discriminate, identify, and label specific signals that pertain to the theme, variation, motif repetition, and intensity of signals. At the acoustic community level, a voluntary or involuntary communication is established between networks of interspecific emitters and receivers. Ecoacoustic codes at this level transmit information (e.g., recognition of predators, location of food sources, availability and location of refuges) between one species and the acoustically interacting community and impart cohesion to interspecific assemblages. At the soundscape level, acoustic information is transferred from a mosaic of geophonies, biophonies, and technophonies to different species that discriminate meaningful ecoacoustic events and their temporal dynamics during habitat selection processes. Ecoacoustic codes at this level operate on a limited set of signals from the environmental acoustic dynamic that are heterogeneous in time and space, and these codes are interpreted differently according to the species during habitat selection and the

  11. The SHIELD11 Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W

    2005-02-02

    SHIELD11 is a computer code for performing shielding analyses around a high-energy electron accelerator. It makes use of simple analytic expressions for the production and attenuation of photons and neutrons resulting from electron beams striking thick targets, such as dumps, stoppers, collimators, and other beam devices. The formulae in SHIELD11 are somewhat unpretentious in that they are based on the extrapolation (scaling) of experimental data using rather simple physics ideas. Because these scaling methods have only been tested over a rather limited set of conditions--namely, 1-15 GeV electrons striking 10-20 radiation lengths of iron--a certain amount of care and judgment must be exercised whenever SHIELD11 is used. Nevertheless, for many years these scaling methods have been applied rather successfully to a large variety of problems at SLAC, as well as at other laboratories throughout the world, and the SHIELD11 code has been found to be a fast and convenient tool. In this paper we present, without extensive theoretical justification or experimental verification, the five-component model on which the SHIELD11 code is based. Our intent is to demonstrate how to use the code by means of a few simple examples. References are provided that are considered to be essential for a full understanding of the model. The code itself contains many comments to provide some guidance for the informed user, who may wish to improve on the model.

  12. Index coding via linear programming

    CERN Document Server

    Blasiak, Anna; Lubetzky, Eyal

    2010-01-01

    Index Coding has received considerable attention recently motivated in part by applications such as fast video-on-demand and efficient communication in wireless networks and in part by its connection to Network Coding. The basic setting of Index Coding encodes the side-information relation, the problem input, as an undirected graph and the fundamental parameter is the broadcast rate $\\beta$, the average communication cost per bit for sufficiently long messages (i.e. the non-linear vector capacity). Recent nontrivial bounds on $\\beta$ were derived from the study of other Index Coding capacities (e.g. the scalar capacity $\\beta_1$) by Bar-Yossef et al (FOCS'06), Lubetzky and Stav (FOCS'07) and Alon et al (FOCS'08). However, these indirect bounds shed little light on the behavior of $\\beta$ and its exact value remained unknown for \\emph{any graph} where Index Coding is nontrivial. Our main contribution is a hierarchy of linear programs whose solutions trap $\\beta$ between them. This enables a direct information-...

  13. Using Binary Code Instrumentation in Computer Security

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2013-01-01

    Full Text Available The paper approaches the low-level details of the code generated by compilers whose format permits outside actions. Binary code modifications are made manually when the internal format is known and understood, or automatically by tools developed to process the binary code. The goals of binary code instrumentation vary from increasing security and fixing bugs to developing malicious software. The paper highlights binary code instrumentation techniques based on code injection that increase the security and reliability of a software application. It also offers examples for understanding binary code formats and for how binary code injection may be applied.

  14. Requirements of a Better Secure Program Coding

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2012-01-01

    Full Text Available Secure program coding refers to how the risks arising from security breaches in program source code are managed. The paper reviews the best practices that must be followed during the software development life cycle to assure secure software, the methods and techniques used for secure coding assurance, the best-known and most common vulnerabilities caused by a poor coding process, and how the security risks are managed and mitigated. As a tool for better secure program coding, the code review process is presented, together with objective measures for code review assurance and estimation of the effort required for code improvement.

  15. On the Dimension of Graph Codes with Reed–Solomon Component Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Høholdt, Tom; Pinero, Fernando

    2013-01-01

    We study a class of graph based codes with Reed-Solomon component codes as affine variety codes. We give a formulation of the exact dimension of graph codes in general. We give an algebraic description of these codes which makes the exact computation of the dimension of the graph codes easier....

  16. A genetic scale of reading frame coding.

    Science.gov (United States)

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% in average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% in average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% in average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% in average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with a RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights in the origin and evolution of the genetic code. Copyright © 2014 Elsevier
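
    The comma-free property central to this record can be checked directly from its definition: a trinucleotide code X is comma-free when no concatenation of two codewords contains a codeword of X in a shifted frame. A minimal sketch (the example codes are illustrative, not the code X from the paper):

```python
def is_comma_free(code):
    """A trinucleotide code X is comma-free if, for all x, y in X,
    neither shifted-frame window of the concatenation xy
    (positions 1-3 and 2-4) belongs to X."""
    X = set(code)
    for x in X:
        for y in X:
            w = x + y
            if w[1:4] in X or w[2:5] in X:
                return False
    return True

print(is_comma_free({"AAT", "GGC"}))  # no shifted window is a codeword
print(is_comma_free({"AAA"}))         # AAAAAA reads AAA in shifted frames
```

    Comma-free codes are exactly those with RFC probability 1: any window of a message reveals the reading frame, which is why the 15 amino acids coded by comma-free circular codes sit at the top of the genetic scale described above.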

  17. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gascooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  18. FLOWTRAN-TF code description

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P. (ed.)

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  19. FLOWTRAN-TF code description

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P. (ed.)

    1991-09-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  20. Halftone Coding with JBIG2

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    2000-01-01

    The emerging international standard for compression of bi-level images and bi-level documents, JBIG2, provides a mode dedicated to lossy coding of halftones. The encoding procedure involves descreening of the bi-level image into gray-scale, encoding of the gray-scale image, and construction ... of a halftone pattern dictionary. The decoder first decodes the gray-scale image; then, for each gray-scale pixel, it looks up the corresponding halftone pattern in the dictionary and places it in the reconstruction bitmap at the position corresponding to the gray-scale pixel. The coding method is inherently lossy ... and care must be taken to avoid introducing artifacts in the reconstructed image. We describe how to apply this coding method for halftones created by periodic ordered dithering, by clustered dot screening (offset printing), and by techniques which in effect dither with blue noise, e.g., error diffusion ...
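
    The descreen-then-lookup round trip described in this record can be sketched with a toy 2x2 pattern dictionary (the patterns, cell size, and function names below are illustrative assumptions, not the JBIG2 dictionary format):

```python
# Toy dictionary: pattern g is a 2x2 bi-level cell with g black pixels,
# indexed by gray level g = 0..4 (illustrative, not JBIG2's actual format).
PATTERNS = [
    [[0, 0], [0, 0]],
    [[1, 0], [0, 0]],
    [[1, 0], [0, 1]],
    [[1, 1], [0, 1]],
    [[1, 1], [1, 1]],
]

def descreen(image):
    """Reduce each 2x2 halftone cell to a gray level (black-pixel count)."""
    h, w = len(image), len(image[0])
    return [[sum(image[y + dy][x + dx] for dy in (0, 1) for dx in (0, 1))
             for x in range(0, w, 2)] for y in range(0, h, 2)]

def reconstruct(gray):
    """Replace each gray-scale pixel by its dictionary halftone pattern."""
    out = [[0] * (2 * len(gray[0])) for _ in range(2 * len(gray))]
    for gy, row in enumerate(gray):
        for gx, g in enumerate(row):
            for dy in (0, 1):
                for dx in (0, 1):
                    out[2 * gy + dy][2 * gx + dx] = PATTERNS[g][dy][dx]
    return out

# A halftone built from dictionary patterns survives the round trip exactly;
# real halftones generally do not, which is why the method is lossy.
gray = [[0, 2], [4, 1]]
assert descreen(reconstruct(gray)) == gray
```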

  1. MAGNETOHYDRODYNAMIC EQUATIONS (MHD GENERATION CODE

    Directory of Open Access Journals (Sweden)

    Francisco Frutos Alfaro

    2017-04-01

    Full Text Available A program that generates Fortran and C code for the full magnetohydrodynamic equations is presented. The program uses the free computer algebra system REDUCE. This software has a package called EXCALC, which is an exterior calculus program. The advantage of this program is that it can be modified to include another complex metric or spacetime. The output of this program is modified by means of a LINUX script, which creates a new REDUCE program to manipulate the magnetohydrodynamic equations and obtain a code that can be used as a seed for a magnetohydrodynamic code for numerical applications. As an example, we present part of the output of our programs for Cartesian coordinates and how to do the discretization.

  2. Ultrasound imaging using coded signals

    DEFF Research Database (Denmark)

    Misaridis, Athanasios

    Modulated (or coded) excitation signals can potentially improve the quality and increase the frame rate in medical ultrasound scanners. The aim of this dissertation is to investigate systematically the applicability of modulated signals in medical ultrasound imaging and to suggest appropriate ... of the excitation signal. Although a gain in signal-to-noise ratio of about 20 dB is theoretically possible for the time-bandwidth product available in ultrasound, it is shown that the effects of transducer weighting and tissue attenuation reduce the maximum gain to 10 dB for robust compression with low sidelobes ... is described. Application of coded excitation in array imaging is evaluated through simulations in Field II. The low degree of orthogonality among coded signals for ultrasound systems is first discussed, and the effect of mismatched filtering on the cross-correlation properties of the signals is evaluated ...

  3. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gascooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  4. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2010-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gascooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  5. Benchmarking of calculated projectile fragmentation cross-sections using the 3-D, MC codes PHITS, FLUKA, HETC-HEDS, MCNPX_HI, and NUCFRG2

    Science.gov (United States)

    Sihver, L.; Mancusi, D.; Niita, K.; Sato, T.; Townsend, L.; Farmer, C.; Pinsky, L.; Ferrari, A.; Cerutti, F.; Gomes, I.

    Particles and heavy ions are used in various fields of nuclear physics, medical physics, and material science, and their interactions with different media, including human tissue and critical organs, have therefore been carefully investigated both experimentally and theoretically since the 1930s. However, heavy-ion transport includes many complex processes, and measurements for all possible systems, including critical organs, would be impractical or too expensive; e.g. direct measurements of dose equivalents to critical organs in humans cannot be performed. A reliable and accurate particle and heavy-ion transport code is therefore an essential tool in the design study of accelerator facilities as well as for various other applications. Recently, new applications have also arisen within transmutation and reactor science, space and medicine, especially radiotherapy, and several accelerator facilities are operating or planned for construction. Accurate knowledge of the physics of interaction of particles and heavy ions is also necessary for estimating radiation damage to equipment used on space vehicles, for calculating the transport of heavy ions in the galactic cosmic rays (GCR) through the interstellar medium, and for the evolution of the heavier elements after the Big Bang. Concerns about the biological effect of space radiation and space dosimetry are increasing rapidly due to the perspective of long-duration astronaut missions, both in relation to the International Space Station and to manned interplanetary missions in the near future. Radiation protection studies for crews of international flights at high altitude have also received considerable attention in recent years. There is therefore a need to develop accurate and reliable particle and heavy-ion transport codes. To be able to calculate complex geometries, including production and transport of protons, neutrons, and alpha particles, 3-dimensional transport using the Monte Carlo (MC) technique must be used. Today

  6. language choice, code-switching and code-mixing in biase

    African Journals Online (AJOL)

    Ada

    switching and code-mixing in a multi-lingual Biase Local Government Area in Cross River State, Nigeria. It looks at the different languages spoken in Biase - from the local languages which serve as mother tongues (MT/L1) to other languages in use in ...

  7. Stakeholders' Opinions on the use of Code Switching/ Code Mixing ...

    African Journals Online (AJOL)

    This paper focuses on the opinions of stakeholders on the use of code-switching for teaching and learning in Tanzania secondary schools although examinations are set in English. English-Kiswahili code switching is employed intensively in the classrooms by both teachers and learners, as a coping strategy to attain ...

  8. Signal Constellations for Multilevel Coded Modulation with Sparse Graph Codes

    NARCIS (Netherlands)

    Cronie, H.S.

    2005-01-01

    A method to combine error-correction coding and spectral efficient modulation for transmission over channels with Gaussian noise is presented. The method of modulation leads to a signal constellation in which the constellation symbols have a nonuniform distribution. This gives a so-called shape gain

  9. Opportunistic Adaptive Transmission for Network Coding Using Nonbinary LDPC Codes

    Directory of Open Access Journals (Sweden)

    Cocco Giuseppe

    2010-01-01

    Full Text Available Network coding makes it possible to exploit the spatial diversity naturally present in mobile wireless networks and can be seen as an example of cooperative communication at the link layer and above. Such a promising technique needs to rely on a suitable physical layer in order to achieve its best performance. In this paper, we present an opportunistic packet scheduling method based on physical layer considerations. We extend the channel adaptation proposed for the broadcast phase of asymmetric two-way bidirectional relaying to a generic number of sinks and apply it in a network context. The method consists of adapting the information rate for each receiving node according to its channel status and independently of the other nodes. In this way, a higher network throughput can be achieved at the expense of slightly higher complexity at the transmitter. This configuration allows rate adaptation to be performed while fully preserving the benefits of channel and network coding. We carry out an information-theoretical analysis of this approach and of that typically used in network coding. Numerical results based on nonbinary LDPC codes confirm the effectiveness of our approach with respect to previously proposed opportunistic scheduling techniques.

  10. Some partial-unit-memory convolutional codes

    Science.gov (United States)

    Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.

    1991-01-01

    The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.

  11. Validation issues for SSI codes

    Energy Technology Data Exchange (ETDEWEB)

    Philippacopoulos, A.J.

    1995-02-01

    The paper describes the results of a recent work which was performed to verify computer code predictions in the SSI area. The first part of the paper is concerned with analytic solutions of the system response. The mathematical derivations are reasonably reduced by the use of relatively simple models which capture fundamental ingredients of the physics of the system motion while allowing for the response to be obtained analytically. Having established explicit forms of the system response, numerical solutions from three computer codes are presented in comparative format.

  12. List Decoding of Matrix-Product Codes from nested codes: an application to Quasi-Cyclic codes

    DEFF Research Database (Denmark)

    Hernando, Fernando; Høholdt, Tom; Ruano, Diego

    2012-01-01

    A list decoding algorithm for matrix-product codes is provided when $C_1,..., C_s$ are nested linear codes and $A$ is a non-singular by columns matrix. We estimate the probability of getting more than one codeword as output when the constituent codes are Reed-Solomon codes. We extend this list...

  13. Code blue: what to do?

    Science.gov (United States)

    Porteous, Joan

    2009-09-01

    Cardiac arrest may occur intraoperatively at any time. The purpose of this article is to help the reader recognize and assist in the management of an intraoperative cardiac arrest. Patients who are at risk for cardiac arrest in the OR are identified, and the different types of pulseless arrhythmias are described. Roles of perioperative personnel are suggested, and documentation during the code is discussed.

  14. Coding as literacy metalithikum IV

    CERN Document Server

    Bühlmann, Vera; Moosavi, Vahid

    2015-01-01

    Recent developments in computer science, particularly "data-driven procedures" have opened a new level of design and engineering. This has also affected architecture. The publication collects contributions on Coding as Literacy by computer scientists, mathematicians, philosophers, cultural theorists, and architects. "Self-Organizing Maps" (SOM) will serve as the concrete reference point for all further discussions.

  15. Coding and English Language Teaching

    Science.gov (United States)

    Stevens, Vance; Verschoor, Jennifer

    2017-01-01

    According to Dudeney, Hockly, and Pegrum (2013) coding is a deeper skill subsumed under the four main digital literacies of language, connections, information, and (re)design. Coders or programmers are people who write the programmes behind everything we see and do on a computer. Most students spend several hours playing online games, but few know…

  16. Smells in software test code

    NARCIS (Netherlands)

    Garousi, Vahid; Küçük, Barış

    2018-01-01

    As a type of anti-pattern, test smells are defined as poorly designed tests and their presence may negatively affect the quality of test suites and production code. Test smells are the subject of active discussions among practitioners and researchers, and various guidelines to handle smells are

  17. Reusable State Machine Code Generator

    Science.gov (United States)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows to automatically create tests for a generated state machine, using techniques from software testing, such as path-coverage.

  18. Code Properties from Holographic Geometries

    Directory of Open Access Journals (Sweden)

    Fernando Pastawski

    2017-05-01

    Full Text Available Almheiri, Dong, and Harlow [J. High Energy Phys. 04 (2015) 163, 10.1007/JHEP04(2015)163] proposed a highly illuminating connection between the AdS/CFT holographic correspondence and operator algebra quantum error correction (OAQEC). Here, we explore this connection further. We derive some general results about OAQEC, as well as results that apply specifically to quantum codes that admit a holographic interpretation. We introduce a new quantity called price, which characterizes the support of a protected logical system, and find constraints on the price and the distance for logical subalgebras of quantum codes. We show that holographic codes defined on bulk manifolds with asymptotically negative curvature exhibit uberholography, meaning that a bulk logical algebra can be supported on a boundary region with a fractal structure. We argue that, for holographic codes defined on bulk manifolds with asymptotically flat or positive curvature, the boundary physics must be highly nonlocal, an observation with potential implications for black holes and for quantum gravity in AdS space at distance scales that are small compared to the AdS curvature radius.

  19. QR Codes: Taking Collections Further

    Science.gov (United States)

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  20. Code Properties from Holographic Geometries

    Science.gov (United States)

    Pastawski, Fernando; Preskill, John

    2017-04-01

    Almheiri, Dong, and Harlow [J. High Energy Phys. 04 (2015) 163., 10.1007/JHEP04(2015)163] proposed a highly illuminating connection between the AdS /CFT holographic correspondence and operator algebra quantum error correction (OAQEC). Here, we explore this connection further. We derive some general results about OAQEC, as well as results that apply specifically to quantum codes that admit a holographic interpretation. We introduce a new quantity called price, which characterizes the support of a protected logical system, and find constraints on the price and the distance for logical subalgebras of quantum codes. We show that holographic codes defined on bulk manifolds with asymptotically negative curvature exhibit uberholography, meaning that a bulk logical algebra can be supported on a boundary region with a fractal structure. We argue that, for holographic codes defined on bulk manifolds with asymptotically flat or positive curvature, the boundary physics must be highly nonlocal, an observation with potential implications for black holes and for quantum gravity in AdS space at distance scales that are small compared to the AdS curvature radius.

  1. Generating Constant Weight Binary Codes

    Science.gov (United States)

    Knight, D.G.

    2008-01-01

    The determination of bounds for A(n, d, w), the maximum possible number of binary vectors of length n, weight w, and pairwise Hamming distance no less than d, is a classic problem in coding theory. Such sets of vectors have many applications. A description is given of how the problem can be used in a first-year undergraduate computational…
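
    A simple greedy lexicographic construction gives a lower bound on A(n, d, w), which is the kind of computational exercise this record has in mind. The sketch below (an illustrative heuristic, not an optimal construction) keeps each weight-w vector, represented by its set of 1-positions, that is at Hamming distance at least d from every vector kept so far:

```python
from itertools import combinations

def greedy_constant_weight(n, d, w):
    """Greedy lexicographic lower bound on A(n, d, w): keep each
    weight-w binary vector (as a set of 1-positions) whose Hamming
    distance to every kept vector is at least d."""
    kept = []
    for support in combinations(range(n), w):
        s = set(support)
        # Distance between two weight-w vectors is 2 * (w - overlap).
        if all(2 * (w - len(s & t)) >= d for t in kept):
            kept.append(s)
    return kept

# A(5, 4, 2) = 2 and A(6, 4, 2) = 3; the greedy bound attains both here.
print(len(greedy_constant_weight(5, 4, 2)))  # 2
print(len(greedy_constant_weight(6, 4, 2)))  # 3
```

    For w = 2 and d = 4 the condition forces disjoint supports, so the answer is simply floor(n/2); for larger parameters the greedy bound can fall short of A(n, d, w), which is what makes the problem interesting.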

  2. Three-dimensional stellarator codes.

    Science.gov (United States)

    Garabedian, P R

    2002-08-06

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory.

  3. Three-dimensional stellarator codes

    Science.gov (United States)

    Garabedian, P. R.

    2002-01-01

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory. PMID:12140367

  4. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability-based code calibration is considered in this paper. It is described how the results of FORM-based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision-theoretical form and it is discussed how ... of reliability-based code calibration of LRFD-based design codes.

  5. The Minimum Distance of Graph Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2011-01-01

    We study codes constructed from graphs where the code symbols are associated with the edges and the symbols connected to a given vertex are restricted to be codewords in a component code. In particular we treat such codes from bipartite expander graphs coming from Euclidean planes and other...

  6. Principled Syntactic Code Completion using Placeholders

    NARCIS (Netherlands)

    De Souza Amorim, L.E.; Erdweg, S.T.; Wachsmuth, G.H.; Visser, Eelco; Varro, D.; Balland, E.; van der Storm, T.

    2016-01-01

    Principled syntactic code completion enables developers to change source code by inserting code templates, thus increasing developer efficiency and supporting language exploration. However, existing code completion systems are ad-hoc and neither complete nor sound. They are not complete and only

  7. TOCAR: a code to interface FOURACES - CARNAVAL

    Energy Technology Data Exchange (ETDEWEB)

    Panini, G.C.; Vaccari, M.

    1981-08-01

    The TOCAR code, written in FORTRAN-IV for IBM-370 computers, is an interface between the output of the FOURACES code and the CARNAVAL binary format for multigroup neutron cross-sections, scattering matrices, and related quantities. Besides a description of the code and instructions on how to use it, the report contains the code listing.

  8. Elevating The Status of Code in Ecology.

    Science.gov (United States)

    Mislan, K A S; Heer, Jeffrey M; White, Ethan P

    2016-01-01

    Code is increasingly central to ecological research but often remains unpublished and insufficiently recognized. Making code available allows analyses to be more easily reproduced and can facilitate research by other scientists. We evaluate journal handling of code, discuss barriers to its publication, and suggest approaches for promoting and archiving code. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  10. On the weight distribution of convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H

    2005-01-01

    Detailed information about the weight distribution of a convolutional code is given by the adjacency matrix of the state diagram associated with a minimal realization of the code. We will show that this matrix is an invariant of the code. Moreover, it will be proven that codes with the same
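
    The adjacency matrix of the state diagram encodes all path weights, so the free distance of a convolutional code is the weight of the lightest nonzero path that leaves the zero state and returns to it. As a small illustration (my own sketch, not from the paper), a shortest-path search over the state diagram of the standard rate-1/2, memory-2 code with octal generators (7, 5) recovers its well-known free distance of 5:

```python
import heapq

def free_distance_7_5():
    """Free distance of the rate-1/2 convolutional code with octal
    generators (7, 5): outputs v1 = u + s1 + s2, v2 = u + s2 (mod 2),
    found as the minimum-weight nonzero path from state 0 back to 0."""
    def step(state, u):
        s1, s2 = state >> 1, state & 1
        v1 = u ^ s1 ^ s2            # generator 7 = 111 (octal)
        v2 = u ^ s2                 # generator 5 = 101 (octal)
        return (u << 1) | s1, v1 + v2

    # Dijkstra from the forced nonzero departure (input u = 1 at state 0).
    start, w0 = step(0, 1)
    heap, best = [(w0, start)], {start: w0}
    while heap:
        weight, state = heapq.heappop(heap)
        if state == 0:
            return weight           # first return to zero state is minimal
        if weight > best.get(state, float("inf")):
            continue
        for u in (0, 1):
            nxt, dw = step(state, u)
            if weight + dw < best.get(nxt, float("inf")):
                best[nxt] = weight + dw
                heapq.heappush(heap, (weight + dw, nxt))

print(free_distance_7_5())  # 5
```

    Replacing the scalar edge weights with monomials x^weight turns the same matrix into the adjacency matrix the abstract refers to, whose transfer function enumerates the full weight distribution.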

  11. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Litsyn algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  12. New convolutional code constructions and a class of asymptotically good time-varying codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1973-01-01

    We show that the generator polynomials of certain cyclic codes define noncatastrophic fixed convolutional codes whose free distances are lower-bounded by the minimum distances of the cyclic codes. This result is used to construct convolutional codes with free distance equal to the constraint length … and to derive convolutional codes with good free distances from the BCH codes. Finally, a class of time-varying codes is constructed for which the free distance increases linearly with the constraint length …
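
    The minimum distance of the underlying cyclic code, which lower-bounds the free distance in this construction, can be brute-forced for a small case. A sketch using g(x) = 1 + x + x^3, the generator of the (7,4) Hamming cyclic code, chosen purely for illustration and not taken from the paper:

    ```python
    from itertools import product

    def poly_mul_mod2(a, b):
        """Multiply two GF(2) polynomials given as coefficient lists."""
        out = [0] * (len(a) + len(b) - 1)
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                out[i + j] ^= ai & bj
        return out

    g = [1, 1, 0, 1]  # g(x) = 1 + x + x^3, generator of the (7,4) cyclic code

    # Every codeword is u(x)g(x) for some message u(x); the minimum distance
    # is the least Hamming weight over the nonzero messages.
    dmin = min(sum(poly_mul_mod2(list(u), g))
               for u in product((0, 1), repeat=4) if any(u))
    print(dmin)  # 3, the minimum distance of the (7,4) Hamming code
    ```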

  13. Code Carnivals: resuscitating Code Blue training with accelerated learning.

    Science.gov (United States)

    Keys, Vicky A; Malone, Peggy; Brim, Carla; Schoonover, Heather; Nordstrom, Cindy; Selzler, Melissa

    2009-12-01

    Nurses in the hospital setting must be knowledgeable about resuscitation procedures and proficient in the delivery of care during an emergency. They must be ready to implement their knowledge and skills at a moment's notice. A common dilemma for many nurses is that cardiopulmonary emergencies (Code Blues) are infrequent occurrences. Therefore, how do nurses remain competent and confident in their implementation of emergency skills while having limited exposure to the equipment and minimal experience in emergency situations? A team of nurse educators at a regional medical center in Washington State applied adult learning theory and accelerated learning techniques to develop and present a series of learning activities to enhance the staff's familiarity with emergency equipment and procedures. The series began with a carnival venue that provided hands-on practice and review of emergency skills and was reinforced with subsequent random unannounced code drills led by both educators and charge nurses. Copyright 2009, SLACK Incorporated.

  14. The chromatin regulatory code: Beyond a histone code

    Science.gov (United States)

    Lesne, A.

    2006-03-01

    In this commentary on the contribution by Arndt Benecke in this issue, I discuss why the notion of “chromatin code” introduced and elaborated in this paper is to be preferred to that of “histone code”. Speaking of a code as regards nucleosome conformation and histone tail post-translational modifications only makes sense within the chromatin fiber, where their physico-chemical features can be translated into regulatory programs at the genome level, by means of a complex, multi-level interplay with the fiber architecture and dynamics settled in the course of Evolution. In particular, this chromatin code presumably exploits allosteric transitions of the chromatin fiber. The chromatin structure dependence of its translation suggests two alternative modes of transcription initiation regulation, also proposed in the paper by A. Benecke in this issue for interpreting strikingly bimodal micro-array data.

  15. Code-excited linear predictive coding of multispectral MR images

    Science.gov (United States)

    Hu, Jian-Hong; Wang, Yao; Cahill, Patrick

    1996-02-01

    This paper reports a multispectral code-excited linear predictive coding method for the compression of well-registered multispectral MR images. Different linear prediction models and adaptation schemes have been compared. The method using a forward adaptive autoregressive (AR) model has proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over non-overlapping square macroblocks. Each macroblock is further divided into several microblocks, and the best excitation signals for each microblock are determined through an analysis-by-synthesis procedure. To satisfy the high quality requirement for medical images, the error between the original images and the synthesized ones is further encoded using a vector quantizer. The MFCELP method has been applied to 26 sets of clinical MR neuro images (20 slices/set, 3 spectral bands/slice, 256 by 256 pixels/image, 12 bits/pixel). It provides a significant improvement over the discrete cosine transform (DCT) based JPEG method, a wavelet transform based embedded zero-tree wavelet (EZW) coding method, as well as the MSARMA method we developed before.
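
    The analysis-by-synthesis selection at the heart of CELP can be shown in one dimension. A toy sketch with an AR(1) predictor and a hand-picked excitation codebook; the function name, coefficient, and codebook values are illustrative, not the MFCELP parameters:

    ```python
    def celp_1d(signal, codebook, a=0.9):
        """Toy CELP coder: predict each sample with an AR(1) model, then
        pick the codebook excitation that minimizes the synthesis error."""
        prev, indices, recon = 0.0, [], []
        for x in signal:
            pred = a * prev
            # analysis-by-synthesis: try every excitation, keep the best
            i = min(range(len(codebook)),
                    key=lambda k: abs(x - (pred + codebook[k])))
            y = pred + codebook[i]
            indices.append(i)
            recon.append(y)
            prev = y  # predict from the reconstruction, as the decoder will
        return indices, recon

    codebook = [-1.0, -0.25, 0.0, 0.25, 1.0]
    indices, recon = celp_1d([1.0, 1.0, 0.5], codebook)
    print(indices)  # the codebook indices are all the decoder needs
    ```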

  16. Genetic coding and gene expression - new Quadruplet genetic coding model

    Science.gov (United States)

    Shankar Singh, Rama

    2012-07-01

    The successful demonstration of the Human Genome Project has opened the door not only for developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may make the 21st century a century of the biological sciences as well. Based on the central dogma of biology, genetic codons in conjunction with tRNA play a key role in translating the RNA bases into a sequence of amino acids leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons involving three bases of RNA, transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new quadruplet genetic coding model and its subsequent potential applications, including relevance to the origin of life, will be presented.

  17. Box codes of lengths 48 and 72

    Science.gov (United States)

    Solomon, G.; Jin, Y.

    1993-01-01

    A self-dual code of length 48, dimension 24, with Hamming distance essentially equal to 12 is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and have a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF(64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose-Chaudhuri-Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15 with even weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than for the first code constructed above. Finally, an (8,4;5) RS code over GF(512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, and all the rest have weights greater than or equal to 16.

  18. Amino acid codes in mitochondria as possible clues to primitive codes

    Science.gov (United States)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.
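
    The CUN reassignment mentioned above (leucine in the universal code, threonine in yeast mitochondria) amounts to swapping lookup tables during translation. A minimal sketch covering only the codons in question:

    ```python
    # Only the CUN family is included here; a full genetic code table has 64 entries.
    universal = {"CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu"}
    yeast_mito = {codon: "Thr" for codon in universal}  # CUN family reassigned

    def translate(rna, table):
        """Translate an RNA string codon by codon using the given code table."""
        return [table[rna[i:i + 3]] for i in range(0, len(rna) - 2, 3)]

    print(translate("CUACUG", universal))   # ['Leu', 'Leu']
    print(translate("CUACUG", yeast_mito))  # ['Thr', 'Thr']
    ```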

  19. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    Science.gov (United States)

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  20. Some Families of Asymmetric Quantum MDS Codes Constructed from Constacyclic Codes

    Science.gov (United States)

    Huang, Yuanyuan; Chen, Jianzhang; Feng, Chunhui; Chen, Riqing

    2017-10-01

    Quantum maximal-distance-separable (MDS) codes that satisfy quantum Singleton bound with different lengths have been constructed by some researchers. In this paper, seven families of asymmetric quantum MDS codes are constructed by using constacyclic codes. We weaken the case of Hermitian-dual containing codes that can be applied to construct asymmetric quantum MDS codes with parameters [[n,k,dz/dx

  1. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... which cycle, go to: http://www.iccsafe.org/cs/codes/Web pages/cycle.aspx. The Code Development Process..., Country Club Hills, Illinois 60478; or download a copy from the ICC Web site noted previously. The... Code. International Property Maintenance Code. International Residential Code. International Swimming...

  2. Predictive coding in Agency Detection

    DEFF Research Database (Denmark)

    Andersen, Marc Malmdorf

    2017-01-01

    Agency detection is a central concept in the cognitive science of religion (CSR). Experimental studies, however, have so far failed to lend support to some of the most common predictions that follow from current theories on agency detection. In this article, I argue that predictive coding, a highly …, unbeknownst to consciousness, engages in sophisticated Bayesian statistics in an effort to constantly predict the hidden causes of sensory input. My fundamental argument is that most false positives in agency detection can be seen as the result of top-down interference in a Bayesian system generating high prior probabilities in the face of unreliable stimuli, and that such a system can better account for the experimental evidence than previous accounts of a dedicated agency detection system. Finally, I argue that adopting predictive coding as a theoretical framework has radical implications …
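
    The claim that high priors combined with unreliable stimuli produce false positives follows directly from Bayes' rule, as a back-of-the-envelope calculation shows (the probabilities below are illustrative, not taken from the article):

    ```python
    def posterior_agent(prior, p_cue_given_agent, p_cue_given_noise):
        """P(agent | cue) by Bayes' rule for a binary agent-vs-noise world."""
        num = prior * p_cue_given_agent
        return num / (num + (1 - prior) * p_cue_given_noise)

    # An unreliable cue: almost as likely under noise as under a real agent.
    low = posterior_agent(0.01, 0.6, 0.5)   # low prior: cue barely moves belief
    high = posterior_agent(0.50, 0.6, 0.5)  # high prior: belief tips to "agent"
    print(round(low, 3), round(high, 3))    # 0.012 0.545
    ```

    With the same weak evidence, only the high-prior observer ends up believing an agent is more likely present than not, which is the top-down interference effect the abstract describes.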

  3. XSTAR Code and Database Status

    Science.gov (United States)

    Kallman, Timothy R.

    2017-08-01

    The XSTAR code is a simulation tool for calculating spectra associated with plasmas which are in a time-steady balance among the microphysical processes. It allows for treatment of plasmas which are exposed to illumination by energetic photons, but also treats processes relevant to collision-dominated plasmas. Processes are treated in a full collisional-radiative formalism which includes convergence to local thermodynamic equilibrium under suitable conditions. It features an interface to the most widely used software for fitting to astrophysical spectra, and has also been compared with laboratory plasma experiments. This poster will describe the recent updates to XSTAR, including atomic data, new features, and some recent applications of the code.

  4. The EGS5 Code System

    Energy Technology Data Exchange (ETDEWEB)

    Hirayama, Hideo; Namito, Yoshihito; /KEK, Tsukuba; Bielajew, Alex F.; Wilderman, Scott J.; U., Michigan; Nelson, Walter R.; /SLAC

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial invocation than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.
With the release of the EGS4 version

  5. Prospective coding in event representation.

    Science.gov (United States)

    Schütz-Bosbach, Simone; Prinz, Wolfgang

    2007-06-01

    A perceived event such as a visual stimulus in the external world and a to-be-produced event such as an intentional action are subserved by event representations. Event representations do not only contain information about present states but also about past and future states. Here we focus on the role of representing future states in event perception and generation (i.e., prospective coding). Relevant theoretical issues and paradigms are discussed. We suggest that the predictive power of the motor system may be exploited for prospective coding not only in producing but also in perceiving events. Predicting is more advantageous than simply reacting. Perceptual prediction allows us to select appropriate responses ahead of the realization of an (anticipated) event; it is therefore indispensable for adapting flexibly and promptly to new situations and thus for interacting successfully with our physical and social environment.

  6. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered considerable attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage … This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, which may also incorporate content caching and storage, all of which are key challenges of the future Internet and the upcoming 5G networks. This paper proposes some of the keys behind this intersection and supports it with use cases as well as an implementation that integrated the Kodo library (NC) into OpenFlow (SDN…

  7. Code Development for Collective Effects

    CERN Document Server

    Bruce Li, Kevin Shing; Hegglin, Stefan Eduard; Iadarola, Giovanni; Oeftiger, Adrian; Passarelli, Andrea; Romano, Annalisa; Rumolo, Giovanni; Schenk, Michael; CERN. Geneva. ATS Department

    2016-01-01

    The presentation will cover approaches and strategies of modeling and implementing collective effects in modern simulation codes. We will review some of the general approaches to numerically model collective beam dynamics in circular accelerators. We will then look into modern ways of implementing collective effects with a focus on plainness, modularity and flexibility, using the example of the PyHEADTAIL framework, and highlight some of the advantages and drawbacks emerging from this method. To ameliorate one of the main drawbacks, namely a potential loss of performance compared to the classical fully compiled codes, several options for speed improvements will be mentioned and discussed. Finally some examples and applications will be shown together with future plans and perspectives.

  8. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  9. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. 
As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  10. Do you write secure code?

    CERN Multimedia

    Computer Security Team

    2011-01-01

    At CERN, we are excellent at producing software, such as complex analysis jobs, sophisticated control programs, extensive monitoring tools, interactive web applications, etc. This software is usually highly functional, and fulfils the needs and requirements as defined by its author. However, due to time constraints or unintentional ignorance, security aspects are often neglected. Subsequently, it was even more embarrassing for the author to find out that his code was flawed and had been used to break into CERN computers, web pages or to steal data…   Thus, if you have the pleasure or task of producing software applications, take some time beforehand to familiarize yourself with good programming practices. They should not only prevent basic security flaws in your code, but also improve its readability, maintainability and efficiency. Basic rules for good programming, as well as essential books on proper software development, can be found in the section for software developers on our security we...
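
    One classic flaw of the kind alluded to is building SQL queries by string concatenation. A minimal illustration with Python's standard sqlite3 module; the table, rows, and input string are made up:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     [("alice", "admin"), ("bob", "user")])

    user_input = "nobody' OR '1'='1"  # a classic injection attempt

    # Vulnerable: concatenation lets the input rewrite the WHERE clause.
    unsafe = conn.execute(
        "SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()

    # Safe: a parameterized query treats the input as an opaque value.
    safe = conn.execute(
        "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()

    print(len(unsafe), len(safe))  # 2 0 -- the injection matched every row
    ```

    The same principle (never splice untrusted input into executable syntax) applies to shell commands, HTML output, and LDAP filters alike.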

  11. Introduction of the ASGARD Code

    Science.gov (United States)

    Bethge, Christian; Winebarger, Amy; Tiwari, Sanjiv; Fayock, Brian

    2017-01-01

    ASGARD stands for 'Automated Selection and Grouping of events in AIA Regional Data'. The code is a refinement of the event detection method in Ugarte-Urra & Warren (2014). It is intended to automatically detect and group brightenings ('events') in the AIA EUV channels, to record event parameters, and to find related events over multiple channels. Ultimately, the goal is to automatically determine heating and cooling timescales in the corona and to significantly increase statistics in this respect. The code is written in IDL and requires the SolarSoft library. It is parallelized and can run with multiple CPUs. Input files are regions of interest (ROIs) in time series of AIA images from the JSOC cutout service (http://jsoc.stanford.edu/ajax/exportdata.html). The ROIs need to be tracked, co-registered, and limited in time (typically 12 hours).

  12. The Accurate Particle Tracer Code

    CERN Document Server

    Wang, Yulei; Qin, Hong; Yu, Zhi

    2016-01-01

    The Accurate Particle Tracer (APT) code is designed for large-scale particle simulations on dynamical systems. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and non-linear problems. Under the well-designed integrated and modularized framework, APT serves as a universal platform for researchers from different fields, such as plasma physics, accelerator physics, space science, fusion energy research, computational mathematics, software engineering, and high-performance computation. The APT code consists of seven main modules, including the I/O module, the initialization module, the particle pusher module, the parallelization module, the field configuration module, the external force-field module, and the extendible module. The I/O module, supported by Lua and Hdf5 projects, provides a user-friendly interface for both numerical simulation and data analysis. A series of new geometric numerical methods...

  13. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    AFRL-AFOSR-JP-TR-2017-0015. Verified OS Interface Code Synthesis. Gerwin Klein, National ICT Australia Limited, 13 Garden St, Eveleigh 2015, Australia. Final Report, 02/14/2017. Distribution A: approved for public release, distribution unlimited. The central question of this project was how to ensure the correctness of Operating System (OS

  14. HADES, A Radiographic Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M.B.; Slone, D.M.; Schach von Wittenau, A.E.

    2000-08-18

    We describe features of the HADES radiographic simulation code. We begin with a discussion of why it is useful to simulate transmission radiography. The capabilities of HADES are described, followed by an application of HADES to a dynamic experiment recently performed at the Los Alamos Neutron Science Center. We describe quantitative comparisons between experimental data and HADES simulations using a copper step wedge. We conclude with a short discussion of future work planned for HADES.

  15. Coded continuous wave meteor radar

    Science.gov (United States)

    Chau, J. L.; Vierinen, J.; Pfeffer, N.; Clahsen, M.; Stober, G.

    2016-12-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products, such as wind fields. This type of a radar would also be useful for over-the-horizon radar, ionosondes, and observations of field-aligned-irregularities.
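
    The pulse-compression step can be sketched with a pseudorandom binary phase code and a matched filter; the code length, delay, and attenuation below are illustrative, not the radar's actual parameters:

    ```python
    import random

    def compress(rx, code):
        """Matched-filter (pulse-compression) output at every lag."""
        n = len(code)
        return [sum(rx[i + j] * code[j] for j in range(n))
                for i in range(len(rx) - n + 1)]

    random.seed(0)
    code = [random.choice([-1, 1]) for _ in range(64)]  # pseudorandom phase code

    delay = 10  # echo arrives 10 samples late, attenuated by half
    rx = [0.0] * delay + [0.5 * c for c in code] + [0.0] * 20

    profile = compress(rx, code)
    peak = max(range(len(profile)), key=profile.__getitem__)
    print(peak, profile[peak])  # 10 32.0 -- the peak sits at the echo delay
    ```

    At the true delay every code chip lines up, so the output is 0.5 × 64 = 32; at any other lag only a partial, sign-scrambled overlap contributes, which is why the weak echo stands out after compression.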

  16. Clean Code - Why you should care

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Writing code is communication, not solely with the computer that executes it, but also with other developers and with oneself. A developer spends a lot of his working time reading and understanding code that was written by other developers or by himself in the past. The readability of the code plays an important role in the time needed to find a bug or add new functionality, which in turn has a big impact on productivity. Code that is difficult to understand, hard to maintain and refactor, and offers many spots for bugs to hide is not considered to be "clean code". But what can be considered "clean code", and what are the advantages of a strict application of its guidelines? In this presentation we will take a look at some typical "code smells" and proposed guidelines to improve your coding skills and write cleaner code that is less bug-prone and easier to maintain.

  17. Penal Code, 24 June 1987.

    Science.gov (United States)

    1987-01-01

    This document contains provisions of Liechtenstein's 1987 Penal Code relating to sterilization, abortion, polygamy, the protection of women and children, crimes related to marriage, and failure to provide support. The Code holds that sexual sterilization carried out at the patient's request is lawful if the patient is at least 25 years old. Performing or inducing an abortion is punishable with imprisonment unless: 1) the abortion is necessary to prevent serious danger to the life or health of the pregnant woman, 2) the pregnant woman was under 14 years old and not married to the man who impregnated her, or 3) the abortion is performed to save the woman's life. The Code also imposes a prison sentence on anyone abducting a woman who is helpless or unable to resist in order to sexually abuse the woman. Bigamy carries a prison term of up to 3 years, and a prison term of up to 1 year is applied in cases where a person deceives another or compels another into marriage. Removing a minor from the control of those authorized to rear said minor can lead to a prison term of up to 1 year, and abandonment of a minor can lead to a prison term of up to 3 years. Violation of the duty of financial support called for by family law can invoke a prison term of up to 6 months.

  18. Special issue on network coding

    Science.gov (United States)

    Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly

    2017-12-01

    Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as multi-view video streaming or underwater sensor networks. One can find papers that show how NC makes data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.
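
    The basic NC operation of combining information streams can be shown with the classic XOR relay example: a relay forwards one coded packet, and each sink recovers the packet it is missing using the one it already holds.

    ```python
    def xor_bytes(a, b):
        """Bytewise XOR of two equal-length packets."""
        return bytes(x ^ y for x, y in zip(a, b))

    p1, p2 = b"hello", b"world"
    coded = xor_bytes(p1, p2)  # the relay sends one coded packet instead of two

    # Each sink XORs the coded packet with the packet it already has.
    assert xor_bytes(coded, p1) == p2
    assert xor_bytes(coded, p2) == p1
    print("both sinks recovered their missing packet")
    ```

    This is the throughput gain of the butterfly network in miniature: one transmission serves two receivers with different side information.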

  19. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.
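    The energy level structure such a code computes depends strongly on confinement size. As a back-of-the-envelope illustration only (not the code's method, which handles realistic geometries and material parameters), the infinite-well "particle in a box" model gives the scaling of confinement energies with dot size:

```python
# Infinite-well confinement energies E_n = n^2 h^2 / (8 m L^2) for an
# electron (free-electron mass assumed) in a well of width L.

H_PLANCK = 6.62607015e-34   # Planck constant, J*s
M_E = 9.1093837015e-31      # electron mass, kg
EV = 1.602176634e-19        # J per eV

def well_energy_eV(n: int, width_nm: float) -> float:
    """n-th confinement energy of a 1D infinite well, in eV."""
    L = width_nm * 1e-9
    return n**2 * H_PLANCK**2 / (8 * M_E * L**2) / EV

# For a 5 nm well the ground-state confinement energy is ~0.015 eV,
# and E_n scales as n^2 -- the level spacing grows as the dot shrinks.
```

Real quantum-dot simulations replace this toy model with effective-mass or tight-binding Hamiltonians and realistic boundary conditions, but the inverse-square size dependence is the intuition behind size-tunable optoelectronic properties.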

  20. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states whose codes differ fundamentally from the national model energy codes or which do not have statewide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  1. Advanced Radiation Protection (ARP): Thick GCR Shield Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Radiation Project to date has focused on SEP events.  For long duration missions outside Earth’s geomagnetic field, the galactic cosmic ray...

  2. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)
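    The code-plus-sequential-number convention can be illustrated with a small parser. The splitting logic and the example identifiers below are assumptions for illustration; actual DOE report numbers follow several format variants not captured here.

```python
import re

def split_report_number(report_no: str) -> tuple[str, str]:
    """Split a report number into its code and trailing sequential
    number (illustrative format only; real numbering schemes vary)."""
    m = re.match(r"^(.+?)-(\d+)$", report_no)
    if not m:
        raise ValueError(f"unrecognized report number: {report_no}")
    return m.group(1), m.group(2)

# e.g. a hypothetical "ANL-7998" splits into the installation code
# "ANL" and the sequential number "7998".
```

The lazy `.+?` ensures multi-segment codes keep everything up to the final numeric part as the code portion.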

  3. Tandem Mirror Reactor Systems Code (Version I)

    Energy Technology Data Exchange (ETDEWEB)

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.; Barrett, R.J.; Gorker, G.E.; Spampinaton, P.T.; Bulmer, R.H.; Dorn, D.W.; Perkins, L.J.; Ghose, S.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronics analyses into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  4. A Mobile Application Prototype using Network Coding

    DEFF Research Database (Denmark)

    Pedersen, Morten Videbæk; Heide, Janus; Fitzek, Frank

    2010-01-01

    This paper looks into implementation details of network coding for a mobile application running on commercial mobile phones. We describe the necessary coding operations and the algorithms that implement them. The coding algorithms form the basis for an implementation in C++ and Symbian C++. We report on practical measurement results of coding throughput and energy consumption for a single-source multiple-sink network, with and without recoding at the sinks. These results confirm that network coding is practical even on computationally weak platforms, and that network coding can potentially be used...

  5. Toric Codes, Multiplicative Structure and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    2017-01-01

    Long linear codes constructed from toric varieties over finite fields, their multiplicative structure and decoding. The main theme is the inherent multiplicative structure on toric codes. The multiplicative structure allows for \emph{decoding}, resembling the decoding of Reed-Solomon codes, and aligns with decoding by error-correcting pairs. We have used the multiplicative structure on toric codes to construct linear secret sharing schemes with \emph{strong multiplication} via Massey's construction, generalizing the Shamir linear secret sharing schemes constructed from Reed-Solomon codes. We have constructed quantum error-correcting codes from toric surfaces by the Calderbank-Shor-Steane method.
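    Toric codes are evaluation codes, and their multiplicative structure mirrors the familiar one-dimensional case: because evaluation is a ring homomorphism, the componentwise product of two codewords is itself an evaluation of the product polynomial. A minimal Reed-Solomon sketch over the small prime field GF(7) (an illustration of this property, not the toric construction itself):

```python
# Reed-Solomon encoding by polynomial evaluation over GF(7), and a
# check of the multiplicative structure: componentwise products of
# codewords are codewords of the higher-degree evaluation code.

P = 7  # small prime field GF(7)

def rs_encode(message, points):
    """Evaluate the message polynomial (coefficients in GF(P)) at the
    given points; the vector of values is the codeword."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(message)) % P
            for x in points]

def poly_mul(a, b):
    """Multiply two polynomials with coefficients reduced mod P."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] = (out[i + j] + x * y) % P
    return out

pts = list(range(P))
a, b = [2, 5, 1], [3, 0, 4]          # two message polynomials
cw_a, cw_b = rs_encode(a, pts), rs_encode(b, pts)

# The componentwise product equals the encoding of the product polynomial.
assert [u * v % P for u, v in zip(cw_a, cw_b)] == rs_encode(poly_mul(a, b), pts)
```

It is exactly this product structure, in the multivariate toric setting, that enables error-correcting-pair decoding and strong multiplication in secret sharing.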

  6. An implicit Smooth Particle Hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Charles E. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-05-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian and meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for nonlinear corrections. It uses numerical derivatives to construct the Jacobian matrix, and sparse techniques to save on memory storage and reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. The results of a number of test cases are then discussed, including a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. For the single jet of gas, it was demonstrated that the implicit code can solve the problem in much less time than the explicit code. The problem was, however, very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
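    The Newton iteration with a numerically estimated Jacobian that the abstract describes can be sketched compactly. This is a conceptual illustration only: the example system F is made up, and a direct 2x2 solve stands in for the Krylov linear solver used on the large sparse systems of a real SPH code.

```python
# Newton-Raphson with a forward-difference Jacobian: at each step build
# J numerically, solve J*dx = -F(x), and update x until F(x) ~ 0.

def F(x):
    """Toy nonlinear system: a circle of radius 2 and the line y = x."""
    return [x[0]**2 + x[1]**2 - 4.0,
            x[0] - x[1]]

def newton(F, x, eps=1e-7, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        f = F(x)
        if max(abs(v) for v in f) < tol:
            break
        # Numerical Jacobian by forward differences.
        n = len(x)
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):
            xp = list(x)
            xp[j] += eps
            fp = F(xp)
            for i in range(n):
                J[i][j] = (fp[i] - f[i]) / eps
        # Solve J*dx = -f (Cramer's rule for the 2x2 toy problem; a
        # production code would use a sparse Krylov solver here).
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        dx = [(-f[0] * J[1][1] + f[1] * J[0][1]) / det,
              (-f[1] * J[0][0] + f[0] * J[1][0]) / det]
        x = [xi + di for xi, di in zip(x, dx)]
    return x

root = newton(F, [1.0, 0.5])  # converges to (sqrt(2), sqrt(2))
```

Replacing the direct solve with a matrix-free Krylov method (which only needs Jacobian-vector products) is what makes the Newton-Krylov approach scale to the large systems arising from particle interactions.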

  7. On the extension of propelinear structures of the Nordstrom-Robinson code to the Hamming code

    OpenAIRE

    Mogilnykh, I. Yu.

    2015-01-01

    A code is called propelinear if its automorphism group contains a subgroup that acts regularly on its codewords; such a subgroup is called a propelinear structure on the code. In the paper, a classification of the propelinear structures on the Nordstrom-Robinson code is obtained, and the question of extending these structures to propelinear structures of the Hamming code that contains the Nordstrom-Robinson code is considered. The result partially relies on a representation of all partitions of the Hamming code in...
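    As a small, standard illustration of the codes involved (unrelated to the classification itself), the [7,4] Hamming code can be listed directly from its parity-check matrix, whose columns are the seven nonzero binary vectors of length 3:

```python
from itertools import product

# Parity-check matrix of the [7,4] Hamming code: column j is the
# binary expansion of j+1.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def in_code(word):
    """A word is a codeword iff every parity check is satisfied."""
    return all(sum(h * w for h, w in zip(row, word)) % 2 == 0 for row in H)

codewords = [w for w in product((0, 1), repeat=7) if in_code(w)]

# For a linear code, minimum distance = minimum nonzero weight.
min_dist = min(sum(w) for w in codewords if any(w))
# 16 codewords, minimum distance 3 -- the single-error-correcting code
# whose extensions and partitions the paper studies.
```

The Nordstrom-Robinson code is a nonlinear (16, 256, 6) code; enumerating it is beyond this sketch, but the same "check every constraint" viewpoint applies.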

  8. Some optimal partial-unit-memory codes. [time-invariant binary convolutional codes

    Science.gov (United States)

    Lauer, G. S.

    1979-01-01

    A class of time-invariant binary convolutional codes is defined, called partial-unit-memory codes. These codes are optimal in the sense of having maximum free distance for given values of R, k (the number of encoder inputs), and mu (the number of encoder memory cells). Optimal codes are given for rates R = 1/4, 1/3, 1/2, and 2/3, with mu not greater than 4 and k not greater than mu + 3, whenever such a code is better than previously known codes. An infinite class of optimal partial-unit-memory codes is also constructed based on equidistant block codes.
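    The defining structure can be sketched directly: a unit-memory encoder emits each n-bit output block from the current k-bit input block and the previous one, v_t = u_t G0 + u_{t-1} G1 over GF(2); in a *partial*-unit-memory code, G1 has rank less than k, so only part of the previous block occupies memory. The matrices below are illustrative placeholders, not an optimal code from the paper.

```python
# Unit-memory convolutional encoding over GF(2) with k=2 inputs and
# n=4 outputs per block.

G0 = [[1, 0, 1, 1],
      [0, 1, 1, 0]]
G1 = [[1, 1, 0, 1],
      [0, 0, 0, 0]]  # rank 1 < k: only one memory cell is actually used

def encode(blocks, G0=G0, G1=G1):
    """Encode a list of k-bit blocks; v_t = u_t*G0 + u_{t-1}*G1 (mod 2)."""
    n = len(G0[0])
    prev = [0] * len(G0)  # encoder memory starts cleared
    out = []
    for u in blocks:
        v = [(sum(u[i] * G0[i][j] for i in range(len(u)))
              + sum(prev[i] * G1[i][j] for i in range(len(prev)))) % 2
             for j in range(n)]
        out.append(v)
        prev = u
    return out
```

The optimality results in the paper concern choosing G0 and G1 to maximize free distance for given R, k, and mu; the encoder mechanics are exactly as above.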

  9. P-code versus C/A-code GPS for range tracking applications

    Science.gov (United States)

    Hoefener, Carl E.; van Wechel, Bob

    This article compares the use of P-code and C/A-code GPS receivers on test and training ranges. The requirements on many ranges for operation under conditions of jamming preclude the use of C/A-code receivers because of their relatively low jamming immunity as compared with P-code receivers. Also, C/A-code receivers present some problems when used with pseudolites on ranges. The cost of P-code receivers is customarily much higher than that of C/A-code receivers. However, most of this difference is caused by factors other than P-code, particularly the parts screening specifications applied to military programs.

  10. Self-orthogonal codes with dual distance three and quantum codes with distance three over

    Science.gov (United States)

    Liang, Fangchi

    2013-12-01

    Self-orthogonal codes with dual distance three, and quantum codes with distance three constructed from such self-orthogonal codes, are discussed in this paper. Firstly, for a given code length, a self-orthogonal code with minimal dimension and dual distance three is constructed. Secondly, two nested self-orthogonal codes with dual distance two and three are constructed, and consequently a quantum code of the corresponding length and distance three is obtained via the Steane construction. All of the quantum codes constructed via the Steane construction are optimal or near-optimal according to the quantum Hamming bound.
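    The self-orthogonality condition at the heart of such constructions is easy to check from a generator matrix: every pair of rows, including each row with itself, must have zero inner product. The binary example below (the extended [8,4] Hamming code, a standard self-dual code) is only an illustration; the paper works over a different alphabet.

```python
from itertools import combinations_with_replacement

# Generator matrix of the extended [8,4] Hamming code (self-dual, hence
# self-orthogonal): every row has even weight and any two rows overlap
# in an even number of positions.
G = [[1, 0, 0, 0, 0, 1, 1, 1],
     [0, 1, 0, 0, 1, 0, 1, 1],
     [0, 0, 1, 0, 1, 1, 0, 1],
     [0, 0, 0, 1, 1, 1, 1, 0]]

def is_self_orthogonal(G):
    """True iff every pair of generator rows (with repetition) has
    even inner product over GF(2)."""
    return all(sum(a * b for a, b in zip(u, v)) % 2 == 0
               for u, v in combinations_with_replacement(G, 2))
```

Nested self-orthogonal codes satisfying this condition are exactly the raw material the Steane construction turns into quantum codes.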

  11. Molecular and Functional Characterization of GR2-R1 Event Based Backcross Derived Lines of Golden Rice in the Genetic Background of a Mega Rice Variety Swarna.

    Directory of Open Access Journals (Sweden)

    Haritha Bollinedi

    Full Text Available Homozygous Golden Rice lines developed in the background of Swarna through marker assisted backcross breeding (MABB using transgenic GR2-R1 event as a donor for the provitamin A trait have high levels of provitamin A (up to 20 ppm but are dwarf with pale green leaves and drastically reduced panicle size, grain number and yield as compared to the recurrent parent, Swarna. In this study, we carried out detailed morphological, biochemical and molecular characterization of these lines in a quest to identify the probable reasons for their abnormal phenotype. Nucleotide blast analysis with the primer sequences used to amplify the transgene revealed that the integration of transgene disrupted the native OsAux1 gene, which codes for an auxin transmembrane transporter protein. Real time expression analysis of the transgenes (ZmPsy and CrtI driven by endosperm-specific promoter revealed the leaky expression of the transgene in the vegetative tissues. We propose that the disruption of OsAux1 disturbed the fine balance of plant growth regulators viz., auxins, gibberellic acid and abscisic acid, leading to the abnormalities in the growth and development of the lines homozygous for the transgene. The study demonstrates the conserved roles of OsAux1 gene in rice and Arabidopsis.

  12. Molecular and Functional Characterization of GR2-R1 Event Based Backcross Derived Lines of Golden Rice in the Genetic Background of a Mega Rice Variety Swarna.

    Science.gov (United States)

    Bollinedi, Haritha; S, Gopala Krishnan; Prabhu, Kumble Vinod; Singh, Nagendra Kumar; Mishra, Sushma; Khurana, Jitendra P; Singh, Ashok Kumar

    2017-01-01

    Homozygous Golden Rice lines developed in the background of Swarna through marker assisted backcross breeding (MABB) using transgenic GR2-R1 event as a donor for the provitamin A trait have high levels of provitamin A (up to 20 ppm) but are dwarf with pale green leaves and drastically reduced panicle size, grain number and yield as compared to the recurrent parent, Swarna. In this study, we carried out detailed morphological, biochemical and molecular characterization of these lines in a quest to identify the probable reasons for their abnormal phenotype. Nucleotide blast analysis with the primer sequences used to amplify the transgene revealed that the integration of transgene disrupted the native OsAux1 gene, which codes for an auxin transmembrane transporter protein. Real time expression analysis of the transgenes (ZmPsy and CrtI) driven by endosperm-specific promoter revealed the leaky expression of the transgene in the vegetative tissues. We propose that the disruption of OsAux1 disturbed the fine balance of plant growth regulators viz., auxins, gibberellic acid and abscisic acid, leading to the abnormalities in the growth and development of the lines homozygous for the transgene. The study demonstrates the conserved roles of OsAux1 gene in rice and Arabidopsis.

  13. Temporal Coding of Volumetric Imagery

    Science.gov (United States)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration...
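    The coded temporal measurement described above can be sketched in a few lines: T frames are modulated by a per-frame binary mask and summed into one coded snapshot, y = sum_t C_t * x_t. Everything below uses toy sizes and a random mask purely for illustration; recovering the frames from the snapshot requires a compressive-sensing reconstruction solver, which is omitted.

```python
import random

# Compressive temporal measurement: T masked frames collapse into a
# single (H, W) coded image.
T, H, W = 8, 4, 4
random.seed(0)

frames = [[[random.random() for _ in range(W)] for _ in range(H)]
          for _ in range(T)]  # the (x, y, t) video volume
masks = [[[random.randint(0, 1) for _ in range(W)] for _ in range(H)]
         for _ in range(T)]   # one binary code per frame (pixel shutter)

snapshot = [[sum(masks[t][i][j] * frames[t][i][j] for t in range(T))
             for j in range(W)] for i in range(H)]
# One (H, W) image now encodes the coded contributions of all T frames.
```

In the physical CACTI system the per-frame masks come from translating a single coded aperture, so consecutive masks are shifted copies of one pattern rather than independent random draws.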

  14. WPC's Short Range Forecast Coded Bulletin

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Short Range Forecast Coded Bulletin. The Short Range Forecast Coded Bulletin describes the expected locations of high and low pressure centers, surface frontal...

  15. The FLUKA Code: Description And Benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Battistoni, Giuseppe; Muraro, S.; Sala, Paola R.; /INFN, Milan; Cerutti, Fabio; Ferrari, A.; Roesler, Stefan; /CERN; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2007-09-18

    The physics model implemented inside the FLUKA code are briefly described, with emphasis on hadronic interactions. Examples of the capabilities of the code are presented including basic (thin target) and complex benchmarks.

  16. Coding Theory, Cryptography and Related Areas

    DEFF Research Database (Denmark)

    Buchmann, Johannes; Stichtenoth, Henning; Tapia-Recillas, Horacio

    Proceedings of an International Conference on Coding Theory, Cryptography and Related Areas, held in Guanajuato, Mexico, in April 1998.

  17. Content layer progressive coding of digital maps

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Ole Riis

    2000-01-01

    A new lossless context-based method is presented for content-progressive coding of limited-bits/pixel images, such as maps, company logos, etc., common on the WWW. Progressive encoding is achieved by separating the image into content layers based on predefined information. Information from already coded layers is used when coding subsequent layers. This approach is combined with efficient template-based context bi-level coding, context-collapsing methods for multi-level images, and arithmetic coding. Relative pixel patterns are used to collapse contexts, and the number of contexts is analyzed. The new methods outperform existing schemes for coding digital maps and in addition provide progressive coding. Compared to the state-of-the-art PWC coder, the compressed size is reduced to 60-70% on our layered test images.

  18. Coding for Single-Line Transmission

    Science.gov (United States)

    Madison, L. G.

    1983-01-01

    Digital transmission code combines data and clock signals into a single waveform. MADCODE needs four standard integrated circuits in the generator and converter plus five small discrete components. MADCODE allows simple coding and decoding for transmission of digital signals over a single line.
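    MADCODE's waveform details are not given in this record, but Manchester coding is a classic scheme with the same goal, embedding the clock in the data so a single line carries both; the sketch below illustrates that idea, not MADCODE itself.

```python
# Manchester coding (IEEE 802.3 convention): each data bit becomes a
# mid-bit transition, 0 -> high-then-low, 1 -> low-then-high, so the
# receiver can recover timing from the signal itself.

def manchester_encode(bits):
    """Map each bit to two half-bit line symbols."""
    out = []
    for b in bits:
        out.extend((0, 1) if b else (1, 0))
    return out

def manchester_decode(symbols):
    """Read the first half of each bit cell to recover the data bit."""
    return [1 if symbols[i] == 0 else 0 for i in range(0, len(symbols), 2)]

# Every bit cell contains a transition, so the line never stays flat
# longer than one bit period -- that guaranteed edge is the clock.
```

The trade-off is the same one any combined data/clock code faces: the line rate is twice the data rate in exchange for self-clocking operation.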

  19. Performance of Turbo Code with Different Parameters

    Directory of Open Access Journals (Sweden)

    Samir Jasim

    2017-08-01

    Full Text Available Turbo codes are an error-correction coding technique in which errors introduced into the transmitted data by the communication channel can be detected and corrected; they provide long codewords with manageable decoding complexity. A turbo code is a concatenated code, connected in serial or in parallel, for transmitting data with high throughput and performance near the Shannon limit. This paper presents the performance of a turbo code with different parameters (number of iterations, decoding technique, code length, rate, generator polynomial, and channel type), obtaining the Bit Error Rate (BER) for each case and then comparing the results to identify the parameters that give the optimum performance of this code. The system is simulated using the MATLAB R2016b program.

  20. Robust Reed Solomon Coded MPSK Modulation

    Directory of Open Access Journals (Sweden)

    Emir M. Husni

    2014-10-01

    Full Text Available In this paper, the construction of partitioned Reed-Solomon coded modulation (RSCM), which is robust for the additive white Gaussian noise channel and a Rayleigh fading channel, is investigated. By matching the configuration of the component codes with the channel characteristics, it is shown that this system is robust for the Gaussian and a Rayleigh fading channel. This approach is compared with non-partitioned RSCM (a Reed-Solomon code combined with an MPSK signal set using Gray mapping) and with block coded MPSK modulation using binary Reed-Muller codes. All codes use a hard-decision decoding algorithm. Simulation results for these schemes show that RSCM based on set partitioning performs better than the schemes not based on set partitioning, and better than Reed-Muller coded modulation, across a wide range of conditions. The novel idea here is that the receiver uses a rotated 2^(m+1)-PSK detector if the transmitter uses a 2^m-PSK modulator.
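    The Gray mapping mentioned above is easy to make concrete: a binary-reflected Gray code gives adjacent MPSK constellation points labels that differ in exactly one bit, so the most likely symbol error (falling into a neighboring decision region) causes only a single bit error. The 8-PSK labeling below is a standard illustration and is not claimed to be the exact mapping used in the paper.

```python
import cmath

# Gray-labeled 8-PSK: point k sits at angle 2*pi*k/M and carries the
# binary-reflected Gray label of k.

def gray(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

M = 8  # 8-PSK
constellation = {gray(k): cmath.exp(2j * cmath.pi * k / M) for k in range(M)}

# Neighboring points differ in one label bit, e.g. gray(3) = 0b010 and
# gray(4) = 0b110 differ only in the most significant bit.
```

Set-partitioned schemes like RSCM instead split the constellation into subsets of increasing minimum distance and protect each label level with a matched component code, which is what makes the partitioned design adaptable to the channel.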