WorldWideScience

Sample records for code gcr event-based

  1. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The passage of heavy ions through tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ions and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick-target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their…
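
    The Poisson hit statistics mentioned above follow directly from fluence times sensitive area. Below is a minimal sketch of that calculation, not the GERM code itself; the fluence and area values are hypothetical.

    ```python
    import math

    def hit_probabilities(fluence_per_cm2, area_um2, max_hits=5):
        """Poisson probabilities of n ion (or delta-ray) traversals of a
        sensitive cellular area exposed to a given particle fluence."""
        area_cm2 = area_um2 * 1e-8                 # 1 um^2 = 1e-8 cm^2
        mean_hits = fluence_per_cm2 * area_cm2     # expected number of hits
        return [math.exp(-mean_hits) * mean_hits**n / math.factorial(n)
                for n in range(max_hits + 1)]

    # Example: 1e6 ions/cm^2 over a 100 um^2 nucleus gives a mean of 1 hit.
    print(hit_probabilities(1e6, 100.0))
    ```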

  2. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    Science.gov (United States)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation comprises a large number of particle types and energies with widely differing ionization power, from high-energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground-based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and for dosimetry, electronics parts, and shielding testing, using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements a rapid switching mode and higher-energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create the GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed, based on extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding to within 20 percent, compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposure of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.
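
    The quoted 20 percent agreement is a bin-by-bin comparison of binned spectra. A toy check of that acceptance criterion follows; the two spectra are random stand-ins for GERMcode output.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Stand-ins: fluence per LET bin for a simulated full GCR environment
    # behind shielding and for the proposed reference field.
    full = rng.uniform(1.0, 10.0, size=30)
    reference = full * rng.uniform(0.85, 1.15, size=30)   # +/-15% deviations here

    rel_diff = np.abs(reference - full) / full
    print("max deviation: %.1f%%" % (100 * rel_diff.max()),
          "| all bins within 20%%: %s" % bool((rel_diff < 0.20).all()))
    ```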

  3. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The passage of heavy ions through tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ions and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick-target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their…

  4. Isotopic dependence of GCR fluence behind shielding

    International Nuclear Information System (INIS)

    Cucinotta, Francis A.; Wilson, John W.; Saganti, Premkumar; Hu, Xiaodong; Kim, Myung-Hee Y.; Cleghorn, Timothy; Zeitlin, Cary; Tripathi, Ram K.

    2006-01-01

    In this paper we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR), nuclear fragmentation cross sections, and the isotopic grid on the solution to transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. For the nuclear interaction database and transport solution, we use the quantum multiple scattering theory of nuclear fragmentation (QMSFRG) and the high-charge and energy (HZETRN) transport code, respectively. The QMSFRG model is shown to accurately describe existing fragmentation data, including proper description of the odd-even effects as a function of the iso-spin of the projectile nucleus. The principal finding of this study is that large errors (±100%) will occur in the mass-fluence spectra when comparing transport models that use a complete isotopic grid (∼170 ions) to ones that use a reduced isotopic grid, for example the 59-ion grid used in the HZETRN code in the past; however, less significant errors (<±20%) occur in the elemental-fluence spectra. Because a complete isotopic grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that complete isotopic grids be used for future GCR studies.
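
    The contrast between isotopic and elemental errors has a simple origin: two grids can redistribute fluence among isotopes of the same charge while agreeing on the elemental sum. A toy illustration with invented iron fluences:

    ```python
    # Invented mass-fluence values (arbitrary units) for iron isotopes on a
    # complete grid and on a reduced grid that redistributes the same
    # elemental fluence among fewer isotopes.
    full_grid    = {"54Fe": 25.0, "55Fe": 5.0, "56Fe": 70.0}
    reduced_grid = {"54Fe":  5.0, "55Fe": 0.0, "56Fe": 95.0}

    for iso in full_grid:
        a, b = full_grid[iso], reduced_grid[iso]
        print("%s mass-fluence difference: %+.0f%%" % (iso, 100.0 * (b - a) / a))

    elem_full = sum(full_grid.values())
    elem_red = sum(reduced_grid.values())
    print("elemental (Z=26) difference: %+.1f%%"
          % (100.0 * (elem_red - elem_full) / elem_full))
    ```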

  5. GCR Environmental Models I: Sensitivity Analysis for GCR Environments

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-01-01

    Accurate galactic cosmic ray (GCR) models are required to assess crew exposure during long-duration missions to the Moon or Mars. Many of these models have been developed and compared to available measurements, with uncertainty estimates usually stated to be less than 15%. However, when the models are evaluated over a common epoch and propagated through to effective dose, relative differences exceeding 50% are observed. This indicates that the metrics used to communicate GCR model uncertainty can be better tied to exposure quantities of interest for shielding applications. This is the first of three papers focused on addressing this need. In this work, the focus is on quantifying the extent to which each GCR ion and energy group, prior to entering any shielding material or body tissue, contributes to effective dose behind shielding. Results can be used to more accurately calibrate model-free parameters and provide a mechanism for refocusing validation efforts on measurements taken over important energy regions. Results can also be used as references to guide future nuclear cross-section measurements and radiobiology experiments. It is found that GCR with Z>2 and boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. This finding is important given that most of the GCR models are developed and validated against Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer (ACE/CRIS) measurements taken below 500 MeV/n. It is therefore possible for two models to very accurately reproduce the ACE/CRIS data while inducing very different effective dose values behind shielding.
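
    The 5% figure comes from integrating each ion's contribution to effective dose over the stated energy region. A sketch of that bookkeeping with an invented contribution spectrum standing in for transport-code output:

    ```python
    import numpy as np

    # Invented differential contribution to effective dose behind shielding
    # for one Z > 2 ion, tabulated on a boundary-energy grid.
    energy = np.logspace(1, 5, 400)                               # MeV/n
    contrib = np.exp(-0.5 * ((np.log10(energy) - 3.0) / 0.4)**2)  # peaks ~1 GeV/n

    total = np.trapz(contrib, energy)
    below_500 = np.trapz(np.where(energy < 500.0, contrib, 0.0), energy)
    print("fraction of effective dose from E < 500 MeV/n: %.1f%%"
          % (100.0 * below_500 / total))
    ```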

  6. Characterization of GCR-lightlike warped product of indefinite Sasakian manifolds

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar

    2014-07-01

    In this paper we prove that there do not exist warped product GCR-lightlike submanifolds of the form M = N⊥ ×λ NT, such that N⊥ is an anti-invariant submanifold tangent to V and NT an invariant submanifold of M̄, other than GCR-lightlike products in an indefinite Sasakian manifold. We also obtain characterization theorems for a GCR-lightlike submanifold to be locally a GCR-lightlike warped product.

  7. GCR and SPE Radiation Effects in Materials

    Science.gov (United States)

    Waller, Jess; Rojdev, Kristina; Nichols, Charles

    2016-01-01

    This Year 3 project provides risk reduction data to assess galactic cosmic ray (GCR) and solar particle event (SPE) space radiation damage in materials used in manned low-Earth orbit, lunar, interplanetary, and Martian surface missions. Long duration (up to 50 years) space radiation damage is being quantified for materials used in inflatable structures (1st priority), and space suit and habitable composite materials (2nd priority). The data collected have relevance for nonmetallic materials (polymers and composites) used in NASA missions where long duration reliability is needed in continuous or intermittent space radiation fluxes.

  8. Opening a Window on ICME Evolution and GCR Modulation During Propagation in the Innermost Heliosphere

    Science.gov (United States)

    Winslow, R. M.; Lugaz, N.; Schwadron, N.; Farrugia, C. J.; Guo, J.; Wimmer-Schweingruber, R. F.; Wilson, J. K.; Joyce, C.; Jordan, A.; Lawrence, D. J.

    2017-12-01

    We use multipoint spacecraft observations to study interplanetary coronal mass ejection (ICME) evolution and subsequent galactic cosmic ray (GCR) modulation during propagation in the inner heliosphere. We illustrate ICME propagation effects through two case studies. The first ICME was launched from the Sun on 29 December 2011 and was observed in near-perfect longitudinal conjunction at MESSENGER and STEREO A. Despite the close longitudinal alignment, we infer from force-free field modeling that the orientation of the underlying flux rope rotated ∼80° in latitude and ∼65° in longitude. Based on both spacecraft measurements as well as ENLIL model simulations of the steady-state solar wind, we find that interactions involving magnetic reconnection with corotating structures in the solar wind dramatically alter the ICME magnetic field. In particular, we observed at STEREO A a highly turbulent region with distinct properties within the flux rope that was not observed at MESSENGER; we attribute this region to interaction between the ICME and a heliospheric plasma sheet/current sheet. This is a concrete example of a sequence of events that can increase the complexity of ICMEs during propagation and should serve as a caution against using very distant observations to predict the geoeffectiveness of large interplanetary transients. Our second case study investigates changes with heliospheric distance in GCR modulation by an ICME event (launched on 12 February 2014) observed in near-conjunction at all four of the inner solar system planets. The ICME caused Forbush decreases (FDs) in the GCR count rates at Mercury (MESSENGER), Earth/Moon (ACE/LRO), and Mars (MSL). At all three locations the pre-ICME background GCR rate was well matched, but the depth of the FD of GCR fluxes with similar energy ranges diminished with distance from the Sun. A larger difference in FD size was observed between Mercury and Earth than between Earth and Mars, partly owing to the much larger…
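
    Force-free flux-rope fitting of the kind used to infer the rotation is often based on the Lundquist solution; the abstract does not state which force-free variant was fitted, so the equations below show only that standard choice.

    ```latex
    % Lundquist solution of the linear force-free condition
    % \nabla \times \mathbf{B} = \alpha \mathbf{B} for a cylindrical flux rope,
    % with H = \pm 1 the handedness and J_0, J_1 Bessel functions:
    B_z(r) = B_0 \, J_0(\alpha r), \qquad
    B_\phi(r) = H \, B_0 \, J_1(\alpha r), \qquad
    B_r = 0 .
    ```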

  9. Critical lengths of error events in convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1994-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  10. Critical Lengths of Error Events in Convolutional Codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Andersen, Jakob Dahl

    1998-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  11. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.
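
    A common single-parameter description of the solar modulation that such models calibrate against near-Earth data is the force-field approximation, shown here as a representative scheme rather than the exact formulation of any model in the comparison.

    ```latex
    % Force-field approximation: modulated spectrum J at kinetic energy per
    % nucleon E, given the local interstellar spectrum J_LIS, nucleon rest
    % energy E_0, and modulation potential \phi:
    J(E) = J_{\mathrm{LIS}}(E+\Phi)\,
           \frac{E\,(E+2E_0)}{(E+\Phi)\,(E+\Phi+2E_0)},
    \qquad \Phi = \frac{Ze}{A}\,\phi .
    ```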

  12. A Reference Field for GCR Simulation and an LET-Based Implementation at NSRL

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.; Walker, Steven A.; Norbury, John W.

    2015-01-01

    Exposure to galactic cosmic rays (GCR) on long duration deep space missions presents a serious health risk to astronauts, with large uncertainties connected to the biological response. In order to reduce the uncertainties and gain understanding about the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment in the laboratory are discussed. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at the NASA Space Radiation Laboratory (NSRL) limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is within physical uncertainties, allowing a single reference field for deep space missions to be defined. Third, an approach for simulating the reference field at NSRL is presented. The approach allows for the linear energy transfer (LET) spectrum of the reference field to be approximately represented with discrete ion and energy beams and implicitly maintains a reasonably accurate charge spectrum (or, average quality factor). Drawbacks of the proposed methodology are discussed and weighed against alternative simulation strategies. The neutron component and track structure characteristics of the proposed strategy are discussed in this context.
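
    Choosing discrete ion/energy beams to represent a continuous LET spectrum can be cast as a non-negative least-squares problem for the beam fluences. The sketch below uses invented numbers for the target spectrum and response matrix, and NNLS is one reasonable solver choice rather than the paper's stated method.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical target: reference-field fluence in five LET bins.
    target = np.array([5.0, 3.0, 1.5, 0.6, 0.2])

    # Illustrative response matrix: column j is the fluence that candidate
    # beam j (one ion at one energy) deposits across the LET bins.
    beams = np.array([[1.0, 0.2, 0.0, 0.0],
                      [0.1, 1.0, 0.2, 0.0],
                      [0.0, 0.1, 1.0, 0.2],
                      [0.0, 0.0, 0.1, 1.0],
                      [0.0, 0.0, 0.0, 0.3]])

    weights, residual = nnls(beams, target)   # non-negative beam fluences
    print("beam weights:", weights, "residual:", residual)
    ```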

  13. openQ*D simulation code for QCD+QED

    DEFF Research Database (Denmark)

    Campos, Isabel; Fritzsch, Patrick; Hansen, Martin

    2018-01-01

    The openQ*D code for the simulation of QCD+QED with C* boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, etc. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.

  14. Solar Energetic Particles (SEP) and Galactic Cosmic Rays (GCR) as tracers of solar wind conditions near Saturn: Event lists and applications

    Science.gov (United States)

    Roussos, E.; Jackman, C. M.; Thomsen, M. F.; Kurth, W. S.; Badman, S. V.; Paranicas, C.; Kollmann, P.; Krupp, N.; Bučík, R.; Mitchell, D. G.; Krimigis, S. M.; Hamilton, D. C.; Radioti, A.

    2018-01-01

    The lack of an upstream solar wind monitor poses a major challenge to any study that investigates the influence of the solar wind on the configuration and dynamics of Saturn's magnetosphere. Here we show how Cassini MIMI/LEMMS observations of Solar Energetic Particle (SEP) and Galactic Cosmic Ray (GCR) transients, both linked to energetic processes in the heliosphere such as Interplanetary Coronal Mass Ejections (ICMEs) and Corotating Interaction Regions (CIRs), can be used to trace enhanced solar wind conditions at Saturn's distance. SEP protons can be easily distinguished from magnetospheric ions, particularly in the MeV energy range. Many SEPs are also accompanied by strong GCR Forbush decreases. GCRs are detectable as a low count-rate noise signal in a large number of LEMMS channels. As SEPs and GCRs can easily penetrate into the outer and middle magnetosphere, they can be monitored continuously, even when Cassini is not situated in the solar wind. A survey of the MIMI/LEMMS dataset between 2004 and 2016 resulted in the identification of 46 SEP events. Most events last more than two weeks and have their lowest occurrence rate around the extended solar minimum between 2008 and 2010, suggesting that they are associated with ICMEs rather than CIRs, which are the main source of activity during the declining phase and the minimum of the solar cycle. We also list 17 time periods (>50 days each) where GCRs show a clear solar periodicity (∼13 or 26 days). The 13-day period, which derives from two CIRs per solar rotation, dominates over the 26-day period in only one of the 17 cases catalogued. This interval belongs to the second half of 2008, when expansions of Saturn's electron radiation belts were previously reported to show a similar periodicity. That observation not only links the variability of Saturn's electron belts to solar wind processes, but also indicates that the source of the observed periodicity in GCRs may be local. In this case GCR…
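
    Periodicities like the ∼13- and ∼26-day signals are typically extracted from unevenly sampled count-rate series with a periodogram. A sketch on synthetic data follows; the rates are stand-ins for LEMMS channel counts, and Lomb-Scargle is a conventional tool rather than the survey's stated method.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(7)

    # Synthetic, irregularly sampled daily rates with a 13-day modulation.
    t = np.sort(rng.choice(np.arange(400.0), size=250, replace=False))
    rate = 100.0 + 5.0 * np.sin(2 * np.pi * t / 13.0) + rng.normal(0, 1, t.size)

    periods = np.linspace(5.0, 40.0, 1000)              # candidate periods, days
    power = lombscargle(t, rate - rate.mean(), 2 * np.pi / periods)
    print("strongest period: %.1f days" % periods[power.argmax()])
    ```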

  15. Preliminary Sensitivity Study on Gas-Cooled Reactor for NHDD System Using MARS-GCR

    International Nuclear Information System (INIS)

    Lee, Seung Wook; Jeong, Jae Jun; Lee, Won Jae

    2005-01-01

    A Gas-Cooled Reactor (GCR) is considered one of the most promising options for massive hydrogen production without CO2 emission. To date, two types of GCR are regarded as viable nuclear reactors for hydrogen production: the Prismatic Modular Reactor (PMR) and the Pebble Bed Reactor (PBR). In this paper, a preliminary sensitivity study on the two types of GCR is carried out using MARS-GCR to find the effect on the peak fuel and reactor pressure vessel (RPV) temperatures of varying the reactor inlet temperature, outlet temperature, and system pressure for both the PMR and the PBR.

  16. Trehalose, glycogen and ethanol metabolism in the gcr1 mutant of Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Seker, Tamay; Hamamci, H.

    2003-01-01

    Since Gcr1p is pivotal in controlling the transcription of glycolytic enzymes and trehalose metabolism seems to be one of the control points of glycolysis, we examined trehalose and glycogen synthesis in response to a 2% glucose pulse during batch growth in a gcr1 (glucose regulation-1) mutant lacking a fully functional glycolytic pathway and in the wild-type strain. An increase in both trehalose and glycogen stores was observed 1 and 2 h after the pulse, followed by a steady decrease in both the wild-type and the gcr1 mutant. The accumulation was faster, while the subsequent degradation was slower, in gcr1 cells compared to wild-type ones. Although there was no distinct glucose consumption in the mutant cells, it seems that the glucose repression mechanism is similar in the gcr1 mutant and in the wild-type strain, at least with respect to trehalose and glycogen metabolism.

  17. Evaluation of abrasion of a drainage mixture modified with crushed rubber waste (GCR)

    Directory of Open Access Journals (Sweden)

    Yee Wan Yung Vargas

    2017-02-01

    Conclusion: The results showed a marked influence of the mixing temperature (between asphalt and GCR) and the compaction temperature (modified asphalt and aggregate) on the behavior of the MD modified with GCR.

  18. GCR flux 9-day variations with LISA Pathfinder

    International Nuclear Information System (INIS)

    Grimani, C; Benella, S; Fabi, M; Finetti, N; Telloni, D

    2017-01-01

    Galactic cosmic-ray (GCR) energy spectra in the heliosphere vary with the level of solar activity, the state of the solar polarity, and interplanetary transient magnetic structures of solar origin. A high counting rate particle detector (PD) aboard LISA Pathfinder (LPF) allows for the measurement of galactic cosmic-ray and solar energetic particle (SEP) integral fluxes at energies > 70 MeV n⁻¹, up to 6500 counts s⁻¹. Data are gathered with a sampling time of 15 s. A study of GCR flux depressions associated with the third harmonic of the Sun's rotation period (∼9 days) is presented here. (paper)

  19. Simulation of a gas cooled reactor with the system code CATHARE

    International Nuclear Information System (INIS)

    Bentivoglio, Fabrice; Ruby, Alain; Geffraye, Genevieve; Messie, Anne; Saez, Manuel; Tauveron, Nicolas; Widlund, Ola

    2006-01-01

    In recent years the CEA has commissioned a wide range of feasibility studies of future advanced nuclear reactors, in particular gas-cooled reactors (GCR). This paper presents an overview of the use of the thermohydraulics code CATHARE in these activities. Extensively validated and qualified for pressurized water reactors, CATHARE has been adapted to deal also with gas-cooled reactor applications. Rather than branching off a separate GCR version of CATHARE, new features have been integrated as independent options in the standard version of the code, respecting the same stringent procedures for documentation and maintenance. CATHARE has evolved into an efficient tool for GCR applications, with first results in good agreement with existing experimental data and other codes. The paper gives an example among the studies already carried out with CATHARE: the case of the Very High Temperature Reactor (VHTR) concepts. Current and future activities for experimental validation of CATHARE for GCR applications are also discussed. Short-term validation activities include the assessment of the German Oberhausen II facility. For the long term, CEA has initiated an ambitious experimental program ranging from small scale loops for physical correlations to component technology and system demonstration loops. (authors)

  20. NOAA Weather Radio - EAS Event Codes

    Science.gov (United States)


  1. Opening a Window on ICME-driven GCR Modulation in the Inner Solar System

    Science.gov (United States)

    Winslow, Reka M.; Schwadron, Nathan A.; Lugaz, Noé; Guo, Jingnan; Joyce, Colin J.; Jordan, Andrew P.; Wilson, Jody K.; Spence, Harlan E.; Lawrence, David J.; Wimmer-Schweingruber, Robert F.; Mays, M. Leila

    2018-04-01

    Interplanetary coronal mass ejections (ICMEs) often cause Forbush decreases (Fds) in the flux of galactic cosmic rays (GCRs). We investigate how a single ICME, launched from the Sun on 2014 February 12, affected GCR fluxes at Mercury, Earth, and Mars. We use GCR observations from MESSENGER at Mercury, ACE/LRO at the Earth/Moon, and MSL at Mars. We find that Fds are steeper and deeper closer to the Sun, and that the magnitude of the magnetic field in the ICME magnetic ejecta as well as the “strength” of the ICME sheath both play a large role in modulating the depth of the Fd. Based on our results, we hypothesize that (1) the Fd size decreases exponentially with heliocentric distance, and (2) two-step Fds are more common closer to the Sun. Both hypotheses will be directly verifiable by the upcoming Parker Solar Probe and Solar Orbiter missions. This investigation provides the first systematic study of the changes in GCR modulation as a function of distance from the Sun using nearly contemporaneous observations at Mercury, Earth/Moon, and Mars, which will be critical for validating our physical understanding of the modulation process throughout the heliosphere.

  2. The GCR2 gene family is not required for ABA control of seed germination and early seedling development in Arabidopsis.

    Directory of Open Access Journals (Sweden)

    Jianjun Guo

    BACKGROUND: The plant hormone abscisic acid (ABA) regulates diverse processes of plant growth and development. It has recently been proposed that GCR2 functions as a G-protein-coupled receptor (GPCR) for ABA. However, the structural relationships and functionality of GCR2 have been challenged by several independent studies. A central question in this controversy is whether gcr2 mutants are insensitive to ABA, because gcr2 mutants were shown to display reduced sensitivity to ABA under one experimental condition (e.g., 22°C, continuous white light at 150 μmol m⁻² s⁻¹) but wild-type sensitivity under another, slightly different condition (e.g., 23°C, 14/10 hr photoperiod at 120 μmol m⁻² s⁻¹). It has been hypothesized that gcr2 appears only weakly insensitive to ABA because two other GCR2-like genes in Arabidopsis, GCL1 and GCL2, compensate for the loss of function of GCR2. PRINCIPAL FINDINGS: In order to test this hypothesis, we isolated a putative loss-of-function allele of GCL2, and then generated all possible combinations of mutations in each member of the GCR2 gene family. We found that all double mutants, including gcr2 gcl1, gcr2 gcl2, and gcl1 gcl2, as well as the gcr2 gcl1 gcl2 triple mutant, displayed wild-type sensitivity to ABA in seed germination and early seedling development assays, demonstrating that the GCR2 gene family is not required for ABA responses in these processes. CONCLUSION: These results provide compelling genetic evidence that GCR2 is unlikely to act as a receptor for ABA in the context of either seed germination or early seedling development.

  3. Model for GCR-particle fluxes in stony meteorites and production rates of cosmogenic nuclides

    International Nuclear Information System (INIS)

    Reedy, R.C.

    1984-01-01

    A model is presented for the differential fluxes of galactic-cosmic-ray (GCR) particles with energies above 1 MeV inside any spherical stony meteorite as a function of the meteorite's radius and the sample's depth. This model is based on the Reedy-Arnold equations for the energy-dependent fluxes of GCR particles in the moon and is an extension of flux parameters that were derived for several meteorites of various sizes. This flux is used to calculate the production rates of many cosmogenic nuclides as a function of radius and depth. The peak production rates for most nuclides made by the reactions of energetic GCR particles occur near the centers of meteorites with radii of 40 to 70 g cm⁻². Although the model has some limitations, it reproduces well the basic trends for the depth-dependent production of cosmogenic nuclides in stony meteorites of various radii. These production profiles agree fairly well with measurements of cosmogenic nuclides in meteorites. Some of these production profiles differ from those calculated by others. The chemical dependence of the production rates for several nuclides varies with size and depth. 25 references, 8 figures

  4. GCR1, a transcriptional activator in Saccharomyces cerevisiae, complexes with RAP1 and can function without its DNA binding domain.

    Science.gov (United States)

    Tornow, J; Zeng, X; Gao, W; Santangelo, G M

    1993-01-01

    In Saccharomyces cerevisiae, efficient expression of glycolytic and translational component genes requires two DNA binding proteins, RAP1 (which binds to UASRPG) and GCR1 (which binds to the CT box). We generated deletions in GCR1 to test the validity of several different models for GCR1 function. We report here that the C-terminal half of GCR1, which includes the domain required for DNA binding to the CT box in vitro, can be removed without affecting GCR1-dependent transcription of either the glycolytic gene ADH1 or the translational component genes TEF1 and TEF2. We have also identified an activation domain within a segment of the GCR1 protein (the N-terminal third) that is essential for in vivo function. RAP1 and GCR1 can be co-immunoprecipitated from whole cell extracts, suggesting that they form a complex in vivo. The data are most consistent with a model in which GCR1 is attracted to DNA through contact with RAP1. PMID: 8508768

  5. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with a direct shear or double direct shear experimental setup, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45° is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified strictly from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle Flow Code, a discrete element method based numerical simulation code developed by Itasca.
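
    Distinguishing stick slip from slow slip in such runs starts from the stress history: abrupt stress drops mark slip events. A toy detector over a synthetic loading curve, with all values invented:

    ```python
    import numpy as np

    # Synthetic axial stress history (MPa per step): steady elastic loading
    # interrupted by sudden 2 MPa stress drops every 400 steps.
    steps = np.arange(2000)
    stress = 0.01 * steps
    for s in range(400, 2000, 400):
        stress[s:] -= 2.0

    drops = np.diff(stress)
    events = np.where(drops < -0.5)[0] + 1    # threshold picks out slip events
    print("slip events at steps:", events)
    print("stress drops (MPa):", [round(-drops[e - 1], 2) for e in events])
    ```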

  6. Simulations of GCR interactions within planetary bodies using GEANT4

    Science.gov (United States)

    Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.

    2017-12-01

    On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons, primarily through nuclear spallation within the top few meters of the surface. These neutrons undergo further nuclear interactions with elements near the planetary surface, and some will escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of the average atomic mass of the near-surface material and, in the epithermal and thermal energy ranges, is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code that has been widely used for this type of analysis is MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LPNE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used but open-source code, by benchmarking simulated predictions of the LPNE experiment against the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation then gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars, of relevance to a re-analysis of Mars Odyssey Neutron Spectrometer data. References: [1] D.B. Pelowitz, Los Alamos National Laboratory, LA-CP-05…

  7. openQ*D simulation code for QCD+QED

    Science.gov (United States)

    Campos, Isabel; Fritzsch, Patrick; Hansen, Martin; Krstić Marinković, Marina; Patella, Agostino; Ramos, Alberto; Tantalo, Nazario

    2018-03-01

    The openQ*D code for the simulation of QCD+QED with C* boundary conditions is presented. This code is based on openQCD-1.6, from which it inherits the core features that ensure its efficiency: the locally-deflated SAP-preconditioned GCR solver, the twisted-mass frequency splitting of the fermion action, the multilevel integrator, the 4th order OMF integrator, the SSE/AVX intrinsics, etc. The photon field is treated as fully dynamical and C* boundary conditions can be chosen in the spatial directions. We discuss the main features of openQ*D, and we show basic test results and performance analysis. An alpha version of this code is publicly available and can be downloaded from http://rcstar.web.cern.ch/.
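
    The "GCR solver" named here is the generalized conjugate residual Krylov method. Below is a bare-bones, unpreconditioned sketch of that algorithm; openQ*D's production solver additionally applies local deflation and SAP preconditioning to the lattice Dirac operator.

    ```python
    import numpy as np

    def gcr(A, b, tol=1e-10, max_iter=100):
        """Generalized conjugate residual: minimizes ||b - Ax|| over a
        growing Krylov space; works for general nonsymmetric A."""
        x = np.zeros_like(b)
        r = b - A @ x
        ps, Aps = [], []                      # search directions and their images
        for _ in range(max_iter):
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            p, Ap = r.copy(), A @ r
            for pj, Apj in zip(ps, Aps):      # orthogonalize Ap against history
                beta = np.vdot(Apj, Ap)
                p, Ap = p - beta * pj, Ap - beta * Apj
            nrm = np.linalg.norm(Ap)
            p, Ap = p / nrm, Ap / nrm
            alpha = np.vdot(Ap, r)
            x, r = x + alpha * p, r - alpha * Ap
            ps.append(p); Aps.append(Ap)
        return x

    A = np.array([[4.0, 1.0], [0.0, 3.0]])    # small nonsymmetric test system
    b = np.array([1.0, 2.0])
    print(gcr(A, b), np.linalg.solve(A, b))   # the two should agree
    ```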

  8. AUTOET code (a code for automatically constructing event trees and displaying subsystem interdependencies)

    International Nuclear Information System (INIS)

    Wilson, J.R.; Burdick, G.R.

    1977-06-01

    This is a user's manual for AUTOET I and II. AUTOET I is a computer code for automatic event tree construction. It is designed to incorporate and display subsystem interdependencies and common or key component dependencies in the event tree format. The code is written in FORTRAN IV for the CDC Cyber 76 using the Integrated Graphics System (IGS). AUTOET II incorporates consequence and risk calculations, in addition to some other refinements. 5 figures
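
    Automatic event tree construction enumerates branch outcomes over the ordered mitigating systems, carrying a frequency along each branch. A minimal sketch of the idea, not the AUTOET code; the initiating event and failure probabilities are invented.

    ```python
    from itertools import product

    initiator = ("LOSP", 1e-2)                # hypothetical initiating event, /yr
    systems = [("DieselGen", 1e-2),           # hypothetical p(failure) values
               ("AuxFeedwater", 1e-3)]

    for outcome in product((True, False), repeat=len(systems)):  # True = success
        freq, labels = initiator[1], []
        for (name, pfail), ok in zip(systems, outcome):
            freq *= (1 - pfail) if ok else pfail
            labels.append(name + ("-OK" if ok else "-FAIL"))
        print(initiator[0], "->", ", ".join(labels), "freq = %.2e /yr" % freq)
    ```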

  9. The UK MK III GCR experimental physics programme at AEE Winfrith

    Energy Technology Data Exchange (ETDEWEB)

    Johnstone, I

    1972-06-15

    The UK programme of reactor physics experiments in support of the Mk III GCR project started in 1968/69 and has now reached its third main phase. The overall programme is broadly summarised in this report.

  10. Sequence Coding and Search System for licensee event reports: code listings. Volume 2

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated on a continual basis. Volume 2 contains all valid and acceptable codes used for searching and encoding the LER data. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 2.

  11. Validation of the 3D finite element transport theory code EVENT for shielding applications

    International Nuclear Information System (INIS)

    Warner, Paul; Oliveira, R.E. de

    2000-01-01

    This paper is concerned with the validation of the 3D deterministic neutral-particle transport theory code EVENT for shielding applications. The code is based on the finite element-spherical harmonics (FE-PN) method, which has been extensively developed over the last decade. A general multi-group, anisotropic scattering formalism enables the code to address realistic steady state and time dependent, multi-dimensional coupled neutron/gamma radiation transport problems involving high scattering and deep penetration alike. The powerful geometrical flexibility and competitive computational effort makes the code an attractive tool for shielding applications. In recognition of this, EVENT is currently in the process of being adopted by the UK nuclear industry. The theory behind EVENT is described and its numerical implementation is outlined. Numerical results obtained by the code are compared with predictions of the Monte Carlo code MCBEND and also with the results from benchmark shielding experiments. In particular, results are presented for the ASPIS experimental configuration for both neutron and gamma ray calculations using the BUGLE 96 nuclear data library. (author)
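
    The FE-PN discretization expands the angular flux in spherical harmonics truncated at order N and treats the spatial coefficients with finite elements:

    ```latex
    % P_N expansion of the angular flux; the coefficients \psi_\ell^m(\mathbf{r})
    % are discretized on the finite element mesh:
    \psi(\mathbf{r},\boldsymbol{\Omega}) \;\approx\;
    \sum_{\ell=0}^{N} \sum_{m=-\ell}^{\ell}
    \psi_{\ell}^{m}(\mathbf{r}) \, Y_{\ell}^{m}(\boldsymbol{\Omega}) .
    ```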

  12. Unigenic Evolution: A Novel Genetic Method Localizes a Putative Leucine Zipper That Mediates Dimerization of the Saccharomyces Cerevisiae Regulator Gcr1p

    Science.gov (United States)

    Deminoff, S. J.; Tornow, J.; Santangelo, G. M.

    1995-01-01

    The GCR1 gene of Saccharomyces cerevisiae encodes a transcriptional activator that complexes with Rap1p and, through UAS(RPG) elements (Rap1p DNA binding sites), stimulates efficient expression of glycolytic and translational component genes. To map the functionally important domains in Gcr1p, we combined multiple rounds of random mutagenesis in vitro with in vivo selection of functional genes to locate conserved, or hypomutable, regions. We name this method unigenic evolution, a statistical analysis of mutations in evolutionary variants of a single gene in an otherwise isogenic background. Examination of the distribution of 315 mutations in 24 variant alleles allowed the localization of four hypomutable regions in GCR1 (A, B, C, and D). Dispensable N-terminal (intronic) and C-terminal portions of the evolved region of GCR1 were included in the analysis as controls and were, as expected, not hypomutable. The analysis of several insertion, deletion, and point mutations, combined with a comparison of the hypomutability and hydrophobicity plots of Gcr1p, suggested that some of the hypomutable regions may individually or in combination correspond to functionally important surface domains. In particular, we determined that region D contains a putative leucine zipper and is necessary and sufficient for Gcr1p homodimerization. PMID:8601472

  13. EDF's (Électricité de France) in-service control for GCR-type reactor vessels

    International Nuclear Information System (INIS)

    Douillet, M.G.

    1979-01-01

    This paper presents the performance of the data acquisition and processing systems developed by the French utility EDF for the in-service monitoring and testing of the mechanical properties (thermal stresses, deformations, cracks, ...) of prestressed concrete vessels of GCR-type reactors.

  14. Efficient transcription of the glycolytic gene ADH1 and three translational component genes requires the GCR1 product, which can act through TUF/GRF/RAP binding sites.

    OpenAIRE

    Santangelo, G M; Tornow, J

    1990-01-01

    Glycolytic gene expression in Saccharomyces cerevisiae is thought to be activated by the GCR and TUF proteins. We tested the hypothesis that GCR function is mediated by TUF/GRF/RAP binding sites (UASRPG elements). We found that UASRPG-dependent activation of a heterologous gene and transcription of ADH1, TEF1, TEF2, and RP59 were sensitive to GCR1 disruption. GCR is not required for TUF/GRF/RAP expression or in vitro DNA-binding activity.

  15. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    Science.gov (United States)

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

    German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise, linear (r² > 0.99 over 0.01-1384 fmol/μL), and sensitive. Using LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies.

  16. Efficient transcription of the glycolytic gene ADH1 and three translational component genes requires the GCR1 product, which can act through TUF/GRF/RAP binding sites.

    Science.gov (United States)

    Santangelo, G M; Tornow, J

    1990-01-01

    Glycolytic gene expression in Saccharomyces cerevisiae is thought to be activated by the GCR and TUF proteins. We tested the hypothesis that GCR function is mediated by TUF/GRF/RAP binding sites (UASRPG elements). We found that UASRPG-dependent activation of a heterologous gene and transcription of ADH1, TEF1, TEF2, and RP59 were sensitive to GCR1 disruption. GCR is not required for TUF/GRF/RAP expression or in vitro DNA-binding activity. PMID: 2405258

  17. Constitutive Modeling of the Flow Stress of GCr15 Continuous Casting Bloom in the Heavy Reduction Process

    Science.gov (United States)

    Ji, Cheng; Wang, Zilin; Wu, Chenhui; Zhu, Miaoyong

    2018-04-01

    According to the calculation results of a 3D thermomechanical-coupled finite-element (FE) model of GCr15 bearing steel bloom during a heavy reduction (HR) process, the variation ranges in the strain rate and strain under HR were described. In addition, the hot deformation behavior of the GCr15 bearing steel was studied over the temperature range from 1023 K to 1573 K (750 °C to 1300 °C) with strain rates of 0.001, 0.01, and 0.1 s⁻¹ in single-pass thermosimulation compression experiments. To ensure the accuracy of the constitutive model, the temperature range was divided into two intervals at the fully austenitic temperature of GCr15 steel [1173 K (900 °C)]. Two sets of material parameters for the constitutive model were derived from the true stress-strain curves of the two temperature intervals. A flow stress constitutive model was established using a revised Arrhenius-type constitutive equation, which considers the relationships among the material parameters and true strain. This equation describes dynamic softening during hot compression processes. Considering the effect of glide and climb on the deformation mechanism, the Arrhenius-type constitutive equation was modified by a physically based approach. This model is the most accurate over temperatures ranging from 1173 K to 1573 K (900 °C to 1300 °C) under HR deformation conditions (ignoring the range from 1273 K to 1573 K (1000 °C to 1300 °C) with a strain rate of 0.1 s⁻¹). To ensure the convergence of the FE calculation, an approximate method was used to estimate the flow stress at temperatures greater than 1573 K (1300 °C).
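
    For reference, the sine-hyperbolic Arrhenius constitutive form and the Zener-Hollomon parameter on which such models are built are shown below; the revised model in the paper additionally makes the material constants functions of true strain.

    ```latex
    % Arrhenius-type flow stress model: A, \alpha, n are material constants,
    % Q the deformation activation energy, R the gas constant:
    \dot{\varepsilon} = A\,[\sinh(\alpha\sigma)]^{n}
        \exp\!\left(-\frac{Q}{RT}\right),
    \qquad
    Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right),
    \qquad
    \sigma = \frac{1}{\alpha}\,
        \ln\!\left\{\left(\frac{Z}{A}\right)^{1/n}
        + \left[\left(\frac{Z}{A}\right)^{2/n} + 1\right]^{1/2}\right\}.
    ```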

  18. Challenges in coding adverse events in clinical trials: a systematic review.

    Directory of Open Access Journals (Sweden)

    Jeppe Bennekou Schroll

    BACKGROUND: Misclassification of adverse events in clinical trials can sometimes have serious consequences. Therefore, each of the many steps involved, from a patient's adverse experience to presentation in tables in publications, should be as standardised as possible, minimising the scope for interpretation. Adverse events are categorised using a predefined dictionary, e.g. MedDRA, which is updated biannually with many new categories. The objective of this paper is to study interobserver variation and other challenges of coding. METHODS: Systematic review using PRISMA. We searched PubMed, EMBASE and The Cochrane Library. All studies were screened for eligibility by two authors. RESULTS: Our search returned 520 unique studies, of which 12 were included. Only one study investigated interobserver variation. It reported that 12% of the codes were evaluated differently by two coders. Independent physicians found that 8% of all the codes deviated from the original description. Other studies found that product summaries could be greatly affected by the choice of dictionary. With the introduction of MedDRA, it seems to have become harder to identify adverse events statistically because each code is divided into subgroups. To account for this, lumping techniques have been developed but are rarely used, and guidance on when to use them is vague. An additional challenge is that adverse events are censored if they already occurred in the run-in period of a trial. As there are more than 26 ways of determining whether an event has already occurred, this can lead to bias, particularly because data analysis is rarely performed blindly. CONCLUSION: There is a lack of evidence that coding of adverse events is a reliable, unbiased and reproducible process. The increase in categories has made detecting adverse events harder, potentially compromising safety. It is crucial that readers of medical publications are aware of these challenges. Comprehensive interobserver…

  19. Elemental GCR Observations during the 2009-2010 Solar Minimum Period

    Science.gov (United States)

    Lave, K. A.; Israel, M. H.; Binns, W. R.; Christian, E. R.; Cummings, A. C.; Davis, A. J.; deNolfo, G. A.; Leske, R. A.; Mewaldt, R. A.; Stone, E. C.; et al.

    2013-01-01

    Using observations from the Cosmic Ray Isotope Spectrometer (CRIS) onboard the Advanced Composition Explorer (ACE), we present new measurements of the galactic cosmic ray (GCR) elemental composition and energy spectra for the species B through Ni in the energy range approx. 50-550 MeV/nucleon during the record-setting 2009-2010 solar minimum period. These data are compared with our observations from the 1997-1998 solar minimum period, when solar modulation in the heliosphere was somewhat higher. For these species, we find that the intensities during the 2009-2010 solar minimum were approx. 20% higher than those in the previous solar minimum, and in fact were the highest GCR intensities recorded during the space age. Relative abundances for these species during the two solar minimum periods differed by small but statistically significant amounts, which are attributed to the combination of spectral shape differences between primary and secondary GCRs in the interstellar medium and differences between the levels of solar modulation in the two solar minima. We also present the secondary-to-primary ratios B/C and (Sc+Ti+V)/Fe for both solar minimum periods, and demonstrate that these ratios are reasonably well fit by a simple "leaky-box" galactic transport model that is combined with a spherically symmetric solar modulation model.
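
    The "leaky-box" transport model balances sources, escape, and spallation losses; in steady state, the secondary-to-primary ratios quoted above pin down the escape path length:

    ```latex
    % Steady-state leaky-box equation (path lengths \lambda in g cm^{-2});
    % a pure-secondary ratio such as B/C then scales with \lambda_esc:
    \frac{N_i}{\lambda_{\mathrm{esc}}} + \frac{N_i}{\lambda_i}
      = Q_i + \sum_{j} \frac{N_j}{\lambda_{j \to i}},
    \qquad
    \frac{N_{\mathrm{B}}}{N_{\mathrm{C}}} \approx
      \frac{\lambda_{\mathrm{esc}}}{\lambda_{\mathrm{C \to B}}}
      \quad (\lambda_{\mathrm{esc}} \ll \lambda_{\mathrm{B}}) .
    ```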

  20. Using machine-coded event data for the micro-level study of political violence

    Directory of Open Access Journals (Sweden)

    Jesse Hammond

    2014-07-01

    Machine-coded datasets likely represent the future of event data analysis. We assess the use of one of these datasets—the Global Database of Events, Language and Tone (GDELT)—for the micro-level study of political violence by comparing it to two hand-coded conflict event datasets. Our findings indicate that GDELT should be used with caution for geo-spatial analyses at the subnational level: its overall correlation with hand-coded data is mediocre, and at the local level major issues of geographic bias exist in how events are reported. Overall, our findings suggest that due to these issues, researchers studying local conflict processes may want to wait for a more reliable geocoding method before relying too heavily on this set of machine-coded data.
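
    The dataset comparison described above reduces to correlating gridded event counts cell by cell. A sketch with random stand-in data; Pearson correlation is one reasonable metric, and the paper's exact procedure may differ.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)

    # Stand-ins: hand-coded and machine-coded event counts per grid cell.
    hand = rng.poisson(2.0, size=(20, 20))
    machine = hand + rng.poisson(1.0, size=(20, 20))   # noisy, biased recoding

    r, p = pearsonr(hand.ravel(), machine.ravel())
    print("grid-cell correlation r = %.2f (p = %.1e)" % (r, p))
    ```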

  1. Transient Cosmic-ray Events beyond the Heliopause: Interpreting Voyager-1 Observations

    Energy Technology Data Exchange (ETDEWEB)

    Kóta, J.; Jokipii, J. R. [Lunar and Planetary Laboratory, University of Arizona, Tucson, AZ 85721-0092 (United States)

    2017-04-20

    In 2013 March and 2014 May, Voyager-1 (V1) experienced small but significant increases in the flux of galactic cosmic rays (GCRs) in the hundred MeV/n range. V1 also saw episodic depletion of GCR flux around perpendicular pitch angles. We discuss the pitch-angle distribution and the time profiles of these events. In a previous paper, we interpreted the 2013 “bump” as the GCRs remotely sensing a shock that reached the magnetic field line passing through V1: particles gained energy as they were reflected on the approaching region of the stronger magnetic field of the disturbance. Here, we point out that energy gain is not restricted to reflected particles—GCRs passing through the disturbance also gain energy. The effect should be present in a broad range of pitch angles, with the maximum increase of GCR intensity predicted to occur at the critical reflection angle. In this paper, the shock is not step-like, but a gradual increase of the magnetic field strength, B, taking a few days, in agreement with V1 measurements. This smoothens the profile of the predicted bump in the GCR flux. We also address the linear episodic decreases seen around perpendicular pitch angles. These events are interpreted in terms of adiabatic cooling behind the shock due to the slow weakening of B. We present simple numerical model calculations and find that a gradual shock followed by a slow decrease of B, as observed, may account for both the episodic increases and the anisotropic depletion of GCR fluxes.
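
    The reflection argument rests on the magnetic-mirror condition that follows from conservation of the particle's magnetic moment:

    ```latex
    % A particle with pitch angle \theta in field B_1 is reflected by a
    % disturbance of peak field B_2 when
    \sin^{2}\theta \;\ge\; \frac{B_1}{B_2},
    \qquad
    \theta_c = \arcsin\sqrt{B_1/B_2},
    % with the largest predicted intensity increase at the critical angle \theta_c.
    ```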

  2. Identifying Adverse Events Using International Classification of Diseases, Tenth Revision Y Codes in Korea: A Cross-sectional Study

    Directory of Open Access Journals (Sweden)

    Minsu Ock

    2018-01-01

    Objectives: The use of administrative data is an affordable alternative to conducting a difficult large-scale medical-record review to estimate the scale of adverse events. We identified adverse events from 2002 to 2013 on the national level in Korea, using International Classification of Diseases, tenth revision (ICD-10) Y codes. Methods: We used data from the National Health Insurance Service-National Sample Cohort (NHIS-NSC). We relied on medical treatment databases to extract information on ICD-10 Y codes from each participant in the NHIS-NSC. We classified adverse events in the ICD-10 Y codes into 6 types: those related to drugs, transfusions, and fluids; those related to vaccines and immunoglobulin; those related to surgery and procedures; those related to infections; those related to devices; and others. Results: Over 12 years, a total of 20 817 adverse events were identified using ICD-10 Y codes, and the estimated total adverse event rate was 0.20%. Between 2002 and 2013, the total number of such events increased by 131.3%, from 1366 in 2002 to 3159 in 2013, and the total rate increased by 103.9%, from 0.17% in 2002 to 0.35% in 2013. Events related to drugs, transfusions, and fluids were the most common (19 446; 93.4%), followed by those related to surgery and procedures (1209; 5.8%) and those related to vaccines and immunoglobulin (72; 0.3%). Conclusions: Based on a comparison with the results of other studies, the total adverse event rate in this study was significantly underestimated. Improving coding practices for ICD-10 Y codes is necessary to precisely monitor the scale of adverse events in Korea.

  3. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    International Nuclear Information System (INIS)

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid comprising a zero-dimensional plasma dynamics model and a one-dimensional thermal analysis of in-vessel components. The code evaluates the time evolution of plasma parameters and the temperature distributions of in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model that takes into account updated data for safety assessment. For example, physics safety guidelines for the beta limit, density limit, and H-L mode confinement transition threshold power, etc., are provided in the model. The in-vessel components are divided into twenty temperature regions in the poloidal direction, taking account of radiative heat transfer between the surfaces of the regions. The code can also describe coolant behavior during hydraulic accidents, using results from a hydraulics code, and can treat vaporization (sublimation) from plasma-facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs using a transport probability and a time delay. Quantitative analysis based on the model is possible for a plasma passive shutdown scenario. We examined the possibility of using the code for safety analysis of plasma anomaly events in fusion reactors and concluded that it is a promising contribution to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)
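    A minimal sketch of the hybrid idea (a zero-dimensional plasma energy balance coupled to a lumped wall temperature node), using invented parameters rather than SAFALY's models:

        # Explicit Euler integration of a 0-D plasma energy balance coupled to one
        # lumped wall node.  All parameters are illustrative placeholders.
        P_heat = 100e6   # heating power (W)
        tau_E = 3.0      # energy confinement time (s)
        C_wall = 5e7     # lumped wall heat capacity (J/K)
        h_sink = 2e5     # heat removal coefficient to coolant (W/K)
        T_cool = 400.0   # coolant temperature (K)

        W, T_wall, dt = 0.0, 500.0, 0.01
        for _ in range(int(10.0 / dt)):       # integrate 10 s
            P_loss = W / tau_E                # power flowing from plasma to the wall
            W += dt * (P_heat - P_loss)       # plasma stored energy
            T_wall += dt * (P_loss - h_sink * (T_wall - T_cool)) / C_wall
        print(f"W = {W/1e6:.0f} MJ, T_wall = {T_wall:.0f} K")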

  4. Criteria for confirming sequence periodicity identified by Fourier transform analysis: application to GCR2, a candidate plant GPCR?

    Science.gov (United States)

    Illingworth, Christopher J R; Parkes, Kevin E; Snell, Christopher R; Mullineaux, Philip M; Reynolds, Christopher A

    2008-03-01

Methods to determine periodicity in protein sequences are useful for inferring function. Fourier transformation is one approach, but care is required to ensure the periodicity is genuine. Here we have shown that empirically derived statistical tables can be used as a measure of significance. Genuine protein sequence data, rather than randomly generated sequences, were used as the statistical backdrop. The method has been applied to G-protein coupled receptor (GPCR) sequences, by Fourier transformation of hydrophobicity values, codon frequencies, and the extent of over-representation of codon pairs, the latter being related to translational step times. Genuine periodicity was observed in the hydrophobicity, whereas the apparent periodicity (as inferred from previously reported measures) in the translation step times was not validated statistically. GCR2 has recently been proposed as the plant GPCR receptor for the hormone abscisic acid. It has homology to the lanthionine synthetase C-like family of proteins, an observation confirmed by fold recognition. Application of the Fourier transform algorithm to the GCR2 family revealed strongly predicted seven-fold periodicity in hydrophobicity, suggesting why GCR2 has been reported to be a GPCR despite negative indications in most transmembrane prediction algorithms. The underlying multiple sequence alignment, also required for the Fourier transform analysis of periodicity, indicated that the hydrophobic regions around the 7 GXXG motifs commence near the C-terminal end of each of the 7 inner helices of the alpha-toroid and continue to the N-terminal region of the helix. The results clearly explain why GCR2 has been understandably but erroneously predicted to be a GPCR.
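    A minimal sketch of the Fourier approach, assuming the standard Kyte-Doolittle hydrophobicity values and a made-up sequence; the paper's empirical significance test against genuine sequence data is omitted:

        import numpy as np

        # Kyte-Doolittle hydrophobicity scale.
        KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
              'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
              'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
              'Y': -1.3, 'V': 4.2}

        seq = "MKTLLVAGLLLAVSACSSEPKRLF" * 7           # made-up repetitive sequence
        h = np.array([KD[a] for a in seq])
        h -= h.mean()                                  # remove the DC component

        power = np.abs(np.fft.rfft(h))**2              # periodogram
        freqs = np.fft.rfftfreq(len(h))                # cycles per residue
        peak = 1 + np.argmax(power[1:])                # skip the DC term
        print(f"dominant period: {1/freqs[peak]:.1f} residues")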

  5. GCR flux reconstruction during the last three centuries validated by the Ti-44 in meteorites and Be-10 in ice

    Science.gov (United States)

    Cini Castagnoli, G.; Cane, D.; Taricco, C.; Bhandari, N.

    2003-04-01

In a previous work [1] we deduced that during the prolonged minima of solar activity since 1700, the galactic cosmic ray (GCR) flux was much higher (about 2 times) than what can be inferred from the GCR modulation deduced solely from the sunspot number series. This flux was also higher than what has been observed in recent decades by neutron monitors and by balloon- and spacecraft-borne detectors, as confirmed by the three fresh-fall meteorites that we measured during solar cycle 22. Recently we deduced the GCR annual mean spectra for the last 300 years [2], starting from the open solar magnetic flux proposed by Solanki et al. [3]. Using this GCR flux we calculated the 44Ti (T1/2 = 59.2 y) activity in meteorites, taking into account the cross sections for its production from the main target elements Fe and Ni. We compare the calculated activity with our measurements of cosmogenic 44Ti in different chondrites that fell in the period 1810-1997. The results are in close agreement both in phase and in amplitude. The same procedure was adopted for calculating the production rate of 10Be in the atmosphere. Normalizing to the concentration in ice over solar cycles 20 and 21, we obtain good agreement with the 10Be profile in the Dye3 core [4]. These results demonstrate that our inference of the GCR flux over the past 300 years is reliable. [1] Bonino G., Cini Castagnoli G., Bhandari N., Taricco C., Science, 270, 1648, 1995. [2] Bonino G., Cini Castagnoli G., Cane D., Taricco C. and Bhandari N., Proc. XXVII Intern. Cosmic Ray Conf. (Hamburg, 2001) 3769-3772. [3] Solanki S.K., Schüssler M. and Fligge M., Nature, 408, 445, 2000. [4] Beer J. et al., private communication.
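    The bookkeeping behind such a comparison can be sketched as a decay-weighted sum over an annual production series; the production values below are random placeholders, not the reconstructed flux:

        import numpy as np

        # Activity at the fall date = production integrated with exponential decay.
        lam = np.log(2) / 59.2                        # 44Ti decay constant (1/yr)
        years = np.arange(1700, 1998)                 # annual production steps
        prod = 1.0 + 0.3 * np.random.default_rng(0).standard_normal(years.size)
        # 'prod' stands in for the reconstructed flux folded with Fe/Ni cross sections

        t_fall = 1997
        n_atoms = np.sum(prod * np.exp(-lam * (t_fall - years)))  # surviving atoms
        print(f"44Ti activity at fall (arbitrary units): {lam * n_atoms:.2f}")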

  6. Energetic particles in the heliosphere and GCR modulation: Reviewing of SH-posters

    International Nuclear Information System (INIS)

    Struminsky, Alexei

    2013-01-01

    This rapporteur paper addresses the SH poster session titled 'Energetic particles in the heliosphere (solar and anomalous CRs, GCR modulation)' of the 23rd European Cosmic Ray Symposium (ECRS) and the 32nd Russian Cosmic Ray Conference (RCRC). The 65 posters presented are tentatively divided into five sections: Instruments and Methods; Solar Energetic Particles; Short Term Variations; Long Term Variations; Heliosphere.

7. Model for spatial synthesis of automated control system of the GCR type reactor

    Energy Technology Data Exchange (ETDEWEB)

Lazarevic, B; Matausek, M [Institut za nuklearne nauke 'Boris Kidric', Vinca, Belgrade (Yugoslavia)

    1966-07-01

This paper describes a model developed for the synthesis of the spatial distribution of automated control elements in the reactor. It represents a general, reliable mathematical model for analyzing transient states and for the synthesis of automated control and regulation systems of GCR type reactors. A one-dimensional system was defined under the assumption that the time dependence of the parameters of the neutron diffusion equation is identical over the total volume of the reactor and that the spatial distribution of neutrons is time independent. It is shown that this assumption is satisfactory for short-term variations, which are the ones relevant for safety analysis.
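    When the flux separates into a fixed spatial shape and a time amplitude, as assumed above, the amplitude obeys the familiar point-kinetics equations; a sketch with one delayed-neutron group and illustrative constants (not values from the paper):

        # One-group point kinetics, explicit Euler; constants are illustrative.
        beta, Lam, lam = 0.0065, 1e-3, 0.08   # delayed fraction, gen. time (s), 1/s
        rho = 0.001                           # step reactivity insertion (dk/k)

        n, C = 1.0, beta / (Lam * lam)        # steady-state initial conditions
        dt = 1e-4
        for _ in range(int(5.0 / dt)):        # integrate 5 s
            dn = ((rho - beta) / Lam) * n + lam * C
            dC = (beta / Lam) * n - lam * C
            n, C = n + dt * dn, C + dt * dC
        print(f"relative power after 5 s: {n:.2f}")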

  8. In vitro manganese-dependent cross-talk between Streptococcus mutans VicK and GcrR: implications for overlapping stress response pathways.

    Directory of Open Access Journals (Sweden)

    Jennifer S Downey

Full Text Available Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress, and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, here we demonstrate for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has previously been correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR, and SloR, and show that the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities of the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways.

  9. A 3D Monte Carlo model of radiation affecting cells, and its application to neuronal cells and GCR irradiation

    Science.gov (United States)

    Ponomarev, Artem; Sundaresan, Alamelu; Kim, Angela; Vazquez, Marcelo E.; Guida, Peter; Kim, Myung-Hee; Cucinotta, Francis A.

A 3D Monte Carlo model of radiation transport in matter is applied to study the effect of heavy ion radiation on human neuronal cells. Central nervous system effects, including cognitive impairment, are suspected from the heavy ion component of galactic cosmic radiation (GCR) during space missions. The model can count, for instance, the number of direct hits from ions, which have the largest effect on the cells. For comparison, remote hits, which are received through δ-rays from projectiles traversing space outside the volume of the cell, are also simulated and their contribution is estimated. To simulate tissue effects from irradiation, cellular matrices of neuronal cells, derived from confocal microscopy, were simulated in our model. To produce this realistic model of brain tissue, image segmentation was used to identify cells in images of cell cultures. The segmented cells were inserted pixel by pixel into the modeled physical space, which represents a volume of interacting cells with periodic boundary conditions (PBCs). PBCs were used to extrapolate the model results to macroscopic tissue structures. Specific spatial patterns of cell apoptosis are expected from GCR, as heavy ions produce concentrated damage along their trajectories. The apoptotic cell patterns were modeled based on the action cross sections for apoptosis, which were estimated from the available experimental data. The cell patterns were characterized with an autocorrelation function, whose values are higher for non-random cell patterns, and the values of the autocorrelation function were compared for X-ray and Fe ion irradiations. The autocorrelation function indicates the directionality effects present in apoptotic neuronal cells from GCR.
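    A sketch of the pattern statistic described above: the autocorrelation of a binary apoptosis mask computed via the Wiener-Khinchin theorem, using a synthetic "track" rather than segmented microscopy data:

        import numpy as np

        rng = np.random.default_rng(1)
        mask = (rng.random((64, 64)) < 0.02).astype(float)  # random background
        mask[32, 10:54] = 1.0                               # hits along a "track"

        f = mask - mask.mean()
        acf = np.fft.ifft2(np.abs(np.fft.fft2(f))**2).real  # circular autocorrelation
        acf /= acf[0, 0]                                    # zero-lag normalized to 1

        # a track-like pattern keeps high correlation along the track direction only
        print(f"along track: {acf[0, 1]:.2f}, across track: {acf[1, 0]:.2f}")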

10. Comparison of CREME (cosmic-ray effects on microelectronics) model LET (linear energy transfer) spectra with spaceflight dosimetry data

    Energy Technology Data Exchange (ETDEWEB)

    Letaw, J.R.; Adams, J.H.

    1986-07-15

The galactic cosmic radiation (GCR) component of space radiation is the dominant cause of single-event phenomena in microelectronic circuits when Earth's magnetic shielding is low. Spaceflights outside the magnetosphere and in high-inclination orbits are examples of such circumstances. In high-inclination orbits, low-energy (high-LET) particles are transmitted through the field only at extreme latitudes, but can dominate the orbit-averaged dose. GCR is an important part of the radiation dose to astronauts under the same conditions. As a test of the CREME environmental model and the particle transport codes used to estimate single-event upsets, existing measurements of HZE doses were compiled for missions where GCR is expected to be important: Apollo 16 and 17, Skylab, the Apollo-Soyuz Test Project, and Kosmos 782. The LET spectra due to direct ionization from GCR have been estimated for each of these missions. The resulting comparisons with data validate the CREME model predictions of high-LET galactic cosmic-ray fluxes to within a factor of two. Some systematic differences between the model and the data are identified.
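    For a single LET bin in water, absorbed dose follows from fluence and LET by a unit conversion; a toy two-bin example with invented numbers (not the mission spectra used in the paper):

        # D[Gy] = 1.602e-9 * LET[keV/um] * fluence[1/cm^2] for water.
        let_kev_um = [10.0, 100.0]       # LET bin centers (keV/um); invented
        fluence_cm2 = [1.0e6, 2.0e4]     # fluence per bin (1/cm^2); invented

        dose_gy = sum(1.602e-9 * L * F for L, F in zip(let_kev_um, fluence_cm2))
        print(f"absorbed dose: {1e3 * dose_gy:.1f} mGy")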

  11. Wind Farm Grid Integration Using VSC Based HVDC Transmission - An Overview

    DEFF Research Database (Denmark)

    Chaudhary, Sanjay Kumar; Teodorescu, Remus; Rodriguez, Pedro

    2008-01-01

The paper gives an overview of HVAC and HVDC connection of wind farms to the grid, with an emphasis on Voltage Source Converter (VSC)-based HVDC for large wind farms requiring long-distance cable connection. The flexible control capabilities of a VSC-based HVDC system enable smooth integration of the wind farm into the power grid network while meeting the Grid Code Requirements (GCR). Operation of a wind farm with a VSC-based HVDC connection is described.

  12. Licensee Event Report sequence coding and search procedure workshop

    International Nuclear Information System (INIS)

    Cottrell, W.B.; Gallaher, R.B.

    1981-01-01

    Since mid-1980, the Office for Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC) has been developing procedures for the systematic review and analysis of Licensee Event Reports (LERs). These procedures generally address several areas of concern, including identification of significant trends and patterns, event sequence of occurrences, component failures, and system and plant effects. The AEOD and NSIC conducted a workshop on the new coding procedure at the American Museum of Science and Energy in Oak Ridge, TN, on November 24, 1980

13. A model for quantity estimation for multi-coded team events

    African Journals Online (AJOL)

Participation in multi-coded sports events often involves travel to international ... Medication use by Team South Africa during the XXVIIIth Olympiad: a model ... individual sports included in the programme (e.g. athletes involved in contact sports ...

  14. Lexicons, contexts, events, and images: commentary on Elman (2009) from the perspective of dual coding theory.

    Science.gov (United States)

    Paivio, Allan; Sadoski, Mark

    2011-01-01

Elman (2009) proposed that the traditional role of the mental lexicon in language processing can largely be replaced by a theoretical model of schematic event knowledge founded on dynamic, context-dependent variables. We evaluate Elman's approach and propose an alternative view based on dual coding theory, supported by evidence that modality-specific cognitive representations contribute strongly to word meaning and language performance across diverse contexts, with effects predictable from dual coding theory.

  15. Experimental data bases useful for quantification of model uncertainties in best estimate codes

    International Nuclear Information System (INIS)

    Wilson, G.E.; Katsma, K.R.; Jacobson, J.L.; Boodry, K.S.

    1988-01-01

A data base is necessary for assessment of thermal-hydraulic codes within the context of the new NRC ECCS Rule. Separate-effects tests examine particular phenomena and may be used to develop and/or verify models and constitutive relationships in a code. Integral tests are used to demonstrate the capability of codes to model the global characteristics and sequence of events for real or hypothetical transients. The nuclear industry has developed a large experimental data base of fundamental nuclear, thermal-hydraulic phenomena for code validation. Given a particular scenario, and recognizing the scenario's important phenomena, selected information from this data base may be used to demonstrate the applicability of a particular code to simulate the scenario and to determine code model uncertainties. LBLOCA experimental data bases useful to this objective are identified in this paper.

16. Simulation of the GCR spectrum in the Mars Curiosity rover's RAD detector using MCNP6

    Science.gov (United States)

    Ratliff, Hunter N.; Smith, Michael B. R.; Heilbronn, Lawrence

    2017-08-01

The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface and a comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by the Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the GCR spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios, while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 μGy/day while RAD measured 233 μGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 μSv/day while RAD reported 710 μSv/day.
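    Simple arithmetic on the reported rates shows where the discrepancy sits: the implied average quality factor (dose equivalent over absorbed dose) differs by roughly a factor of two between simulation and measurement:

        # Average quality factor = dose equivalent / absorbed dose.
        mcnp6_dose, mcnp6_de = 307.3, 473.1   # uGy/day, uSv/day (simulation)
        rad_dose, rad_de = 233.0, 710.0       # uGy/day, uSv/day (measurement)
        print(f"MCNP6 mean quality factor: {mcnp6_de / mcnp6_dose:.2f}")  # ~1.5
        print(f"RAD mean quality factor:   {rad_de / rad_dose:.2f}")      # ~3.0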

  17. TRANSIENT GALACTIC COSMIC-RAY MODULATION DURING SOLAR CYCLE 24: A COMPARATIVE STUDY OF TWO PROMINENT FORBUSH DECREASE EVENTS

    International Nuclear Information System (INIS)

    Zhao, L.-L.; Zhang, H.

    2016-01-01

Forbush decrease (FD) events are of great interest for transient galactic cosmic-ray (GCR) modulation study. In this study, we perform a comparative analysis of two prominent Forbush events during cycle 24, occurring on 2012 March 8 (Event 1) and 2015 June 22 (Event 2), utilizing measurements from the worldwide neutron monitor (NM) network. Despite their comparable magnitudes, the two Forbush events are distinctly different in terms of the evolving GCR energy spectrum and the energy dependence of the recovery time. The recovery time of Event 1 is strongly dependent on the median energy, compared to the nearly constant recovery time of Event 2 over the studied energy range. Additionally, while the evolutions of the energy spectra during the two FD events exhibit similar variation patterns, the spectrum of Event 2 is significantly harder, especially at the time of deepest depression. These differences are essentially related to their associated solar wind disturbances. Event 1 is associated with a complicated shock-associated interplanetary coronal mass ejection (ICME) disturbance with large radial extent, probably formed by the merging of multiple shocks and transient flows, and which delivered a glancing blow to Earth. Conversely, Event 2 is accompanied by a relatively simple halo ICME with small radial extent that hit Earth more head-on.
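    A sketch of how an energy-dependent recovery time is typically extracted: fit an exponential recovery to neutron-monitor count rates; the data below are synthetic, not the events analyzed in the paper (assumes scipy is available):

        import numpy as np
        from scipy.optimize import curve_fit

        def recovery(t, drop, tau):                 # % of the pre-decrease rate
            return 100.0 * (1.0 - drop * np.exp(-t / tau))

        t = np.arange(0.0, 15.0, 0.25)              # days after the FD minimum
        data = recovery(t, 0.08, 4.0) + np.random.default_rng(2).normal(0.0, 0.2, t.size)

        (drop, tau), _ = curve_fit(recovery, t, data, p0=(0.05, 2.0))
        print(f"fitted drop: {100*drop:.1f} %, recovery time: {tau:.1f} days")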

  18. TRANSIENT GALACTIC COSMIC-RAY MODULATION DURING SOLAR CYCLE 24: A COMPARATIVE STUDY OF TWO PROMINENT FORBUSH DECREASE EVENTS

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, L.-L.; Zhang, H., E-mail: zhaolingling@ucas.edu.cn [Key Laboratory of Computational Geodynamics, University of Chinese Academy of Sciences, Beijing 100049 (China)

    2016-08-10

Forbush decrease (FD) events are of great interest for transient galactic cosmic-ray (GCR) modulation study. In this study, we perform a comparative analysis of two prominent Forbush events during cycle 24, occurring on 2012 March 8 (Event 1) and 2015 June 22 (Event 2), utilizing measurements from the worldwide neutron monitor (NM) network. Despite their comparable magnitudes, the two Forbush events are distinctly different in terms of the evolving GCR energy spectrum and the energy dependence of the recovery time. The recovery time of Event 1 is strongly dependent on the median energy, compared to the nearly constant recovery time of Event 2 over the studied energy range. Additionally, while the evolutions of the energy spectra during the two FD events exhibit similar variation patterns, the spectrum of Event 2 is significantly harder, especially at the time of deepest depression. These differences are essentially related to their associated solar wind disturbances. Event 1 is associated with a complicated shock-associated interplanetary coronal mass ejection (ICME) disturbance with large radial extent, probably formed by the merging of multiple shocks and transient flows, and which delivered a glancing blow to Earth. Conversely, Event 2 is accompanied by a relatively simple halo ICME with small radial extent that hit Earth more head-on.

  19. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    International Nuclear Information System (INIS)

    1995-01-01

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available to August, 1978, which is judged to be relevant to studies for the WIPP. The Geological Characterization Report for the WIPP site is neither a preliminary safety analysis report nor an environmental impact statement; these documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The Geological Characterization Report of the WIPP site is a unique document and at this time is not required by regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, status of studies, and the techniques employed during geological characterization

  20. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-31

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available to August, 1978, which is judged to be relevant to studies for the WIPP. The Geological Characterization Report for the WIPP site is neither a preliminary safety analysis report nor an environmental impact statement; these documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The Geological Characterization Report of the WIPP site is a unique document and at this time is not required by regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, status of studies, and the techniques employed during geological characterization.

  1. Model-Based Speech Signal Coding Using Optimized Temporal Decomposition for Storage and Broadcasting Applications

    Science.gov (United States)

    Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret

    2003-12-01

A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech, and its application to low-rate speech coding in storage and broadcasting, is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance of the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that improved TD model accuracy can be achieved. A methodology for incorporating the optimized TD algorithm within the standard MELP speech coder for efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
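    A stripped-down illustration of the optimization strategy: dynamic programming for globally optimal piecewise-constant segmentation of a 1-D parameter track. The real TD model uses overlapping event functions, so this is only the DP skeleton under simplified assumptions:

        import numpy as np

        def optimal_segments(x, k):
            """Boundaries of k segments minimizing piecewise-constant SSE."""
            n = len(x)
            c1 = np.insert(np.cumsum(x), 0, 0.0)         # prefix sums
            c2 = np.insert(np.cumsum(x * x), 0, 0.0)
            def sse(i, j):                               # cost of one segment x[i:j]
                s = c1[j] - c1[i]
                return (c2[j] - c2[i]) - s * s / (j - i)
            cost = np.full((k + 1, n + 1), np.inf); cost[0, 0] = 0.0
            back = np.zeros((k + 1, n + 1), dtype=int)
            for seg in range(1, k + 1):
                for j in range(seg, n + 1):
                    for i in range(seg - 1, j):          # best split point
                        c = cost[seg - 1, i] + sse(i, j)
                        if c < cost[seg, j]:
                            cost[seg, j], back[seg, j] = c, i
            bounds, j = [], n
            for seg in range(k, 0, -1):
                bounds.append(j); j = back[seg, j]
            return bounds[::-1]

        x = np.r_[np.zeros(20), 3.0 * np.ones(15), np.ones(25)]
        x += 0.05 * np.random.default_rng(3).standard_normal(x.size)
        print(optimal_segments(x, 3))                    # ~[20, 35, 60]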

  2. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
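    For the textbook linear limit state g = R - S with independent normal variables, the link between the reliability index and the failure probability is closed-form; the numbers below are illustrative, not the paper's calibration examples:

        from math import sqrt
        from statistics import NormalDist

        mu_R, sigma_R = 15.0, 1.5    # resistance mean / std (illustrative)
        mu_S, sigma_S = 8.0, 1.0     # load effect mean / std (illustrative)

        beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
        pf = NormalDist().cdf(-beta)
        print(f"reliability index beta = {beta:.2f}, failure probability = {pf:.1e}")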

  3. Parallelization of the unstructured Navier-stoke solver LILAC for the aero-thermal analysis of a gas-cooled reactor

    International Nuclear Information System (INIS)

    Kim, J. T.; Kim, S. B.; Lee, W. J.

    2004-01-01

Currently the LILAC code is under development to analyze the thermo-hydraulics of gas-cooled reactors (GCRs), especially the high-temperature GCR, which is one of the Gen IV nuclear reactors. The LILAC code was originally developed for the analysis of thermo-hydraulics in a molten pool, and it is now being modified to resolve compressible gas flows in a GCR. As the internal flow geometries of the GCR and its aero-thermal flows become more complex, the number of computational cells increases and finally exceeds the computing power of current desktop computers. To overcome this problem and to resolve the physics of interest in the GCR, the LILAC code has been parallelized by decomposition of the computational domain, or grid. Some benchmark problems are solved with the parallelized LILAC code, and its speed-up characteristics under parallel computation are evaluated and described in this article.
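    A serial toy version of the decomposition idea: subdomains with ghost cells whose exchange would become message passing in the parallel code; sizes and coefficients are arbitrary, not LILAC internals:

        import numpy as np

        n_sub, n_loc, alpha = 4, 50, 0.25
        subs = [np.zeros(n_loc + 2) for _ in range(n_sub)]  # +2 ghost cells each
        subs[0][1] = 100.0                                  # initial hot spot

        for _ in range(1000):
            for k in range(n_sub):                          # ghost-cell "messages"
                subs[k][0] = subs[k - 1][-2] if k > 0 else subs[k][1]
                subs[k][-1] = subs[k + 1][1] if k < n_sub - 1 else subs[k][-2]
            for u in subs:                                  # independent local updates
                u[1:-1] += alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        print(f"peak after diffusion: {max(u[1:-1].max() for u in subs):.3f}")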

  4. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established, followed by the practical implementation of reliability based code calibration of LRFD based design codes.

  5. Coding of adverse events of suicidality in clinical study reports of duloxetine for the treatment of major depressive disorder: descriptive study

    OpenAIRE

    Maund, Emma; Tendal, Britta; Hróbjartsson, Asbjørn; Lundh, Andreas; Gøtzsche, Peter C

    2014-01-01

Objective To assess the effects of coding and coding conventions on summaries and tabulations of adverse events data on suicidality within clinical study reports. Design Systematic electronic search for adverse events of suicidality in tables, narratives, and listings of adverse events in individual patients within clinical study reports. Where possible, for each event we extracted the original term reported by the investigator, the term as coded by the medical coding dictionary, the medical coding dictionary used, and the patient's trial identification number...

  6. Accident sequence precursor events with age-related contributors

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, G.A.; Kohn, W.E.

    1995-12-31

The Accident Sequence Precursor (ASP) Program at ORNL analyzed about 14,000 Licensee Event Reports (LERs) filed by US nuclear power plants from 1987 through 1993. There were 193 events identified as precursors to potential severe core accident sequences. These are reported in NUREG/CR-4674, Volumes 7 through 20. Under the NRC Nuclear Plant Aging Research program, the authors evaluated these events to determine the extent to which component aging played a role. Events were selected that involved age-related equipment degradation that initiated an event or contributed to an event sequence. For the 7-year period, ORNL identified 36 events that involved aging degradation as a contributor to an ASP event. Except for 1992, the percentage of age-related events within the total number of ASP events over the 7-year period (approximately 19%) appears fairly consistent up to 1991. No correlation between plant age and number of precursor events was found. A summary list of the age-related events is presented in the report.

  7. Coding of adverse events of suicidality in clinical study reports of duloxetine for the treatment of major depressive disorder: descriptive study.

    Science.gov (United States)

    Maund, Emma; Tendal, Britta; Hróbjartsson, Asbjørn; Lundh, Andreas; Gøtzsche, Peter C

    2014-06-04

Objective: To assess the effects of coding and coding conventions on summaries and tabulations of adverse events data on suicidality within clinical study reports. Design: Systematic electronic search for adverse events of suicidality in tables, narratives, and listings of adverse events in individual patients within clinical study reports. Where possible, for each event we extracted the original term reported by the investigator, the term as coded by the medical coding dictionary, the medical coding dictionary used, and the patient's trial identification number. Using the patient's trial identification number, we attempted to reconcile data on the same event between the different formats for presenting data on adverse events within the clinical study report. Setting: 9 randomised placebo controlled trials of duloxetine for major depressive disorder submitted to the European Medicines Agency for marketing approval. Data sources: Clinical study reports obtained from the EMA in 2011. Results: Six trials used the medical coding dictionary COSTART (Coding Symbols for a Thesaurus of Adverse Reaction Terms) and three used MedDRA (Medical Dictionary for Regulatory Activities). Suicides were clearly identifiable in all formats of adverse event data in clinical study reports. Suicide attempts presented in tables included both definitive and provisional diagnoses. Suicidal ideation and preparatory behaviour were obscured in some tables owing to the lack of specificity of the medical coding dictionary, especially COSTART. Furthermore, we found one event of suicidal ideation described in narrative text that was absent from tables and adverse event listings of individual patients. The reason for this is unclear, but may be due to the coding conventions used. Conclusions: Data on adverse events in tables in clinical study reports may not accurately represent the underlying patient data because of the medical dictionaries and coding conventions used. In clinical study reports, the listings of adverse events for individual patients and narratives...

  8. Use of the Spine Adverse Events Severity System (SAVES) in patients with traumatic spinal cord injury. A comparison with institutional ICD-10 coding for the identification of acute care adverse events.

    Science.gov (United States)

    Street, J T; Thorogood, N P; Cheung, A; Noonan, V K; Chen, J; Fisher, C G; Dvorak, M F

    2013-06-01

Study design: Observational cohort comparison. Objectives: To compare the previously validated Spine Adverse Events Severity system (SAVES) with International Classification of Diseases, Tenth Revision (ICD-10) codes for identifying adverse events (AEs) in patients with traumatic spinal cord injury (TSCI). Setting: Quaternary care spine program. Methods: Patients discharged between 2006 and 2010 were identified from our prospective registry. Two consecutive cohorts were created based on the system used to record acute care AEs; one used ICD-10 coding by hospital coders and the other used SAVES data prospectively collected by a multidisciplinary clinical team. The ICD-10 codes were appropriately mapped to the SAVES. There were 212 patients in the ICD-10 cohort and 173 patients in the SAVES cohort. Analyses were adjusted to account for the different sample sizes, and the two cohorts were comparable based on age, gender, and motor score. Results: The SAVES system identified twice as many AEs per person as ICD-10 coding. Fifteen unique AEs were more reliably identified using SAVES, including neuropathic pain (32 times more frequently identified). Both patient age and severity of paralysis were more reliably correlated to AEs collected through SAVES than ICD-10. Conclusion: Implementation of the SAVES system for patients with TSCI captured more individuals experiencing AEs and more AEs per person compared with ICD-10 codes. This study demonstrates the utility of prospectively collecting AE data using validated tools.

  9. Coding of adverse events of suicidality in clinical study reports of duloxetine for the treatment of major depressive disorder

    DEFF Research Database (Denmark)

    Maund, Emma; Tendal, Britta; Hróbjartsson, Asbjørn

    2014-01-01

OBJECTIVE: To assess the effects of coding and coding conventions on summaries and tabulations of adverse events data on suicidality within clinical study reports. DESIGN: Systematic electronic search for adverse events of suicidality in tables, narratives, and listings of adverse events ... identification number, we attempted to reconcile data on the same event between the different formats for presenting data on adverse events within the clinical study report. SETTING: 9 randomised placebo controlled trials of duloxetine for major depressive disorder submitted to the European Medicines Agency for marketing approval. DATA SOURCES: Clinical study reports obtained from the EMA in 2011. RESULTS: Six trials used the medical coding dictionary COSTART (Coding Symbols for a Thesaurus of Adverse Reaction Terms) and three used MedDRA (Medical Dictionary for Regulatory Activities). Suicides were clearly...

  10. Fuel element thermo-mechanical analysis during transient events using the FMS and FETMA codes

    International Nuclear Information System (INIS)

    Hernandez Lopez Hector; Hernandez Martinez Jose Luis; Ortiz Villafuerte Javier

    2005-01-01

At the Instituto Nacional de Investigaciones Nucleares of Mexico, the Fuel Management System (FMS) software package has long been used to simulate the operation of a BWR nuclear power plant in steady state as well as during transient events. To evaluate fuel element thermo-mechanical performance during transient events, an interface between the FMS codes and our own Fuel Element Thermo Mechanical Analysis (FETMA) code is currently being developed and implemented. In this work, results are shown for the thermo-mechanical behavior of fuel rods in the hot channel during simulated transient events of a BWR nuclear power plant. The transient events considered in this work are a load rejection and a feedwater control failure, which are among the most important events that can occur in a BWR. The results showed that conditions leading to fuel rod failure never appeared during either event. It is also shown that a load rejection transient is more demanding in terms of safety than a failure of the feedwater controller. (authors)

  11. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.

    Science.gov (United States)

    Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B

    2018-01-01

Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of classical DSC by employing the decoding delay concept, which enables the use of the maximum correlated portion of sensor samples during event estimation. In D-DSC, the network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. The sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications with a massive number of sensors, towards the realization of the Internet of Sensing Things (IoST).
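    A toy DSC step in this spirit (not the D-DSC protocol itself): a cluster member sends only a modulo residue, and the sink decodes it against the clusterhead's correlated sample; all numbers are invented:

        M = 16                        # modulus; member sends log2(M) = 4 bits

        def encode(reading):          # cluster member: transmit the residue only
            return reading % M

        def decode(residue, side_info):   # sink: use the clusterhead sample
            base = side_info - side_info % M + residue
            return min((base - M, base, base + M), key=lambda v: abs(v - side_info))

        reading, side_info = 1037, 1042   # correlated samples (invented numbers)
        assert decode(encode(reading), side_info) == reading   # |diff| < M/2 holds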

12. Simulation of the GCR spectrum in the Mars Curiosity rover's RAD detector using MCNP6.

    Science.gov (United States)

    Ratliff, Hunter N; Smith, Michael B R; Heilbronn, Lawrence

    2017-08-01

The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface and a comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by the Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the GCR spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios, while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 μGy/day while RAD measured 233 μGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 μSv/day while RAD reported 710 μSv/day.

  13. Validation of a CFD code for Unsteady Flows with cyclic boundary Conditions

    International Nuclear Information System (INIS)

    Kim, Jong-Tae; Kim, Sang-Baik; Lee, Won-Jae

    2006-01-01

Currently the LILAC code is under development to analyze the thermo-hydraulics of a high-temperature gas-cooled reactor (GCR). Interesting thermo-hydraulic phenomena in a nuclear reactor are usually unsteady and turbulent, and the analysis of unsteady flows using a three-dimensional CFD code is time-consuming if the flow domain is very large. Fortunately, flow domains commonly encountered in nuclear thermo-hydraulics are often periodic, so it is better to exploit this geometrical characteristic to reduce the computational resources. To obtain the benefit of reduced computational domains, especially for calculations of unsteady flows, cyclic boundary conditions were implemented in the parallelized CFD code LILAC. In this study, the parallelized cyclic boundary conditions are validated by solving unsteady laminar and turbulent flows past a circular cylinder.
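    The essence of a cyclic boundary condition on a structured grid is wrap-around neighbor lookup; a minimal periodic diffusion step (illustrative only, not LILAC code):

        import numpy as np

        n = 128
        x = np.arange(n)
        u = np.sin(2 * np.pi * x / n) + 0.1 * np.random.default_rng(4).standard_normal(n)

        for _ in range(200):                               # explicit diffusion steps
            lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1) # wrap-around = cyclic BC
            u += 0.25 * lap
        print(f"max after smoothing: {u.max():.3f}")       # noise damped, mode kept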

  14. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However, it remains challenging to measure and identify the severity information of ADEs. The objective of this study is to develop and evaluate a semantic web based approach for building a knowledge base of severe ADEs based on FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into the CTCAE in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4, and 2,196 pairs in Grade 3.

  15. Heterogeneous but “Standard” Coding Systems for Adverse Events: Issues in Achieving Interoperability between Apples and Oranges

    Science.gov (United States)

    Richesson, Rachel L.; Fung, Kin Wah; Krischer, Jeffrey P.

    2008-01-01

    Monitoring adverse events (AEs) is an important part of clinical research and a crucial target for data standards. The representation of adverse events themselves requires the use of controlled vocabularies with thousands of needed clinical concepts. Several data standards for adverse events currently exist, each with a strong user base. The structure and features of these current adverse event data standards (including terminologies and classifications) are different, so comparisons and evaluations are not straightforward, nor are strategies for their harmonization. Three different data standards - the Medical Dictionary for Regulatory Activities (MedDRA) and the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) terminologies, and Common Terminology Criteria for Adverse Events (CTCAE) classification - are explored as candidate representations for AEs. This paper describes the structural features of each coding system, their content and relationship to the Unified Medical Language System (UMLS), and unsettled issues for future interoperability of these standards. PMID:18406213

  16. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    Science.gov (United States)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

... acute exposures of the mixed field beams used for the experiments. The chromosomes were simulated by a polymer random walk algorithm with restrictions to their respective domains in the nucleus [1]. The stochastic dose to the nucleus was calculated with the code RITRACKS [2]. Irradiation of a target volume by a mixed field of ions was implemented within RITRACKS, and the fields of ions can be delivered over specific periods of time, allowing the simulation of dose-rate effects. Similarly, particles of various types and energies extracted from a pre-calculated spectrum of galactic cosmic rays (GCR) can be used in RITRACKS. The number and spatial locations of DSBs (DNA double-strand breaks) were calculated in BDSTRACKS using the simulated chromosomes and the local (voxel) dose. Assuming that DSBs lead to chromosome breaks, and simulating the rejoining of damaged chromosomes during repair, BDSTRACKS produces the yield of various types of chromosome aberrations as a function of time (only final yields are presented). A comparison between experimental and simulation results will be shown.

  17. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    Science.gov (United States)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; hide

    2009-01-01

Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm² aluminum slab in front of a 30 g/cm² slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and SPE environment. However, there are also regions with appreciable differences between the three computer codes.

18. Sparsey™: Spatiotemporal Event Recognition via Deep Hierarchical Sparse Distributed Codes

    Directory of Open Access Journals (Sweden)

    Gerard J Rinkus

    2014-12-01

Full Text Available The visual cortex's hierarchical, multi-level organization is captured in many biologically inspired computational vision models, the general idea being that progressively larger-scale (spatially/temporally) and more complex visual features are represented in progressively higher areas. However, most earlier models use localist representations (codes) in each representational field (which we equate with the cortical macrocolumn, or "mac") at each level. In localism, each represented feature/concept/event (hereinafter "item") is coded by a single unit. The model we describe, Sparsey, is hierarchical as well, but crucially, it uses sparse distributed coding (SDC) in every mac at all levels. In SDC, each represented item is coded by a small subset of the mac's units. The SDCs of different items can overlap, and the size of the overlap between items can be used to represent their similarity. The difference between localism and SDC is crucial because SDC allows the two essential operations of associative memory, storing a new item and retrieving the best-matching stored item, to be done in fixed time for the life of the model. Since the model's core algorithm, which does both storage and retrieval (inference), makes a single pass over all macs on each time step, the overall model's storage/retrieval operation is also fixed-time, a criterion we consider essential for scalability to huge (Big Data) problems. A 2010 paper described a non-hierarchical version of this model in the context of purely spatial pattern processing. Here, we elaborate a fully hierarchical model (arbitrary numbers of levels and macs per level), describing novel model principles like progressive critical periods, dynamic modulation of principal cells' activation functions based on a mac-level familiarity measure, representation of multiple simultaneously active hypotheses, and a novel method of time-warp-invariant recognition, and we report results showing learning/recognition of...
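    A minimal sketch of the SDC property the abstract emphasizes, assuming a toy mac of Q winner-take-all groups: overlap between codes gives a graded similarity that a localist code cannot express:

        import numpy as np

        rng = np.random.default_rng(5)
        Q, K = 20, 8                          # Q groups of K units (toy mac)

        code_a = rng.integers(0, K, Q)        # stored item: one winner per group
        code_b = code_a.copy()
        code_b[:5] = rng.integers(0, K, 5)    # similar item: perturb 5 of 20 groups

        overlap = float(np.mean(code_a == code_b))   # graded similarity measure
        print(f"code overlap = {overlap:.2f}")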

  19. Modelling of plate-out under gas-cooled reactor (GCR) accident conditions

    International Nuclear Information System (INIS)

    Taig, A.R.

    1981-01-01

The importance of plate-out in mitigating the consequences of gas-cooled reactor accidents, and its place in assessing those consequences, are discussed. The data requirements of a plate-out modelling program are discussed, and a brief description is given of parallel work programs on thermal/hydraulic reactor behaviour and fuel modelling, both of which will provide inputs to the plate-out program under development. The representation of a GCR system used in SRD studies is presented, and the equations governing iodine adsorption, desorption, and transport around the circuit are derived. The status of SRD's plate-out program is described, and the type of sensitivity studies to be undertaken with the partially developed computer program in order to identify the most useful lines for future research is discussed. (author)
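    The governing balance for a single circulating species can be sketched as first-order deposition and desorption; the rate constants below are invented placeholders, not SRD program data:

        k_dep, k_des = 3e-3, 1e-4    # deposition / desorption constants (1/s); invented
        C, S = 1.0, 0.0              # airborne and plated inventories (normalized)

        dt = 1.0
        for _ in range(3600):                    # one hour, explicit Euler
            flux = k_dep * C - k_des * S         # net deposition rate
            C, S = C - dt * flux, S + dt * flux  # total inventory conserved
        print(f"airborne fraction after 1 h: {C:.3f}")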

  20. Microstructure of warm rolling and pearlitic transformation of ultrafine-grained GCr15 steel

    International Nuclear Information System (INIS)

    Sun, Jun-Jie; Lian, Fu-Liang; Liu, Hong-Ji; Jiang, Tao; Guo, Sheng-Wu; Du, Lin-Xiu; Liu, Yong-Ning

    2014-01-01

Pearlitic transformation mechanisms have been investigated in ultrafine-grained GCr15 steel. The ultrafine-grained steel, whose grain size was less than 1 μm, was prepared by thermo-mechanical treatment at 873 K followed by annealing at 923 K for 2 h. Pearlitic transformation was conducted by reheating the ultrafine-grained samples at 1073 K and 1123 K for different periods of time and then cooling in air. Scanning electron microscope observation shows that normal lamellar pearlite cannot form when the grain size is less than approximately 4 (±0.6) μm; granular cementite and ferrite form instead, which yields a critical grain size for normal lamellar pearlitic transformation in this chromium-alloyed steel. The result confirms that grain size has a great influence on pearlitic transformation by increasing the diffusion rate of carbon atoms in the ultrafine-grained steel, and the addition of chromium does not change this rule. Meanwhile, the grain growth rate is reduced by chromium alloying, which is beneficial to forming fine grains during austenitizing and thus facilitates pearlitic transformation by divorced eutectoid transformation. Moreover, chromium can form a relatively high concentration gradient at the front of the undissolved carbide, which promotes carbide formation there, i.e., chromium promotes the divorced eutectoid transformation. - Highlights: • Ultrafine-grained GCr15 steel was obtained by warm rolling and annealing. • Reduction of grain size changes the pearlite morphology from lamellar to granular. • Adding Cr does not change the normal pearlitic phase transformation rule in UFG steel. • Cr carbide resists grain growth and facilitates pearlitic transformation by DET

  1. Gas-cooled reactor thermal-hydraulics using CAST3M and CRONOS2 codes

    International Nuclear Information System (INIS)

    Studer, E.; Coulon, N.; Stietel, A.; Damian, F.; Golfier, H.; Raepsaet, X.

    2003-01-01

The CEA R&D program on advanced gas-cooled reactors (GCR) relies on different concepts: the modular High Temperature Reactor (HTR), its evolution dedicated to hydrogen production (the Very High Temperature Reactor), and Gas Cooled Fast Reactors (GCFR). Some key safety questions are related to decay heat removal during potential accidents. This is strongly connected to passive natural convection (including injection of helium, CO2, nitrogen, or argon) or forced convection using active safety systems (gas blowers, heat exchangers). To support this effort, thermal-hydraulics computer codes will be necessary tools to design, enhance the performance, and ensure a high safety level of the different reactors. Accurate and efficient modeling of heat transfer by conduction, convection, or thermal radiation, as well as energy storage, is a necessary requirement to obtain a high level of confidence in the thermal-hydraulic simulations. To achieve that goal, a thorough validation process has to be conducted. CEA's CAST3M code dedicated to GCR thermal-hydraulics has been validated against different test cases: academic interaction between natural convection and thermal radiation, small-scale in-house THERCE experiments, and large-scale High Temperature Test Reactor benchmarks such as the HTTR-VC benchmark. Coupling with neutronics is also an important modeling aspect for the determination of neutronic parameters such as neutronic coefficients (Doppler, moderator, ...) and the critical position of control rods. CEA's CAST3M and CRONOS2 computer codes allow this coupling, and a first example of coupled thermal-hydraulics/neutronics calculations has been performed. The next step will be comparison with High Temperature Test Reactor experimental data at nominal power.

  2. Sequence Coding and Search System for licensee event reports: user's guide. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Greene, N.M.; Mays, G.T.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated on a continual basis. This four volume report documents and describes SCSS in detail. Volume 1 is a User's Guide for searching the SCSS database. This volume contains updated material through February 1985 of the working version of ORNL/NSIC-223, Vol. 1

  3. Inner heliosphere spatial gradients of GCR protons and alpha particles in the low GeV range

    Science.gov (United States)

    Gieseler, J.; Boezio, M.; Casolino, M.; De Simone, N.; Di Felice, V.; Heber, B.; Martucci, M.; Picozza, P.

    2013-12-01

The spacecraft Ulysses was launched in October 1990 during the maximum phase of solar cycle 22, reached its final, highly inclined (80.2°) Keplerian orbit around the Sun in February 1992, and was finally switched off in June 2009. The Kiel Electron Telescope (KET) aboard Ulysses measures electrons from 3 MeV to a few GeV, and protons and helium in the energy range from 6 MeV/nucleon to above 2 GeV/nucleon. In order to investigate the radial and latitudinal gradients of galactic cosmic rays (GCR), it is essential to know their intensity variations for a stationary observer in the heliosphere, because the Ulysses measurements reflect not only the spatial but also the temporal variation of the energetic particle intensities. This was accomplished in the past with the Interplanetary Monitoring Platform-J (IMP 8) until it was lost in 2006. Fortunately, the satellite-borne experiment PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) was launched in June 2006 and can be used as a reliable 1 AU baseline for the measurements of the KET aboard Ulysses. With these tools at hand, we have the opportunity to determine the spatial gradients of GCR protons and alpha particles at about 0.1 to 1 GeV/n in the inner heliosphere during the extended minimum of solar cycle 23. We then compare these gradients with those obtained during the previous A > 0 solar magnetic epoch.
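    The standard estimator for such gradients, given a 1 AU baseline intensity and a simultaneous Ulysses intensity (the numbers below are invented for illustration):

        import numpy as np

        # Gr = 100 * ln(I2/I1) / (r2 - r1)  in %/AU
        I1, r1 = 1.00, 1.0     # normalized intensity at 1 AU (PAMELA baseline)
        I2, r2 = 1.12, 4.5     # normalized intensity at Ulysses (invented values)

        Gr = 100.0 * np.log(I2 / I1) / (r2 - r1)
        print(f"mean radial gradient: {Gr:.1f} %/AU")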

  4. Coding of adverse events of suicidality in clinical study reports of duloxetine for the treatment of major depressive disorder

    DEFF Research Database (Denmark)

    Maund, Emma; Tendal, Britta; Hróbjartsson, Asbjørn

    2014-01-01

... in individual patients within clinical study reports. Where possible, for each event we extracted the original term reported by the investigator, the term as coded by the medical coding dictionary, the medical coding dictionary used, and the patient's trial ... for marketing approval. DATA SOURCES: Clinical study reports obtained from the EMA in 2011. RESULTS: Six trials used the medical coding dictionary COSTART (Coding Symbols for a Thesaurus of Adverse Reaction Terms) and three used MedDRA (Medical Dictionary for Regulatory Activities). Suicides were clearly identifiable in all formats of adverse event data in clinical study reports. Suicide attempts presented in tables included both definitive and provisional diagnoses. Suicidal ideation and preparatory behaviour were obscured in some tables owing to the lack of specificity of the medical coding dictionary ...

  5. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are presented.
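    A sketch of the quasi-cyclic structure the book builds on: a parity-check matrix assembled from circulant blocks, each fully determined by its first row; the block size and first rows below are toy values:

        import numpy as np

        def circulant(first_row):                 # p x p block from its first row
            return np.array([np.roll(first_row, k) for k in range(len(first_row))],
                            dtype=np.uint8)

        first_rows = [[1, 0, 1, 0, 0, 0, 0],      # sparse rows -> low-density blocks
                      [1, 1, 0, 0, 0, 0, 0],
                      [1, 0, 0, 0, 1, 0, 0]]
        H = np.hstack([circulant(np.array(r, dtype=np.uint8)) for r in first_rows])
        print(H.shape)                            # (7, 21): one row of three blocks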

  6. Sequence Coding and Search System for licensee event reports: coder's manual. Volume 4

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This four-volume report documents and describes SCSS in detail. Volumes 3 and 4 provide a technical processor who is new to SCSS with the information and methodology necessary to capture descriptive data from the LER and to codify that data into a structured format. They also serve as reference material for the more experienced technical processor and contain information that is essential for the more advanced user who needs to be familiar with the intricate coding techniques in order to retrieve specific details in a sequence. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 4

  7. Sequence Coding and Search System for licensee event reports: coder's manual. Volume 3

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This four-volume report documents and describes SCSS in detail. Volumes 3 and 4 provide a technical processor who is new to SCSS with the information and methodology necessary to capture descriptive data from the LER and to codify that data into a structured format. They also serve as reference material for the more experienced technical processor and contain information that is essential for the more advanced user who needs to be familiar with the intricate coding techniques in order to retrieve specific details in a sequence. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 3

  8. A vectorized Monte Carlo code for modeling photon transport in SPECT

    International Nuclear Information System (INIS)

    Smith, M.F.; Floyd, C.E. Jr.; Jaszczak, R.J.

    1993-01-01

    A vectorized Monte Carlo computer code has been developed for modeling photon transport in single photon emission computed tomography (SPECT). The code models photon transport in a uniform attenuating region and photon detection by a gamma camera. It is adapted from a history-based Monte Carlo code in which photon history data are stored in scalar variables and photon histories are computed sequentially. The vectorized code is written in FORTRAN77 and uses an event-based algorithm in which photon history data are stored in arrays and photon history computations are performed within DO loops. The indices of the DO loops range over the number of photon histories, and these loops may take advantage of the vector processing unit of our Stellar GS1000 computer for pipelined computations. Without the use of the vector processor the event-based code is faster than the history-based code because of numerical optimization performed during conversion to the event-based algorithm. When only the detection of unscattered photons is modeled, the event-based code executes 5.1 times faster with the use of the vector processor than without; when the detection of scattered and unscattered photons is modeled the speed increase is a factor of 2.9. Vectorization is a valuable way to increase the performance of Monte Carlo code for modeling photon transport in SPECT
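
    The history-based versus event-based organization described here is easy to sketch. The following NumPy fragment is a hypothetical analogue, not the FORTRAN77 SPECT code: photon histories live in arrays and each interaction generation is processed with vectorized operations, which is what makes pipelined execution possible. The slab geometry and cross sections are invented placeholders.

        import numpy as np

        rng = np.random.default_rng(0)

        # Event-based organization: all photon histories advance together,
        # one interaction "generation" per vectorized pass over the arrays.
        n = 200_000
        mu, slab, p_abs = 0.2, 10.0, 0.3           # assumed medium parameters
        z = np.zeros(n)                            # photon depths, cm
        dirz = np.ones(n)                          # direction cosines
        alive = np.ones(n, dtype=bool)
        transmitted = 0

        while alive.any():
            idx = np.flatnonzero(alive)
            step = -np.log(rng.random(idx.size)) / mu    # free path lengths
            z[idx] += dirz[idx] * step
            out = (z[idx] < 0.0) | (z[idx] > slab)
            transmitted += np.count_nonzero(z[idx] > slab)
            alive[idx[out]] = False                # left the slab
            absorbed = rng.random(idx.size) < p_abs
            alive[idx[~out & absorbed]] = False    # absorbed at the collision
            scat = idx[~out & ~absorbed]
            dirz[scat] = 2.0 * rng.random(scat.size) - 1.0   # isotropic redirect
        print(f"transmitted fraction: {transmitted / n:.4f}")

    A history-based version would wrap the same physics in a scalar loop over the 200,000 histories; the event-based arrangement trades that loop for array-wide operations that a vector unit (or modern SIMD hardware) can pipeline.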

  9. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  10. Space Weather Nowcasting of Atmospheric Ionizing Radiation for Aviation Safety

    Science.gov (United States)

    Mertens, Christopher J.; Wilson, John W.; Blattnig, Steve R.; Solomon, Stan C.; Wiltberger, J.; Kunches, Joseph; Kress, Brian T.; Murray, John J.

    2007-01-01

    There is a growing concern for the health and safety of commercial aircrew and passengers due to their exposure to ionizing radiation with high linear energy transfer (LET), particularly at high latitudes. The International Commission on Radiological Protection (ICRP), the EPA, and the FAA consider the crews of commercial aircraft to be radiation workers. During solar energetic particle (SEP) events, radiation exposure can exceed annual limits, and the number of serious health effects is expected to be quite high if precautions are not taken. There is a need for a capability to monitor the real-time, global background radiation levels from galactic cosmic rays (GCR) at commercial airline altitudes and to provide analytical input for airline operations decisions on altering flight paths and altitudes for the mitigation and reduction of radiation exposure levels during a SEP event. The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model is a new initiative to provide a global, real-time radiation dosimetry package for archiving and assessing the biologically harmful radiation exposure levels at commercial airline altitudes. The NAIRAS model brings to bear the best available suite of Sun-Earth observations and models for simulating the atmospheric ionizing radiation environment. Observations are utilized from the ground (neutron monitors), from the atmosphere (the METO analysis), and from space (NASA/ACE and NOAA/GOES). Atmospheric observations provide the overhead shielding information, and the ground- and space-based observations provide boundary conditions on the GCR and SEP energy flux distributions for transport and dosimetry simulations. Dose rates are calculated using the parametric AIR (Atmospheric Ionizing Radiation) model and the physics-based HZETRN (High Charge and Energy Transport) code. Empirical models of the near-Earth radiation environment (GCR/SEP energy flux distributions and geomagnetic cut-off rigidity) are benchmarked

  11. HELIOS/DRAGON/NESTLE codes' simulation of the Gentilly-2 loss of class 4 power event

    International Nuclear Information System (INIS)

    Sarsour, H.N.; Turinsky, P.J.; Rahnema, F.; Mosher, S.; Serghiuta, D.; Marleau, G.; Courau, T.

    2002-01-01

    A loss of electrical power occurred at Gentilly-2 in September of 1995 while the station was operating at full power. There was an unexpectedly rapid core power increase initiated by the drainage of the zone controllers and accelerated by coolant boiling. The core transient was terminated by Shutdown System No 1 (SDS1) tripping when the out-of-core ion chambers exceeded the 10%/sec high rate of power increase trip setpoint at 1.29 sec. This resulted in the station automatically shutting down within 2 sec of event initiation. In the first 2 sec, 26 of the 58 SDS1 and SDS2 in-core flux detectors reached their overpower trip (ROPT) setpoints. The peak reactor power reached approximately 110%FP. Reference 1 presented detailed results of the simulations performed with coupled thermalhydraulics and 3D neutron kinetics codes, SOPHT-G2 and the CERBERUS module of RFSP, and the various adjustments of these codes and of the plant representation that were needed to obtain the neutronic response observed in 1995. The purposes of this paper are to contrast a simulation prediction of the peak prompt core thermal power transient with the experimental estimate, and to note the impact of the spatial discretization approach utilized on the prompt core thermal power transient and on the channel power distribution as a function of time. In addition, the adequacy of the time-step sizes employed and the sensitivity to the core's transient thermal-hydraulic conditions are studied. The work presented in this paper has been performed as part of a project sponsored by the Canadian Nuclear Safety Commission (CNSC). The purpose of the project was to gather information and assess the accuracy of best-estimate methods using calculation methods and codes developed independently of the CANDU industry. The simulation of the accident was completed using the NESTLE core simulator, employing cross sections generated by the HELIOS lattice physics code, and incremental cross sections generated by the DRAGON lattice physics code

  12. TRANSENERGY S: computer codes for coolant temperature prediction in LMFBR cores during transient events

    International Nuclear Information System (INIS)

    Glazer, S.; Todreas, N.; Rohsenow, W.; Sonin, A.

    1981-02-01

    This document is intended as a user/programmer manual for the TRANSENERGY-S computer code. The code represents an extension of the steady state ENERGY model, originally developed by E. Khan, to predict coolant and fuel pin temperatures in a single LMFBR core assembly during transient events. Effects which may be modelled in the analysis include temporal variation in gamma heating in the coolant and duct wall, rod power production, coolant inlet temperature, coolant flow rate, and thermal boundary conditions around the single assembly. Numerical formulations of energy equations in the fuel and coolant are presented, and the solution schemes and stability criteria are discussed. A detailed description of the input deck preparation is presented, as well as code logic flowcharts, and a complete program listing. TRANSENERGY-S code predictions are compared with those of two different versions of COBRA, and partial results of a 61 pin bundle test case are presented
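
    The transient energy balance that such a code solves can be illustrated with a minimal explicit scheme. The sketch below is a generic single-channel model with forward-Euler time stepping, first-order upwind differencing, and invented sodium-like properties; it is not the ENERGY/TRANSENERGY-S formulation itself.

        import numpy as np

        # Single-channel coolant energy balance dT/dt + v dT/dz = q/(rho*cp),
        # with a ramped inlet temperature as the "transient event".
        # All values are illustrative placeholders.
        nz, dz, dt = 50, 0.02, 1e-3          # axial nodes, node height [m], step [s]
        v, rho, cp = 2.0, 850.0, 1250.0      # coolant velocity [m/s] and properties
        q = np.full(nz, 5.0e7)               # volumetric heat source [W/m^3]
        T = np.full(nz, 600.0)               # initial coolant temperature [K]

        def inlet(t):
            return 600.0 + 20.0 * min(t / 5.0, 1.0)   # inlet temperature ramp

        assert v * dt / dz <= 1.0            # Courant limit for explicit upwinding

        t = 0.0
        for _ in range(20_000):
            upwind = np.concatenate(([inlet(t)], T[:-1]))   # upstream values
            T += dt * (-v * (T - upwind) / dz + q / (rho * cp))
            t += dt
        print(f"outlet temperature after {t:.0f} s: {T[-1]:.1f} K")

    The stability assertion reflects the kind of criterion the manual discusses: an explicit upwind scheme is only stable while the Courant number v*dt/dz stays at or below one.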

  13. Numerical analysis on ingress-of-coolant events in fusion reactors with TRAC-PF1 code

    International Nuclear Information System (INIS)

    Ose, Yasuo; Takase, Kazuyuki; Akimoto, Hajime

    2000-01-01

    As for accident events related to thermal hydraulics, in a fusion experimental reactor an ingress-of-coolant event (ICE) and a loss-of-vacuum-accident event (LOVA) should be considered. An integrated ICE/LOVA test apparatus is being planned in order to estimate quantitatively the heat transfer and fluid flow characteristics under ICE and LOVA conditions. This study was carried out to predict numerically the thermal-hydraulic characteristics in fusion reactors during ICE events, before construction of the integrated ICE/LOVA test apparatus. The TRAC-PF1 code, which was originally developed for thermal-hydraulic safety analysis in light water reactors, was used. The numerical analyses were performed for two kinds of system configuration, with and without a pressure-suppression tank: the former for investigation of the pressure rise characteristics and two-phase flow behavior, the latter for estimation of the pressure-reduction effect of the pressure-suppression tank. From the present analytical results, the effects of the ingress water flow rate and vessel temperatures on the pressure rise were clarified quantitatively. Furthermore, the pressure-rise suppression effect due to vapor condensation in the pressure-suppression tank was predicted numerically. In addition, useful information regarding the design of the integrated ICE/LOVA test apparatus and knowledge with respect to the effective usage of the TRAC-PF1 code were obtained through the present numerical study. (author)

  14. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', initially proposed by the authors with the intention of giving the nuclear industry a leap forward in system reliability, performance improvement, and cost reduction. The concept of the System Based Code is intended to give a theoretical procedure for optimizing the reliability of the system by administrating every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  15. PC-based support programs coupled with the sets code for large fault tree analysis

    International Nuclear Information System (INIS)

    Hioki, K.; Nakai, R.

    1989-01-01

    Power Reactor and Nuclear Fuel Development Corporation (PNC) has developed four PC programs: IEIQ (Initiating Event Identification and Quantification), MODESTY (Modular Event Description for a Variety of Systems), FAUST (Fault Summary Tables Generation Program) and ETAAS (Event Tree Analysis Assistant System). These programs prepare the input data for the SETS (Set Equation Transformation System) code and construct and quantify event trees (E/Ts) using the output of the SETS code. The capability of these programs is described and some examples of the results are presented in this paper. With these PC programs and the SETS code, PSA can now be performed with more consistency and less manpower
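
    The event tree construction and quantification that a tool like ETAAS assists with can be shown in miniature. The sketch below builds a two-system event tree with invented branch probabilities; the system names and numbers are hypothetical, not PNC data.

        from itertools import product

        # Toy event tree: an initiating event followed by two mitigating
        # systems; a sequence probability is the product along its branch.
        p_init = 1e-2                          # initiating event frequency
        p_fail = {"HPI": 1e-3, "RHR": 5e-3}    # system failure probabilities

        sequences = {}
        for hpi_ok, rhr_ok in product((True, False), repeat=2):
            p = p_init
            p *= (1 - p_fail["HPI"]) if hpi_ok else p_fail["HPI"]
            p *= (1 - p_fail["RHR"]) if rhr_ok else p_fail["RHR"]
            label = (("HPI-OK" if hpi_ok else "HPI-F") + "/"
                     + ("RHR-OK" if rhr_ok else "RHR-F"))
            sequences[label] = p

        for label, p in sorted(sequences.items(), key=lambda kv: -kv[1]):
            print(f"{label:14s} {p:.3e}")
        print("damage sequence frequency:", f"{sequences['HPI-F/RHR-F']:.1e}")

    In a real PSA the branch probabilities would themselves come from fault tree quantification (the SETS output), which is exactly the coupling these support programs automate.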

  16. 75 FR 384 - Event Problem Codes Web Site; Center for Devices and Radiological Health; Availability

    Science.gov (United States)

    2010-01-05

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2009-N-0576] Event Problem Codes Web Site; Center for Devices and Radiological Health; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is announcing...

  17. Event localization in bulk scintillator crystals using coded apertures

    Energy Technology Data Exchange (ETDEWEB)

    Ziock, K.P. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Department of Physics and Astronomy, University of Tennessee, Knoxville, TN (United States); Braverman, J.B. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN (United States); Fabris, L.; Harrison, M.J.; Hornback, D.; Newby, J. [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2015-06-01

    The localization of radiation interactions in bulk scintillators is generally limited by the size of the light distribution at the readout surface of the crystal/light-pipe system. By finding the centroid of the light spot, which is typically of order centimeters across, practical single-event localization is limited to ~2 mm/cm of crystal thickness. Similar resolution can also be achieved for the depth of interaction by measuring the size of the light spot. Through the use of near-field coded-aperture techniques applied to the scintillation light, light transport simulations show that for 3-cm-thick crystals, more than a five-fold improvement (millimeter spatial resolution) can be achieved both laterally and in event depth. At the core of the technique is the requirement to resolve the shadow from an optical mask placed in the scintillation light path between the crystal and the readout. In this paper, experimental results are presented that demonstrate the overall concept using a 1D shadow mask, a thin-scintillator crystal and a light pipe of varying thickness to emulate a 2.2-cm-thick crystal. Spatial resolutions of ~1 mm in both depth and transverse to the readout face are obtained over most of the crystal depth.
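
    The centroid baseline that the coded-aperture method improves on is straightforward to compute. The following sketch simulates a Gaussian light spot on a hypothetical 8x8 photodetector readout and localizes the event by its count-weighted centroid; pitch, spot width, and statistics are illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)

        # Gaussian light spot on an 8x8 readout; the event is localized by
        # the count-weighted centroid of the detected photons.
        pitch = 6.0                                # detector pitch, mm (assumed)
        xs = (np.arange(8) + 0.5) * pitch
        X, Y = np.meshgrid(xs, xs)

        true_x, true_y, sigma = 24.7, 18.3, 10.0   # event position / spot width, mm
        spot = np.exp(-((X - true_x)**2 + (Y - true_y)**2) / (2 * sigma**2))
        counts = rng.poisson(200 * spot)           # photon-counting statistics

        cx = (counts * X).sum() / counts.sum()
        cy = (counts * Y).sum() / counts.sum()
        print(f"true ({true_x:.1f}, {true_y:.1f}) mm -> "
              f"centroid ({cx:.1f}, {cy:.1f}) mm")

    Because the spot is centimeters across, the centroid's precision is limited by photon statistics and spot truncation at the crystal edges; resolving the shadow of an optical mask, as in the paper, recovers position information the broad spot alone cannot provide.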

  18. Event localization in bulk scintillator crystals using coded apertures

    International Nuclear Information System (INIS)

    Ziock, K.P.; Braverman, J.B.; Fabris, L.; Harrison, M.J.; Hornback, D.; Newby, J.

    2015-01-01

    The localization of radiation interactions in bulk scintillators is generally limited by the size of the light distribution at the readout surface of the crystal/light-pipe system. By finding the centroid of the light spot, which is typically of order centimeters across, practical single-event localization is limited to ~2 mm/cm of crystal thickness. Similar resolution can also be achieved for the depth of interaction by measuring the size of the light spot. Through the use of near-field coded-aperture techniques applied to the scintillation light, light transport simulations show that for 3-cm-thick crystals, more than a five-fold improvement (millimeter spatial resolution) can be achieved both laterally and in event depth. At the core of the technique is the requirement to resolve the shadow from an optical mask placed in the scintillation light path between the crystal and the readout. In this paper, experimental results are presented that demonstrate the overall concept using a 1D shadow mask, a thin-scintillator crystal and a light pipe of varying thickness to emulate a 2.2-cm-thick crystal. Spatial resolutions of ~1 mm in both depth and transverse to the readout face are obtained over most of the crystal depth

  19. Non-binary unitary error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.

    1996-06-01

    Error operator bases for systems of any dimension are defined and natural generalizations of the bit-flip/sign-change error basis for qubits are given. These bases allow generalizing the construction of quantum codes based on eigenspaces of Abelian groups. As a consequence, quantum codes can be constructed from linear codes over Z_n for any n. The generalization of the punctured code construction leads to many codes which permit transversal (i.e. fault tolerant) implementations of certain operations compatible with the error basis.
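
    The natural generalization of the qubit bit-flip/sign-change basis is the shift/clock (generalized Pauli) basis. The sketch below constructs it for dimension n and checks the trace-orthogonality that characterizes a unitary error basis; it is a standard construction consistent with, but not taken from, the report.

        import numpy as np

        def shift_clock_basis(n):
            # Generalized Pauli basis for dimension n: X|j> = |j+1 mod n>,
            # Z|j> = w^j |j> with w = exp(2*pi*i/n); the n^2 products
            # X^a Z^b generalize the qubit bit-flip/sign-change operators.
            w = np.exp(2j * np.pi / n)
            X = np.roll(np.eye(n), 1, axis=0)
            Z = np.diag(w ** np.arange(n))
            return [np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)
                    for a in range(n) for b in range(n)]

        basis = shift_clock_basis(3)
        # Unitarity plus pairwise trace-orthogonality, |Tr(E^dag F)| = n*delta,
        # is the defining property of a unitary error basis.
        gram = np.array([[abs(np.trace(E.conj().T @ F)) for F in basis]
                         for E in basis])
        assert np.allclose(gram, 3 * np.eye(9), atol=1e-9)
        print("shift/clock operators form a unitary error basis for n = 3")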

  20. RadWorks Storm Shelter Design for Solar Particle Event Shielding

    Science.gov (United States)

    Simon, Matthew A.; Cerro, Jeffrey; Clowdsley, Martha

    2013-01-01

    In order to enable long-duration human exploration beyond low-Earth orbit, the risks associated with exposure of astronaut crews to space radiation must be mitigated with practical and affordable solutions. The space radiation environment beyond the magnetosphere is primarily a combination of two types of radiation: galactic cosmic rays (GCR) and solar particle events (SPE). While mitigating GCR exposure remains an open issue, reducing astronaut exposure to SPEs is achievable through material shielding because they are made up primarily of medium-energy protons. In order to ensure astronaut safety for long durations beyond low-Earth orbit, SPE radiation exposure must be mitigated. However, the increasingly demanding spacecraft propulsive performance for these ambitious missions requires minimal mass and volume radiation shielding solutions which leverage available multi-functional habitat structures and logistics as much as possible. This paper describes the efforts of NASA's RadWorks Advanced Exploration Systems (AES) Project to design minimal mass SPE radiation shelter concepts leveraging available resources. Discussion items include a description of the shelter trade space, the prioritization process used to identify the four primary shelter concepts chosen for maturation, a summary of each concept's design features, a description of the radiation analysis process, and an assessment of the parasitic mass of each concept.

  1. Analysis on ingress of coolant event in vacuum vessel using modified TRAC-BF1 code

    International Nuclear Information System (INIS)

    Ajima, Toshio; Kurihara, Ryoichi; Seki, Yasushi

    1999-08-01

    The Transient Reactor Analysis Code (TRAC-BF1) was modified on the basis of ICE experimental results so as to analyze the ingress-of-coolant event (ICE) in the vacuum vessel of a nuclear fusion reactor. In a previous report, the TRAC-BF1 code, which was originally developed for the safety analysis of light water reactors, had been modified for the ICE of the fusion reactor, adding a flat structural plate model to the VESSEL component and allowing arbitrary specification of the gravity direction. The TRAC-BF1 code was then modified further. The flat structural plate model of the VESSEL component was extended to allow division into multiple layers of different materials, and some of these layers can take a buried heater into consideration. Moreover, the TRAC-BF1 code was modified to analyze low-pressure conditions close to vacuum, within the range of the steam table. This paper describes the additional functions of the modified TRAC-BF1 code and its analytical evaluation using ICE experimental data and an ITER model based on final design report (FDR) data. (author)

  2. MadEvent: automatic event generation with MadGraph

    International Nuclear Information System (INIS)

    Maltoni, Fabio; Stelzer, Tim

    2003-01-01

    We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)
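
    The multi-channel idea can be illustrated on a toy integrand with two peaks: each channel's sampling density is matched to one peak, and samples are weighted by f/g where g is the channel mixture. The densities, peaks, and equal channel weights below are invented for illustration; MadEvent's actual channels are derived from the Feynman diagrams of the process.

        import numpy as np

        rng = np.random.default_rng(2)

        def f(x):
            # Toy "matrix element" on [0, 1] with two Breit-Wigner-like peaks.
            return 1.0 / ((x - 0.3)**2 + 1e-3) + 1.0 / ((x - 0.8)**2 + 1e-3)

        peaks = [(0.3, np.sqrt(1e-3)), (0.8, np.sqrt(1e-3))]

        def channel_norm(m, w):
            # Normalization of a Cauchy shape truncated to [0, 1].
            return (np.arctan((1 - m) / w) + np.arctan(m / w)) / w

        def sample_channel(m, w, size):
            # Inverse-CDF sampling of the truncated Cauchy channel density.
            lo, hi = np.arctan(-m / w), np.arctan((1 - m) / w)
            u = rng.random(size)
            return m + w * np.tan(lo + u * (hi - lo))

        def g(x):
            # Mixture density with equal channel weights a_i = 1/2.
            return sum(0.5 / (((x - m)**2 + w**2) * channel_norm(m, w))
                       for m, w in peaks)

        n = 200_000
        which = rng.integers(0, 2, n)
        x = np.where(which == 0,
                     sample_channel(*peaks[0], n),
                     sample_channel(*peaks[1], n))
        wts = f(x) / g(x)
        print(f"integral ~ {wts.mean():.2f} "
              f"+- {wts.std(ddof=1) / np.sqrt(n):.2f}")

    Because each channel absorbs one peak of the integrand, the weights f/g are nearly flat and the variance is small; the same weights, normalized, are what allow an event generator to produce unweighted events by acceptance-rejection.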

  3. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.

  4. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially-stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, particularly, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN have 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ), fully thermal-mechanically coupled, steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. In 2008, the Idaho National Laboratory (INL) began developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation of the codes, the requirements and direction of development for a new FE-based fuel performance code can be discussed, and based on a comparison of the models in FE-based fuel performance codes, the state of the art in the codes can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes have. In particular, specific pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed

  5. An induction-based magnetohydrodynamic 3D code for finite magnetic Reynolds number liquid-metal flows in fusion blankets

    International Nuclear Information System (INIS)

    Kawczynski, Charlie; Smolentsev, Sergey; Abdou, Mohamed

    2016-01-01

    Highlights: • A new induction-based magnetohydrodynamic code was developed using a finite difference method. • The code was benchmarked against purely hydrodynamic and MHD flows for low and finite magnetic Reynolds number. • Possible applications of the new code include liquid-metal MHD flows in the breeder blanket during unsteady events in the plasma. - Abstract: Most numerical analysis performed in the past for MHD flows in liquid-metal blankets were based on the assumption of low magnetic Reynolds number and involved numerical codes that utilized electric potential as the main electromagnetic variable. One limitation of this approach is that such codes cannot be applied to truly unsteady processes, for example, MHD flows of liquid-metal breeder/coolant during unsteady events in plasma, such as major plasma disruptions, edge-localized modes and vertical displacements, when changes in plasmas occur at millisecond timescales. Our newly developed code MOONS (Magnetohydrodynamic Object-Oriented Numerical Solver) uses the magnetic field as the main electromagnetic variable to relax the limitations of the low magnetic Reynolds number approximation for more realistic fusion reactor environments. The new code, written in Fortran, implements a 3D finite-difference method and is capable of simulating multi-material domains. The constrained transport method was implemented to evolve the magnetic field in time and assure that the magnetic field remains solenoidal within machine accuracy at every time step. Various verification tests have been performed including purely hydrodynamic flows and MHD flows at low and finite magnetic Reynolds numbers. Test results have demonstrated very good accuracy against known analytic solutions and other numerical data.
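
    The constrained transport property mentioned in the abstract can be demonstrated in two dimensions: updating face-centered fields from an edge-centered EMF leaves the discrete divergence unchanged to machine accuracy. The grid and fields below are arbitrary stand-ins, not the MOONS discretization.

        import numpy as np

        rng = np.random.default_rng(3)

        # 2-D constrained transport sketch on a staggered grid: Bx lives on
        # x-faces, By on y-faces, and the EMF Ez on cell corners (z-edges).
        nx, ny, dx, dy, dt = 32, 32, 1.0, 1.0, 0.1
        Bx = rng.standard_normal((nx + 1, ny))
        By = rng.standard_normal((nx, ny + 1))
        Ez = rng.standard_normal((nx + 1, ny + 1))   # arbitrary EMF field

        def divB(Bx, By):
            return ((Bx[1:, :] - Bx[:-1, :]) / dx
                    + (By[:, 1:] - By[:, :-1]) / dy)

        div0 = divB(Bx, By)
        # Discrete Faraday's law dB/dt = -curl E, edge-to-face differencing:
        Bx = Bx - dt * (Ez[:, 1:] - Ez[:, :-1]) / dy
        By = By + dt * (Ez[1:, :] - Ez[:-1, :]) / dx
        print("max |div B| change:", np.abs(divB(Bx, By) - div0).max())

    The cross-differences of Ez cancel exactly in the cell divergence, so the printed change is at round-off level; this is the mechanism by which the field stays solenoidal at every time step.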

  6. An induction-based magnetohydrodynamic 3D code for finite magnetic Reynolds number liquid-metal flows in fusion blankets

    Energy Technology Data Exchange (ETDEWEB)

    Kawczynski, Charlie; Smolentsev, Sergey, E-mail: sergey@fusion.ucla.edu; Abdou, Mohamed

    2016-11-01

    Highlights: • A new induction-based magnetohydrodynamic code was developed using a finite difference method. • The code was benchmarked against purely hydrodynamic and MHD flows for low and finite magnetic Reynolds number. • Possible applications of the new code include liquid-metal MHD flows in the breeder blanket during unsteady events in the plasma. - Abstract: Most numerical analysis performed in the past for MHD flows in liquid-metal blankets were based on the assumption of low magnetic Reynolds number and involved numerical codes that utilized electric potential as the main electromagnetic variable. One limitation of this approach is that such codes cannot be applied to truly unsteady processes, for example, MHD flows of liquid-metal breeder/coolant during unsteady events in plasma, such as major plasma disruptions, edge-localized modes and vertical displacements, when changes in plasmas occur at millisecond timescales. Our newly developed code MOONS (Magnetohydrodynamic Object-Oriented Numerical Solver) uses the magnetic field as the main electromagnetic variable to relax the limitations of the low magnetic Reynolds number approximation for more realistic fusion reactor environments. The new code, written in Fortran, implements a 3D finite-difference method and is capable of simulating multi-material domains. The constrained transport method was implemented to evolve the magnetic field in time and assure that the magnetic field remains solenoidal within machine accuracy at every time step. Various verification tests have been performed including purely hydrodynamic flows and MHD flows at low and finite magnetic Reynolds numbers. Test results have demonstrated very good accuracy against known analytic solutions and other numerical data.

  7. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for Optical Code Division Multiplexing (OCDMA) based QKD networks. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities presented by the code, in contrast to the Optical Orthogonal Codes (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  8. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  9. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  10. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...
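
    Read as an information structure, the general event concept (participants, consequences, properties) might be sketched as follows; the field names and the example event are invented for illustration, not taken from the paper.

        from dataclasses import dataclass, field
        from datetime import datetime

        # Minimal information structure for the general event concept:
        # a short-duration process with participants, consequences, and
        # properties that can itself be modeled as data.
        @dataclass
        class Event:
            name: str
            occurred_at: datetime
            participants: list[str] = field(default_factory=list)
            consequences: list[str] = field(default_factory=list)
            properties: dict[str, object] = field(default_factory=dict)

        order = Event(
            name="OrderPlaced",
            occurred_at=datetime(2009, 5, 1, 14, 30),
            participants=["customer:42", "product:SKU-7"],
            consequences=["ReserveStock", "SendConfirmation"],
            properties={"quantity": 3},
        )
        print(order.name, "->", ", ".join(order.consequences))

    The same record serves both views the paper unifies: as a process step it drives a dynamic model (its consequences trigger further activity), and as an information structure it can be stored and queried like any other entity.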

  11. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN have 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ), fully thermal-mechanically coupled, steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. In 2008, the Idaho National Laboratory (INL) began developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation of the codes, the requirements and direction of development for a new FE-based fuel performance code can be discussed, and based on a comparison of the models in FE-based fuel performance codes, the state of the art in the codes can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes have. In particular, specific pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from interatomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well-known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently

  12. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  13. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of LT codes, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
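
    A minimal sketch of the approach, under the assumption of an ideal soliton degree distribution: a Kent (skew tent) map supplies the pseudo-random numbers that choose each encoding symbol's degree and neighbours. The map parameter, seed, and source data are illustrative, not the paper's settings.

        import numpy as np

        K = 16                                    # number of source symbols
        src = np.arange(K, dtype=np.uint8) * 3    # toy source data

        def kent_stream(x, m=0.7):
            # Kent (skew tent) chaotic map; the seed acts as a shared "key",
            # so encoder and decoder can regenerate the same choices.
            while True:
                x = x / m if x < m else (1 - x) / (1 - m)
                yield x

        # Ideal soliton distribution: P(1) = 1/K, P(d) = 1/(d(d-1)), d = 2..K.
        p = np.array([1.0 / K] + [1.0 / (d * (d - 1)) for d in range(2, K + 1)])
        cdf = np.cumsum(p)

        chaos = kent_stream(0.123456)

        def encode_symbol():
            d = int(np.searchsorted(cdf, next(chaos))) + 1   # chaotic degree
            neighbors = set()
            while len(neighbors) < d:                        # chaotic neighbours
                neighbors.add(int(next(chaos) * K))
            value = 0
            for j in neighbors:
                value ^= int(src[j])                         # XOR of neighbours
            return sorted(neighbors), value

        for _ in range(5):
            print(*encode_symbol())

    Replacing the chaotic stream with a linear congruential generator reproduces the conventional implementation the paper compares against; only the source of pseudo-randomness changes, not the LT encoding itself.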

  14. The effects of microhardnesses and friction coefficients of GCr15 and Cr4Mo4V bearing materials by ion implantation

    International Nuclear Information System (INIS)

    Yang Qifa; Xiang Deguang; Lu Haolin

    1988-01-01

    Some experimental results on the microhardnesses and friction coefficients of GCr15 and Cr4Mo4V bearing materials implanted with Cr, Mo, N and B ions are reported in this paper. It is found that the microhardnesses are increased and the friction coefficients are reduced by Cr, Mo, N and B ion implantation for both materials. The friction coefficients of the Cr+Mo+N and Cr+Mo+B ion-implanted samples are reduced to one-third of those of the unimplanted samples

  15. Analysis of an ADS spurious opening event at a BWR/6 by means of the TRACE code

    International Nuclear Information System (INIS)

    Nikitin, Konstantin; Manera, Annalisa

    2011-01-01

    Highlights: → The spurious opening of 8 relief valves of the ADS system in a BWR/6 has been simulated. → The valves opening results in a fast depressurization and significant loads on the RPV internals. → This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the available plant data. - Abstract: The paper presents the results of a post-event analysis of a spurious opening of 8 relief valves of the automatic depressurization system (ADS) occurred in a BWR/6. The opening of the relief valves results in a fast depressurization (pressure blow down) of the primary system which might lead to significant dynamic loads on the RPV and associated internals. In addition, the RPV level swelling caused by the fast depressurization might lead to undesired water carry-over into the steam line and through the safety relief valves (SRVs). Therefore, the transient needs to be characterized in terms of evolution of pressure, temperature and fluid distribution in the system. This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the plant data.

  16. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coder codes any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
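
    The threshold-driven selection can be sketched as a quadtree: a block is coded with a few low-frequency DCT coefficients and split into sub-blocks whenever the resulting distortion exceeds a threshold. The coder below is a simplified stand-in (coefficient truncation instead of vector quantization, SciPy's DCT routines, an invented threshold), not the MBC coder itself.

        import numpy as np
        from scipy.fft import dctn, idctn

        rng = np.random.default_rng(4)

        def code_block(block, keep=3):
            # Keep only DCT coefficients with index sum below `keep`.
            c = dctn(block, norm="ortho")
            mask = np.add.outer(np.arange(block.shape[0]),
                                np.arange(block.shape[1])) < keep
            return idctn(c * mask, norm="ortho")

        def mbc(block, thresh=20.0, min_size=4):
            rec = code_block(block)
            if np.mean((block - rec)**2) <= thresh or block.shape[0] <= min_size:
                return rec                       # coarse coder is good enough
            h = block.shape[0] // 2              # otherwise split into four
            out = np.empty_like(block)
            for r in (0, h):
                for s in (0, h):
                    out[r:r+h, s:s+h] = mbc(block[r:r+h, s:s+h],
                                            thresh, min_size)
            return out

        img = rng.normal(128.0, 30.0, (16, 16))  # stand-in for image data
        rec = mbc(img)
        print("overall MSE:", round(float(np.mean((img - rec)**2)), 1))

    Smooth regions stop at large blocks (few bits), busy regions recurse to small blocks (more bits), which is the variable-rate behavior the abstract describes.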

  17. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. The sensitivity analysis is carried out on the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are picked out. The dominant parameters selected are closely related to the development process of the ULOF event.

  18. Non-Binary Protograph-Based LDPC Codes: Analysis,Enumerators and Designs

    OpenAIRE

    Sun, Yizeng

    2013-01-01

    Non-binary LDPC codes can outperform binary LDPC codes using sum-product algorithm with higher computation complexity. Non-binary LDPC codes based on protographs have the advantage of simple hardware architecture. In the first part of this thesis, we will use EXIT chart analysis to compute the thresholds of different protographs over GF(q). Based on threshold computation, some non-binary protograph-based LDPC codes are designed and their frame error rates are compared with binary LDPC codes. ...

  19. Four year-olds use norm-based coding for face identity.

    Science.gov (United States)

    Jeffery, Linda; Read, Ainsley; Rhodes, Gillian

    2013-05-01

    Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged children also use norm-based coding. We reasoned that the transition to school could be critical in developing a norm-based system because school places new demands on children's face identification skills and substantially increases experience with faces. Consistent with this view, face identification performance improves steeply between ages 4 and 7. We used face identity aftereffects to test whether norm-based coding emerges between these ages. We found that 4 year-old children, like adults, showed larger face identity aftereffects for adaptors far from the average than for adaptors closer to the average, consistent with use of norm-based coding. We conclude that experience prior to age 4 is sufficient to develop a norm-based face-space and that failure to use norm-based coding cannot explain 4 year-old children's poor face identification skills. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Investigations on femtosecond laser modified micro-textured surface with anti-friction property on bearing steel GCr15

    Science.gov (United States)

    Yang, Lijun; Ding, Ye; Cheng, Bai; He, Jiangtao; Wang, Genwang; Wang, Yang

    2018-03-01

    This work puts forward femtosecond laser modification of micro-textured surfaces on bearing steel GCr15 in order to reduce frictional wear and enhance load capacity during its application. Multi-pulse femtosecond laser ablation experiments are carried out to determine the laser spot radius as well as the single-pulse threshold fluence and pulse incubation coefficient of the bulk material. Analytical models are set up in combination with hydrodynamic lubrication theory, and corresponding simulations based on the Navier-Stokes (N-S) equation are carried out to explore the influence of the surface and cross-sectional morphology of the textures on the hydrodynamic lubrication effect. Technological experiments focus on the impact of femtosecond laser machining variables, such as scanning times, scanning velocity, pulse frequency and scanning gap, on the morphology of the grooves, as well as on the realization of the optimized textures proposed by the simulations; the underlying mechanisms are analyzed from multiple perspectives. Results of unidirectional rotating friction tests suggest that a spherical texture with a depth-to-width ratio of 0.2 can significantly improve tribological properties under low-load, low-velocity conditions compared with untextured and other textured surfaces, which also verifies the accuracy of the simulations and the feasibility of femtosecond laser modification of micro-textured surfaces.
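
    The spot radius and single-pulse threshold fluence mentioned above are commonly extracted with the D-squared method for Gaussian beams, D^2 = 2 w0^2 ln(F0/Fth). The sketch below applies that fit to synthetic crater data; the energies and "true" parameters are invented, not the paper's measurements.

        import numpy as np

        # D-squared method: crater diameter D obeys D^2 = 2 w0^2 ln(F0/Fth)
        # for a Gaussian beam, so D^2 is linear in ln(pulse energy).
        w0_true, Fth_true = 15e-6, 0.25                  # m, J/cm^2 (assumed)
        E = np.array([0.5, 1.0, 2.0, 4.0, 8.0]) * 1e-6   # pulse energies, J
        F0 = 2 * E / (np.pi * w0_true**2) / 1e4          # peak fluence, J/cm^2
        D2 = 2 * w0_true**2 * np.log(F0 / Fth_true)      # synthetic D^2, m^2

        slope, intercept = np.polyfit(np.log(E), D2, 1)
        w0 = np.sqrt(slope / 2)                          # spot radius from slope
        Eth = np.exp(-intercept / slope)                 # energy where D^2 -> 0
        Fth = 2 * Eth / (np.pi * w0**2) / 1e4            # threshold fluence
        print(f"w0 = {w0 * 1e6:.1f} um, Fth = {Fth:.2f} J/cm^2")

    Repeating the fit for different pulse numbers N and fitting Fth(N) against N then yields the incubation coefficient the abstract refers to.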

  1. WGS-based surveillance of third-generation cephalosporin-resistant Escherichia coli from bloodstream infections in Denmark

    DEFF Research Database (Denmark)

    Roer, Louise; Hansen, Frank; Thomsen, Martin Christen Frølund

    2017-01-01

    clone, here observed for the first time in Denmark. Additionally, the analysis revealed three individual cases with possible persistence of closely related clones collected more than 13 months apart. Continuous WGS-based national surveillance of 3GC-R Ec , in combination with more detailed......-genome sequenced and characterized by using the batch uploader from the Center for Genomic Epidemiology (CGE) and automatically analysed using the CGE tools according to resistance profile, MLST, serotype and fimH subtype. Additionally, the phylogenetic relationship of the isolates was analysed by SNP analysis......To evaluate a genome-based surveillance of all Danish third-generation cephalosporin-resistant Escherichia coli (3GC-R Ec ) from bloodstream infections between 2014 and 2015, focusing on horizontally transferable resistance mechanisms. A collection of 552 3GC-R Ec isolates were whole...

  2. A hybrid path-oriented code assignment CDMA-based MAC protocol for underwater acoustic sensor networks.

    Science.gov (United States)

    Chen, Huifang; Fan, Guangyu; Xie, Lei; Cui, Jun-Hong

    2013-11-04

    Due to the characteristics of the underwater acoustic channel, media access control (MAC) protocols designed for underwater acoustic sensor networks (UWASNs) are quite different from those for terrestrial wireless sensor networks. Moreover, in a sink-oriented network with event information generation in a sensor field and message forwarding to the sink hop-by-hop, the sensors near the sink have to transmit more packets than those far from the sink, and then a funneling effect occurs, which leads to packet congestion, collisions and losses, especially in UWASNs with long propagation delays. An improved CDMA-based MAC protocol, named path-oriented code assignment (POCA) CDMA MAC (POCA-CDMA-MAC), is proposed for UWASNs in this paper. In the proposed MAC protocol, both the round-robin method and CDMA technology are adopted to make the sink receive packets from multiple paths simultaneously. Since the number of paths for information gathering is much less than the number of nodes, the length of the spreading code used in the POCA-CDMA-MAC protocol is much shorter than that used in CDMA-based protocols with transmitter-oriented code assignment (TOCA) or receiver-oriented code assignment (ROCA). Simulation results show that the proposed POCA-CDMA-MAC protocol achieves a higher network throughput and a lower end-to-end delay compared to other CDMA-based MAC protocols.

  3. A Hybrid Path-Oriented Code Assignment CDMA-Based MAC Protocol for Underwater Acoustic Sensor Networks

    Directory of Open Access Journals (Sweden)

    Huifang Chen

    2013-11-01

    Full Text Available Due to the characteristics of the underwater acoustic channel, media access control (MAC) protocols designed for underwater acoustic sensor networks (UWASNs) are quite different from those for terrestrial wireless sensor networks. Moreover, in a sink-oriented network with event information generation in a sensor field and message forwarding to the sink hop-by-hop, the sensors near the sink have to transmit more packets than those far from the sink, and then a funneling effect occurs, which leads to packet congestion, collisions and losses, especially in UWASNs with long propagation delays. An improved CDMA-based MAC protocol, named path-oriented code assignment (POCA) CDMA MAC (POCA-CDMA-MAC), is proposed for UWASNs in this paper. In the proposed MAC protocol, both the round-robin method and CDMA technology are adopted to make the sink receive packets from multiple paths simultaneously. Since the number of paths for information gathering is much less than the number of nodes, the length of the spreading code used in the POCA-CDMA-MAC protocol is much shorter than that used in CDMA-based protocols with transmitter-oriented code assignment (TOCA) or receiver-oriented code assignment (ROCA). Simulation results show that the proposed POCA-CDMA-MAC protocol achieves a higher network throughput and a lower end-to-end delay compared to other CDMA-based MAC protocols.

  4. Temporal code in the vibrissal system-Part II: Roughness surface discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Farfan, F D [Departamento de BioingenierIa, FACET, Universidad Nacional de Tucuman, INSIBIO - CONICET, CC 327, Postal Code CP 4000 (Argentina); AlbarracIn, A L [Catedra de Neurociencias, Facultad de Medicina, Universidad Nacional de Tucuman (Argentina); Felice, C J [Departamento de BioingenierIa, FACET, Universidad Nacional de Tucuman, INSIBIO - CONICET, CC 327, Postal Code CP 4000 (Argentina)

    2007-11-15

    Previous works have proposed hypotheses about the neural code of the tactile system in the rat. One of them is based on the physical characteristics of the vibrissae, such as their resonance frequencies; another is based on discharge patterns in the trigeminal ganglion. In this work, the purpose is to find a temporal code by analyzing the afferent signals of two vibrissal nerves while the vibrissae sweep surfaces of different roughness. Two levels of pressure between the vibrissa and the contact surface were used. We analyzed the afferent discharge of the DELTA and GAMMA vibrissal nerves. The vibrissa movements were produced using electrical stimulation of the facial nerve. The afferent signals were analyzed using an event detection algorithm based on the Continuous Wavelet Transform (CWT). The algorithm was able to detect events of different durations. The inter-event times detected were calculated for each situation and represented in box plots. This work allowed establishing the existence of a temporal code at the peripheral level.
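
    A CWT-based event detector in this spirit can be sketched with a Ricker-wavelet filter bank and a fixed threshold; the signal, wavelet widths, and threshold below are synthetic stand-ins for the recorded afferent activity, not the paper's algorithm or data.

        import numpy as np

        rng = np.random.default_rng(5)

        fs = 1000                                  # sampling rate, Hz (assumed)
        t = np.arange(2000) / fs
        sig = 0.3 * rng.standard_normal(t.size)    # background "afferent" noise
        for t0, w in [(0.4, 0.004), (1.1, 0.010), (1.6, 0.020)]:
            u = (t - t0) / w                       # three synthetic events of
            sig += 2.0 * (1 - u**2) * np.exp(-u**2 / 2)   # different durations

        def ricker(width_s, n=256):
            # Ricker (Mexican hat) wavelet sampled on n points.
            x = np.arange(n) - n // 2
            a = width_s * fs
            return (1 - (x / a)**2) * np.exp(-(x / a)**2 / 2) / np.sqrt(a)

        # Correlate with several scales, then threshold across scales.
        cwt = np.array([np.convolve(sig, ricker(w), mode="same")
                        for w in (0.004, 0.010, 0.020)])
        hits = np.abs(cwt).max(axis=0) > 3.0
        onsets = np.flatnonzero(np.diff(hits.astype(int)) == 1) + 1
        print("detected event onsets (s):", np.round(t[onsets], 3))

    Matching the filter bank to several widths is what lets a single threshold catch events of different durations, the property the abstract highlights; inter-event times then follow from differencing the detected onsets.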

  5. Temporal code in the vibrissal system-Part II: Roughness surface discrimination

    International Nuclear Information System (INIS)

    Farfan, F D; AlbarracIn, A L; Felice, C J

    2007-01-01

    Previous works have proposed hypotheses about the neural code of the tactile system in the rat. One of them is based on the physical characteristics of the vibrissae, such as their resonance frequencies; another is based on discharge patterns in the trigeminal ganglion. In this work, the purpose is to find a temporal code by analyzing the afferent signals of two vibrissal nerves while the vibrissae sweep surfaces of different roughness. Two levels of pressure between the vibrissa and the contact surface were used. We analyzed the afferent discharge of the DELTA and GAMMA vibrissal nerves. The vibrissa movements were produced using electrical stimulation of the facial nerve. The afferent signals were analyzed using an event detection algorithm based on the Continuous Wavelet Transform (CWT). The algorithm was able to detect events of different durations. The inter-event times detected were calculated for each situation and represented in box plots. This work allowed establishing the existence of a temporal code at the peripheral level

  6. Cellular track model of biological damage to mammalian cell cultures from galactic cosmic rays

    International Nuclear Information System (INIS)

    Cucinotta, F.A.; Katz, R.; Wilson, J.W.; Townsend, L.W.; Nealy, J.E.; Shinn, J.L.

    1991-02-01

    The assessment of biological damage from galactic cosmic rays (GCR) is of current interest for exploratory-class space missions, where the highly ionizing, high-energy and high-charge (HZE) particles are the major concern. The relative biological effectiveness (RBE) values determined by ground-based experiments with HZE particles are well described by a parametric track theory of cell inactivation. Using the track model and a deterministic GCR transport code, the biological damage to mammalian cell cultures is considered for 1 year in free space at solar minimum for typical spacecraft shielding. Included are the effects of projectile and target fragmentation. The RBE values for the GCR spectrum, which are fluence-dependent in the track model, are found to be more severe than the quality factors identified by the International Commission on Radiological Protection publication 26 and seem to obey a simple scaling law with the duration period in free space
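
    The Poisson-hit picture underlying such fluence-dependent survival calculations can be illustrated in a few lines; the sensitive area and per-traversal kill probability below are hypothetical placeholders, not fitted track-model parameters.

        import numpy as np

        # Poisson-distributed particle traversals of a sensitive area A:
        # if each traversal kills the cell with probability p, survival is
        # S(F) = exp(-p * F * A), which is fluence-dependent by construction.
        A = 100e-8            # sensitive area, cm^2 (100 um^2, assumed)
        p = 0.3               # kill probability per traversal (assumed)

        for F in (1e5, 1e6, 1e7):                  # fluence, particles/cm^2
            mean_hits = F * A
            survival = np.exp(-p * mean_hits)
            print(f"F = {F:.0e} /cm^2: mean hits = {mean_hits:.2f}, "
                  f"S = {survival:.3f}")

    Because effectiveness per particle saturates as hits per cell accumulate, RBE inferred from such curves depends on fluence, which is why the track model's RBE values scale with mission duration rather than being a single constant.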

  7. Cellular track model of biological damage to mammalian cell cultures from galactic cosmic rays

    Science.gov (United States)

    Cucinotta, Francis A.; Katz, Robert; Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Shinn, Judy L.

    1991-01-01

    The assessment of biological damage from galactic cosmic rays (GCR) is of current interest for exploratory-class space missions, where the highly ionizing, high-energy and high-charge (HZE) particles are the major concern. The relative biological effectiveness (RBE) values determined by ground-based experiments with HZE particles are well described by a parametric track theory of cell inactivation. Using the track model and a deterministic GCR transport code, the biological damage to mammalian cell cultures is considered for 1 year in free space at solar minimum for typical spacecraft shielding. Included are the effects of projectile and target fragmentation. The RBE values for the GCR spectrum, which are fluence-dependent in the track model, are found to be more severe than the quality factors identified by the International Commission on Radiological Protection publication 26 and seem to obey a simple scaling law with the duration period in free space.

  8. Predictive values of diagnostic codes for identifying serious hypocalcemia and dermatologic adverse events among women with postmenopausal osteoporosis in a commercial health plan database.

    Science.gov (United States)

    Wang, Florence T; Xue, Fei; Ding, Yan; Ng, Eva; Critchlow, Cathy W; Dore, David D

    2018-04-10

    Post-marketing safety studies of medicines often rely on administrative claims databases to identify adverse outcomes following drug exposure. Valid ascertainment of outcomes is essential for accurate results. We aim to quantify the validity of diagnostic codes for serious hypocalcemia and dermatologic adverse events from insurance claims data among women with postmenopausal osteoporosis (PMO). We identified potential cases of serious hypocalcemia and dermatologic events through ICD-9 diagnosis codes among women with PMO within claims from a large US healthcare insurer (June 2005-May 2010). A physician adjudicated potential hypocalcemic and dermatologic events identified from the primary position on emergency department (ED) or inpatient claims through medical record review. Positive predictive values (PPVs) and 95% confidence intervals (CIs) quantified the fraction of potential cases that were confirmed. Among 165,729 patients with PMO, medical charts were obtained for 40 of 55 (73%) potential hypocalcemia cases; 16 were confirmed (PPV 40%, 95% CI 25-57%). The PPV was higher for ED than inpatient claims (82 vs. 24%). Among 265 potential dermatologic events (primarily urticaria or rash), we obtained 184 (69%) charts and confirmed 128 (PPV 70%, 95% CI 62-76%). The PPV was higher for ED than inpatient claims (77 vs. 39%). Diagnostic codes for hypocalcemia and dermatologic events may be sufficient to identify events giving rise to emergency care, but are less accurate for identifying events within hospitalizations.
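
    As a worked check of the record's headline numbers: the hypocalcemia PPV is 16/40 = 40%, and a 95% Wilson score interval reproduces the reported 25-57% CI to within rounding (the authors may have used a different interval method):

```python
from math import sqrt

def ppv_wilson(confirmed, total, z=1.96):
    """Positive predictive value with a 95% Wilson score interval."""
    p = confirmed / total
    denom = 1 + z ** 2 / total
    centre = (p + z ** 2 / (2 * total)) / denom
    half = z * sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2)) / denom
    return p, centre - half, centre + half

print(ppv_wilson(16, 40))   # -> (0.40, 0.264, 0.554), versus the reported 25-57%
```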

  9. Development of the DTNTES code

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Morales Dorado, M.D.; Alonso Santos, A.

    1987-01-01

    The DTNTES code has been developed in the Department of Nuclear Technology of the Polytechnic University of Madrid as a part of the Research Program on Quantitative Risk Analysis. The DTNTES code calculates several time-dependent probabilistic characteristics of basic events, minimal cut sets, and the top event of a fault tree. The code assumes that basic events are statistically independent and that they have failure and repair distributions. It computes the minimal cut set upper bound approximation for the top event unavailability, and the time-dependent unreliability of the top event by means of different methods selected by the user. These methods are: expected number of system failures, failure rate, Barlow-Proschan bound, steady-state upper bound, and the T* method. (author)
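
    The minimal cut set upper bound mentioned in the record is Q_top(t) <= 1 - prod_j (1 - prod_{i in C_j} q_i(t)). A minimal sketch under assumed inputs (the cut sets and per-event unavailabilities below are toy values, not DTNTES internals):

```python
import numpy as np

def mcub(cut_sets, q):
    """Minimal cut set upper bound for time-dependent top event unavailability.

    cut_sets : list of lists of basic-event indices (the minimal cut sets)
    q        : array (n_events, n_times) of basic-event unavailabilities q_i(t)
    """
    Q = np.ones(q.shape[1])
    for cs in cut_sets:
        Q *= 1.0 - np.prod(q[cs, :], axis=0)   # multiply the (1 - Q_cutset(t)) terms
    return 1.0 - Q

# two repairable basic events: q(t) = lam/(lam+mu) * (1 - exp(-(lam+mu) t))
t = np.linspace(0.0, 1000.0, 5)
lam, mu = 1e-3, 1e-1                           # failure and repair rates [1/h]
qt = lam / (lam + mu) * (1.0 - np.exp(-(lam + mu) * t))
q = np.vstack([qt, qt])
print(mcub([[0], [0, 1]], q))                  # cut sets {e0} and {e0, e1}
```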

  10. Evaluating shielding effectiveness for reducing space radiation cancer risks

    International Nuclear Information System (INIS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2006-01-01

    We discuss calculations of probability distribution functions (PDF) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPE). The PDFs are used in significance tests for evaluating the effectiveness of potential radiation shielding approaches. Uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments are considered in models of cancer risk PDFs. Competing mortality risks and functional correlations in radiation quality factor uncertainties are included in the calculations. We show that the cancer risk uncertainty, defined as the ratio of the upper value of the 95% confidence interval (CI) to the point estimate, is about 4-fold for lunar and Mars mission risk projections. For short-stay lunar missions (<180 d) the risks are dominated by SPEs and can be mitigated by shielding; for long-stay lunar (>180 d) or Mars missions, GCR risks may exceed radiation risk limits that are based on acceptable levels of risk. For example, the upper 95% CI exceeds a 10% fatal risk for males and females on a Mars mission. For reducing GCR cancer risks, shielding materials are marginally effective because of the penetrating nature of GCR and secondary radiation produced in tissue by relativistic particles. At the present time, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding, based on a significance test that accounts for radiobiology uncertainties in GCR risk projection.
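
    A hedged sketch of the kind of Monte Carlo propagation the record describes: sample the uncertain factors, form the risk PDF, and report the fold-uncertainty as the ratio of the upper 95% CI value to the point estimate. Every distribution and value below is an illustrative assumption, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
risk_coeff = rng.lognormal(0.0, 0.3, n)   # epidemiology risk-coefficient factor
ddref      = rng.uniform(1.0, 3.0, n)     # dose and dose-rate reduction factor
quality    = rng.lognormal(0.0, 0.5, n)   # radiation quality factor uncertainty
physics    = rng.normal(1.0, 0.15, n)     # environment/transport model factor

risk = risk_coeff * quality * physics / ddref       # relative risk samples
fold = np.percentile(risk, 97.5) / np.median(risk)  # upper 95% CI over point estimate
print("fold-uncertainty:", round(fold, 2))
```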

  11. Folklore in bureaucracy code: Running a music event

    Directory of Open Access Journals (Sweden)

    Krstanović-Lukić Miroslava

    2004-01-01

    Full Text Available A piece of folk-created music is a construction that functions as a paradigm within the bureaucratic system and the public arena. Such a work is a mechanical concept, which defines inheritance as a construction of authenticity saturated with elements of folk and national culture. It is also subject to certain conventions in the system of regulations; namely, it is a part of the administrative code. The use of the folk-created work as a paradigm, and its regulation, is realized through an organizational apparatus; that is, it becomes entertainment, a spectacle. This paper analyzes the functioning of the organizational machinery of a folk spectacle, starting with the government authorities, local self-management, and the spectacle's administrative committees. To illustrate this phenomenon, the paper presents the development of a trumpet-playing festival in Dragačevo. This particular festival establishes a cultural, economic and political order with a clear and defined division of power. The analysis shows that the folk event in question, through its programs and activities, represents a scene and arena of individual and group interests. Organizational interactions are recognized in binary oppositions: sovereignty/dependency, official/unofficial, dominance/subordination, innovative/inherited, common/different, needed/useful, original/copy, one's own/someone else's.

  12. Management of operational events in research reactor

    International Nuclear Information System (INIS)

    Zhong Heping; Yang Shuchun; Peng Xueming

    2001-01-01

    The author describes the follow-up management process for operational events in a research reactor based on the nuclear safety code, against the background of the research reactor at the Nuclear Power Institute of China. It presents the specific measures for event tracing and sums up its management factors.

  13. Validation of the thermal-hydraulic system code ATHLET based on selected pressure drop and void fraction BFBT tests

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Escalante, Javier Jimenez; Espinoza, Victor Sanchez

    2015-07-15

    Highlights: • Simulation of BFBT-BWR steady-state and transient tests with ATHLET. • Validation of thermal-hydraulic models based on pressure drops and void fraction measurements. • TRACE system code is used for the comparative study. • Predictions result in a good agreement with the experiments. • Discrepancies are smaller than or comparable with the measurement uncertainty. - Abstract: Validation and qualification of thermal-hydraulic system codes based on separate effect tests are essential for the reliability of numerical tools when applied to nuclear power plant analyses. For this purpose, the Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in various validation and qualification activities of different CFD, sub-channel and system codes. In this paper, the capabilities of the thermal-hydraulic code ATHLET are assessed based on the experimental results provided within the NUPEC BFBT benchmark related to key Boiling Water Reactor (BWR) phenomena. Void fraction and pressure drop measurements in the BFBT bundle performed under steady-state and transient conditions which are representative of, e.g., turbine trip and recirculation pump trip events are compared with the numerical results of ATHLET. The comparison of code predictions with the BFBT data shows good agreement given the experimental uncertainty, and the results are consistent with the trends obtained with similar thermal-hydraulic codes.

  14. Four Year-Olds Use Norm-Based Coding for Face Identity

    Science.gov (United States)

    Jeffery, Linda; Read, Ainsley; Rhodes, Gillian

    2013-01-01

    Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged…

  15. Extreme sea-level events in coastal regions

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A.S.

    that the outcome of the project has been a code that is capable of predicting correct trends more often (15 out of 20) than the other ‘black box’ codes in operation at various agencies. U. N. SINHA CSIR-Centre for Mathematical Modelling and Computer... of the extreme climate events. Their past trends, future projections and vulnerabi- lity and adaptation to such events are discussed in the report. The report was based on the efforts of both the working groups of the IPCC, WG I, which deals with the science...

  16. Breaking the Code: The Creative Use of QR Codes to Market Extension Events

    Science.gov (United States)

    Hill, Paul; Mills, Rebecca; Peterson, GaeLynn; Smith, Janet

    2013-01-01

    The use of smartphones has drastically increased in recent years, heralding an explosion in the use of QR codes. The black and white square barcodes that link the physical and digital world are everywhere. These simple codes can provide many opportunities to connect people in the physical world with many of Extension online resources. The…

  17. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
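
    The traversal the record describes (multiply split fractions along each path, prune pathways below a cutoff frequency, bin the outcomes) can be sketched as follows; the data structure is an illustration of the idea, not EVNTRE's input format:

```python
# Minimal generalized event tree walk: each question is a list of
# (branch_label, split_fraction) pairs whose fractions sum to unity.
def walk(questions, freq, path=(), cutoff=1e-10, bins=None):
    if bins is None:
        bins = {}
    if freq < cutoff:                       # terminate unlikely pathways early
        return bins
    if not questions:
        key = tuple(path)                   # a leaf: one accident progression
        bins[key] = bins.get(key, 0.0) + freq
        return bins
    for label, fraction in questions[0]:
        walk(questions[1:], freq * fraction, path + (label,), cutoff, bins)
    return bins

tree = [[("no-SCRAM", 0.01), ("SCRAM", 0.99)],
        [("vessel-fails", 0.1), ("vessel-holds", 0.9)]]
print(walk(tree, freq=1.0))
```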

  18. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, Exploration, and Human Health and Safety

    Science.gov (United States)

    Koontz, Steve

    2015-01-01

    In this presentation a review of galactic cosmic ray (GCR) effects on microelectronic systems and human health and safety is given. The methods used to evaluate and mitigate unwanted cosmic ray effects in ground-based, atmospheric flight, and space flight environments are also reviewed. However, not all GCR effects are undesirable. We will also briefly review how observation and analysis of GCR interactions with planetary atmospheres and surfaces can reveal important compositional and geophysical data on Earth and elsewhere. About 1000 GCR particles enter every square meter of Earth's upper atmosphere every second, roughly the same number striking every square meter of the International Space Station (ISS) and every other low-Earth orbit spacecraft. GCR particles are high-energy ionized atomic nuclei (90% protons, 9% alpha particles, 1% heavier nuclei) traveling very close to the speed of light. The GCR particle flux is even higher in interplanetary space because the geomagnetic field provides some limited magnetic shielding. Collisions of GCR particles with atomic nuclei in planetary atmospheres and/or regolith as well as spacecraft materials produce nuclear reactions and energetic/highly penetrating secondary particle showers. Three twentieth century technology developments have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems and to assess effects on human health and safety. The key technology developments are: 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems. Space and geophysical exploration needs drove the development of the instruments and analytical tools needed to recover compositional and structural data from GCR induced nuclear reactions and secondary particle showers. Finally, the

  19. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes

    Science.gov (United States)

    Lin, Shu

    1998-01-01

    A code trellis is a graphical representation of a code, block or convolutional, in which every path represents a codeword (or a code sequence for a convolutional code). This representation makes it possible to implement maximum likelihood decoding (MLD) of a code with reduced decoding complexity. The most well known trellis-based MLD algorithm is the Viterbi algorithm. The trellis representation was first introduced and used for convolutional codes [23]. This representation, together with the Viterbi decoding algorithm, has resulted in a wide range of applications of convolutional codes for error control in digital communications over the last two decades. Research on trellis representations of block codes, by contrast, long remained inactive. There are two major reasons for this inactive period of research in this area. First, most coding theorists at that time believed that block codes did not have simple trellis structure like convolutional codes, and that maximum likelihood decoding of linear block codes using the Viterbi algorithm was practically impossible, except for very short block codes. Second, since almost all of the linear block codes are constructed algebraically or based on finite geometries, it was the belief of many coding theorists that algebraic decoding was the only way to decode these codes. These two reasons seriously hindered the development of efficient soft-decision decoding methods for linear block codes and their applications to error control in digital communications. This led to a general belief that block codes are inferior to convolutional codes and hence, that they were not useful. Chapter 2 gives a brief review of linear block codes. The goal is to provide the essential background material for the development of trellis structure and trellis-based decoding algorithms for linear block codes in the later chapters. Chapters 3 through 6 present the fundamental concepts, finite-state machine model, state space formulation, basic structural properties, state labeling, construction procedures, complexity, minimality, and
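
    As an illustration of trellis-based MLD for a block code, the sketch below builds the syndrome (Wolf-style) trellis of the (7,4) Hamming code from its parity-check matrix and runs the Viterbi algorithm with a Hamming branch metric. It is a textbook-style illustration, not the book's own notation:

```python
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])   # (7,4) Hamming parity-check matrix

def viterbi_decode(r, H):
    """Hard-decision MLD: every trellis path ending in syndrome 0 is a codeword."""
    rows, n = H.shape
    n_states = 2 ** rows
    cols = [int("".join(map(str, H[:, i])), 2) for i in range(n)]
    INF = float("inf")
    metric = [INF] * n_states
    metric[0] = 0.0
    survivor = [[None] * n_states for _ in range(n)]
    for i in range(n):
        new = [INF] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue
            for bit in (0, 1):
                t = s ^ (cols[i] if bit else 0)   # partial-syndrome update
                m = metric[s] + (bit != r[i])     # Hamming branch metric
                if m < new[t]:
                    new[t] = m
                    survivor[i][t] = (s, bit)
        metric = new
    state, bits = 0, []                 # trace back from the all-zero syndrome
    for i in range(n - 1, -1, -1):
        state, bit = survivor[i][state]
        bits.append(bit)
    return bits[::-1]

print(viterbi_decode([1, 0, 1, 1, 1, 1, 1], H))   # corrects the single bit error
```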

  20. Methods to Load Balance a GCR Pressure Solver Using a Stencil Framework on Multi- and Many-Core Architectures

    Directory of Open Access Journals (Sweden)

    Milosz Ciznicki

    2015-01-01

    Full Text Available The recent advent of novel multi- and many-core architectures forces application programmers to deal with hardware-specific implementation details and to be familiar with software optimisation techniques to benefit from new high-performance computing machines. Extra care must be taken for communication-intensive algorithms, which may be a bottleneck for the forthcoming era of exascale computing. This paper aims to present a high-level stencil framework implemented for the EULerian or LAGrangian model (EULAG) that efficiently utilises multi- and many-core architectures. Only an efficient usage of both many-core processors (CPUs) and graphics processing units (GPUs) with a flexible data decomposition method can lead to the maximum performance that scales the communication-intensive Generalized Conjugate Residual (GCR) elliptic solver with preconditioner.
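
    For readers unfamiliar with the solver being load-balanced, a textbook restarted GCR iteration looks roughly as follows; this is a plain NumPy sketch, not EULAG's preconditioned implementation:

```python
import numpy as np

def gcr(A, b, x0=None, tol=1e-8, restart=20, maxiter=200):
    """Restarted Generalized Conjugate Residual iteration for A x = b."""
    x = np.zeros(b.size) if x0 is None else x0.copy()
    r = b - A @ x
    while np.linalg.norm(r) > tol * np.linalg.norm(b) and maxiter > 0:
        P, AP = [], []                      # search directions and their images
        for _ in range(restart):
            maxiter -= 1
            p, Ap = r.copy(), A @ r
            for pj, Apj in zip(P, AP):      # A-orthogonalize against old directions
                beta = (Ap @ Apj) / (Apj @ Apj)
                p -= beta * pj
                Ap -= beta * Apj
            alpha = (r @ Ap) / (Ap @ Ap)
            x += alpha * p
            r -= alpha * Ap
            P.append(p); AP.append(Ap)
            if np.linalg.norm(r) <= tol * np.linalg.norm(b) or maxiter == 0:
                break
    return x

A = np.diag([4.0, 3.0, 2.0, 1.0]) + 0.1     # small symmetric test system
print(gcr(A, np.ones(4)))
```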

  1. Mesh-based parallel code coupling interface

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, K.; Steckel, B. (eds.) [GMD - Forschungszentrum Informationstechnik GmbH, St. Augustin (DE). Inst. fuer Algorithmen und Wissenschaftliches Rechnen (SCAI)

    2001-04-01

    MpCCI (mesh-based parallel code coupling interface) is an interface for multidisciplinary simulations. It provides industrial end-users as well as commercial code-owners with the facility to combine different simulation tools in one environment. Thereby new solutions for multidisciplinary problems will be created. This opens new application dimensions for existent simulation tools. This Book of Abstracts gives a short overview about ongoing activities in industry and research - all presented at the 2{sup nd} MpCCI User Forum in February 2001 at GMD Sankt Augustin. (orig.)

  2. Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators

    Science.gov (United States)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-03-01

    We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N²) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.

  3. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. Those concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; the application of the concepts of the System Based Code to design will lead to an entire change of practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design both in terms of reliability and cost effectiveness by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  4. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    The code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and the FORTRAN code generator (FCG), based on C#, was developed in this paper. FCG can generate module variable definition FORTRAN code automatically according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the core and system integrated engine for design and analysis (COSINE) software development. The result shows that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
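
    The idea of generating FORTRAN variable-definition code from metadata can be sketched as below. Note that FCG itself is written in C#; this Python stand-in and its metadata schema are invented for illustration only:

```python
# Toy metadata-driven generator: emit a FORTRAN module with variable
# declarations (and allocatable arrays) from a list of metadata entries.
variables = [
    {"name": "power",  "type": "real(8)", "dims": None, "desc": "core power [W]"},
    {"name": "t_cool", "type": "real(8)", "dims": "nz", "desc": "coolant temperature"},
    {"name": "nz",     "type": "integer", "dims": None, "desc": "axial nodes"},
]

def emit_module(name, variables):
    lines = [f"module {name}_data", "  implicit none"]
    for v in variables:
        decl = (f", allocatable :: {v['name']}(:)" if v["dims"]
                else f" :: {v['name']}")
        lines.append(f"  {v['type']}{decl}  ! {v['desc']}")
    lines.append(f"end module {name}_data")
    return "\n".join(lines)

print(emit_module("thermal", variables))
```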

  5. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    Science.gov (United States)

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
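
    The selection criterion the record states, minimizing the ratio of the maximum to the minimum pairwise Hamming distance over a constant-weight code, can be made concrete at toy scale; the exhaustive search below is for illustration only, not the paper's method:

```python
from itertools import combinations

def constant_weight_words(n, w):
    """All length-n binary words of Hamming weight w, as integers."""
    for ones in combinations(range(n), w):
        yield sum(1 << i for i in ones)

def distance_ratio(codewords):
    dists = [bin(a ^ b).count("1") for a, b in combinations(codewords, 2)]
    return max(dists) / min(dists)

# pick 4 codewords of length 6, weight 3, with the smallest (best) ratio
best = min(combinations(list(constant_weight_words(6, 3)), 4), key=distance_ratio)
print([f"{c:06b}" for c in best], distance_ratio(best))
```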

  6. Coding Transparency in Object-Based Video

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    A novel algorithm for coding gray level alpha planes in object-based video is presented. The scheme is based on segmentation in multiple layers. Different coders are specifically designed for each layer. In order to reduce the bit rate, cross-layer redundancies as well as temporal correlation are...

  7. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  8. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  9. Computing eigenvalue sensitivity coefficients to nuclear data based on the CLUTCH method with RMC code

    International Nuclear Information System (INIS)

    Qiu, Yishu; She, Ding; Tang, Xiao; Wang, Kan; Liang, Jingang

    2016-01-01

    Highlights: • A new algorithm is proposed to reduce memory consumption for sensitivity analysis. • The fission matrix method is used to generate adjoint fission source distributions. • Sensitivity analysis is performed on a detailed 3D full-core benchmark with RMC. - Abstract: Recently, there is a need to develop advanced methods of computing eigenvalue sensitivity coefficients to nuclear data in continuous-energy Monte Carlo codes. One of these methods is the iterated fission probability (IFP) method, which is adopted by most Monte Carlo codes having the capability of computing sensitivity coefficients, including the Reactor Monte Carlo code RMC. Though it is theoretically accurate, the IFP method faces the challenge of huge memory consumption. Therefore, it may sometimes produce poor sensitivity coefficients, since the number of particles in each active cycle is not sufficient due to the limitation of computer memory capacity. In this work, two algorithms of the Contribution-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) method, namely, the collision-event-based algorithm (C-CLUTCH), which is also implemented in SCALE, and the fission-event-based algorithm (F-CLUTCH), which is put forward in this work, are investigated and implemented in RMC to reduce memory requirements for computing eigenvalue sensitivity coefficients. While the C-CLUTCH algorithm requires storing the relevant reaction rates of every collision, the F-CLUTCH algorithm only stores the relevant reaction rates of every fission point. In addition, the fission matrix method is put forward to generate the adjoint fission source distribution for the CLUTCH method to compute sensitivity coefficients. These newly proposed approaches implemented in the RMC code are verified by a SF96 lattice model and the MIT BEAVRS benchmark problem. The numerical results indicate the accuracy of the F-CLUTCH algorithm is the same as the C

  10. Image content authentication based on channel coding

    Science.gov (United States)

    Zhang, Fan; Xu, Lei

    2008-03-01

    Content authentication determines whether an image has been tampered with or not and, if necessary, locates malicious alterations made on the image. Authentication of a still image or a video is motivated by the recipient's interest, and its principle is that a receiver must be able to identify the source of the document reliably. Several techniques and concepts based on data hiding or steganography have been designed as a means for image authentication. This paper presents a color image authentication algorithm based on convolutional coding. The high bits of the color digital image are coded by convolutional codes for tamper detection and localization. The authentication messages are hidden in the low bits of the image in order to keep the authentication invisible. All communications channels are subject to errors introduced by additive Gaussian noise in their environment. Data perturbations cannot be eliminated, but their effect can be minimized by the use of Forward Error Correction (FEC) techniques in the transmitted data stream and decoders in the receiving system that detect and correct bits in error. The message of each pixel is convolutionally encoded with the encoder. After the process of parity check and block interleaving, the redundant bits are embedded in the image offset. Tampering can be detected and restored without accessing the original image.
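
    A toy sketch of the scheme's shape: protect the MSB plane with a rate-1/2 convolutional encoder (generators (7,5) in octal, an assumed choice) and hide the parity stream in the LSB plane. Real schemes add the interleaving and error correction the record describes:

```python
import numpy as np

G2 = 0b101                                   # parity-branch generator taps (assumed)

def conv_parity(bits):
    """Second output stream of a rate-1/2 encoder, constraint length 3."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | int(b)) & 0b111
        out.append(bin(state & G2).count("1") % 2)
    return np.array(out, dtype=np.uint8)

def embed(img):
    parity = conv_parity(((img >> 7) & 1).flatten())
    return (img & 0xFE) | parity.reshape(img.shape)   # overwrite the LSB plane

def verify(img):
    parity = conv_parity(((img >> 7) & 1).flatten())
    return np.array_equal(parity.reshape(img.shape), img & 1)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
stego = embed(img)
tampered = stego.copy()
tampered[3, 3] ^= 0x80                       # flip one pixel's protected MSB
print(verify(stego), verify(tampered))       # True False
```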

  11. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  12. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  13. US/FRG umbrella agreement for cooperation in GCR Development. Fuel, fission products, and graphite subprogram. Quarterly status report, July 1, 1982-September 30, 1982

    International Nuclear Information System (INIS)

    Turner, R.F.

    1982-10-01

    This report describes the status of the cooperative work being performed in the Fuel, Fission Product, and Graphite Subprogram under the HTR-Implementing Agreement of the United States/Federal Republic of Germany Umbrella Agreement for Cooperation in GCR Development. The status is described relative to the commitments in the Subprogram Plan for Fuel, Fission Products, and Graphite, Revision 5, April 1982. The work described was performed during the period July 1, 1982 through September 30, 1982 in the HTGR Base Technology Program at Oak Ridge National Laboratory, the HTGR Fuel and Plant Technology Programs at General Atomic Company (GA), and the Project HTR-Brennstoffkreislauf of the Entwicklungsgemeinschaft HTR at KFA Julich, HRB Mannheim, HOBEG Hanau, and SIGRI Meitingen. The requirement for and format of this quarterly status report are specified in the HTR Implementing Agreement procedures for cooperation. Responsibility for preparation of the quarterly report alternates between GA and KFA

  14. Design of Multiple Trellis-Coded Multi-h CPM Based on Super Trellis

    Directory of Open Access Journals (Sweden)

    X. Liu, A. Liu

    2012-12-01

    Full Text Available It has been shown that multiple trellis codes can perform better than conventional trellis codes over AWGN channels, at the cost of additional computations per trellis branch. Multiple trellis coded multi-h CPM schemes have been shown in the literature to have attractive power-bandwidth performance at the expense of increased receiver complexity. In this method, the multi-h format is associated with a specific pattern and repeated, rather than cyclically changed in time for successive symbol intervals, resulting in a longer effective length of the error event and better performance. It is well known that rate (n-1)/n multiple trellis codes combined with 2^n-level CPM have good power-bandwidth performance. In this paper, a scheme combining rate 1/2 and 2/3 multiple trellis codes with 4- and 8-level multi-h CPM is shown to have better power-bandwidth performance over the upper bound than the scheme with a single h.

  15. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
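
    A minimal sketch of the event-based control idea on top of a DEVS-style atomic model: the controller opens a time window when it issues a command and treats expiry of the window without a confirming sensor event as a fault. The class shape below is illustrative, not a standard DEVS library API:

```python
class CommandSupervisor:
    """DEVS-style atomic model with phases 'idle', 'awaiting', 'fault'."""
    def __init__(self, window=5.0):
        self.phase = "idle"
        self.sigma = float("inf")   # time advance: time until next internal event
        self.window = window

    def ext_transition(self, elapsed, event):
        """External event: a command was issued or a sensor confirmation arrived."""
        if self.phase == "idle" and event == "start-cmd":
            self.phase, self.sigma = "awaiting", self.window   # open the window
        elif self.phase == "awaiting" and event == "sensor-ok":
            self.phase, self.sigma = "idle", float("inf")      # confirmed in time
        else:
            self.sigma -= elapsed   # stay in phase; the window keeps counting down

    def int_transition(self):
        """Fires when sigma expires: the expected confirmation never arrived."""
        self.phase, self.sigma = "fault", float("inf")

    def output(self):
        """Emitted just before the internal transition."""
        return "alarm" if self.phase == "awaiting" else None
```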

  16. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes is designed. The RC-LDPC codes are constructed by a non-greedy puncturing method which shows good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel is proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is lowered. Simulations show that an increasingly significant coding gain can be obtained by the proposed ACM system with higher throughput.

  17. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL's research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host based network monitoring tools work on polling, which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.

  18. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Setiadipura, Topan; Obara, Toru

    2014-01-01

    Highlights: • A new Monte Carlo-based fuel management code for OTTO cycle pebble bed reactors was developed. • The double heterogeneity was modeled using a statistical method in the MVP-BURN code. • The code can perform analysis of the equilibrium and non-equilibrium phases. • Code-to-code comparisons for the Once-Through-Then-Out case were investigated. • The ability of the code to accommodate the void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables a simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code to be able to simulate the OTTO cycle of a PBR. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis at the equilibrium condition of a simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of diffusion-based PBR fuel management codes, namely the VSOP and PEBBED codes. Using the JENDL-4.0 nuclide library, MCPBR gave a 4.15% and 3.32% lower keff value compared to VSOP and PEBBED, respectively, while using JENDL-3.3, MCPBR gave a 2.22% and 3.11% higher keff value compared to VSOP and PEBBED, respectively. The ability of MCPBR to analyze neutron transport in the top void of the PBR core and its effects was also confirmed

  19. New developments in file-based infrastructure for ATLAS event selection

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D M [Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nowak, M, E-mail: gemmeren@anl.go [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2010-04-01

    In ATLAS software, TAGs are event metadata records that can be stored in various technologies, including ROOT files and relational databases. TAGs are used to identify and extract events that satisfy certain selection predicates, which can be coded as SQL-style queries. TAG collection files support in-file metadata to store information describing all events in the collection. Event Selector functionality has been augmented to provide such collection-level metadata to subsequent algorithms. The ATLAS I/O framework has been extended to allow computational processing of TAG attributes to select or reject events without reading the event data. This capability enables physicists to use more detailed selection criteria than are feasible in an SQL query. For example, the TAGs contain enough information not only to check the number of electrons, but also to calculate their distance to the closest jet, a calculation that would be difficult to express in SQL. Another new development allows ATLAS to write TAGs directly into event data files. This feature can improve performance by supporting advanced event selection capabilities, including computational processing of TAG information, without the need for external TAG file or database access.
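
    The electron-to-closest-jet example can be sketched as a computational predicate over TAG attributes; the attribute names below are invented for illustration and are not the ATLAS TAG schema:

```python
from math import hypot, pi

def delta_r(eta1, phi1, eta2, phi2):
    """Standard angular separation, with phi wrapped onto [-pi, pi]."""
    dphi = abs(phi1 - phi2)
    if dphi > pi:
        dphi = 2 * pi - dphi
    return hypot(eta1 - eta2, dphi)

def accept(tag):
    """Beyond SQL-style counting: require an electron isolated from all jets."""
    if tag["n_electron"] < 1:
        return False
    closest = min(
        delta_r(tag["el_eta"], tag["el_phi"], jeta, jphi)
        for jeta, jphi in zip(tag["jet_eta"], tag["jet_phi"])
    )
    return closest > 0.4

tag = {"n_electron": 1, "el_eta": 0.3, "el_phi": 1.2,
       "jet_eta": [0.5, -1.0], "jet_phi": [2.8, 0.4]}
print(accept(tag))   # decided from metadata alone, without reading event data
```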

  20. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data-rate. In addition to low-bandwidth communications requirements, the low weight, low-power and high-speed make them ideally suitable to meeting the demanding

  1. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  2. A neutron spectrum unfolding code based on iterative procedures

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    In this work, version 3.0 of the neutron spectrum unfolding code called Neutron Spectrometry and Dosimetry from Universidad Autonoma de Zacatecas (NSDUAZ) is presented. This code was designed with a graphical interface in the LabVIEW programming environment and is based on the iterative SPUNIT algorithm, using as input data only the count rates obtained with 7 Bonner spheres based on a 6LiI(Eu) neutron detector. The main features of the code are: it is intuitive and friendly to the user, and it has a programming routine which automatically selects the initial guess spectrum by using a set of neutron spectra compiled by the International Atomic Energy Agency. Besides the neutron spectrum, this code calculates the total flux, the mean energy, H*(10), h*(10), 15 dosimetric quantities for radiation protection purposes, and 7 survey meter responses, in four energy grids, based on the International Atomic Energy Agency compilation. The code generates a full report in HTML format with all relevant information. In this work, the neutron spectrum of a 241AmBe neutron source in air, located at 150 cm from the detector, is unfolded. (Author)
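
    The record does not reproduce the SPUNIT update itself; the sketch below shows a SAND-II-style multiplicative iterative unfolding of the kind such codes use, with a toy response matrix (whether SPUNIT uses exactly this weighting is an assumption):

```python
import numpy as np

def unfold(R, counts, phi0, iters=200):
    """R: (n_spheres, n_bins) response matrix; phi0: initial guess spectrum."""
    phi = phi0.copy()
    for _ in range(iters):
        predicted = R @ phi                       # expected sphere count rates
        w = (R * phi) / predicted[:, None]        # each bin's contribution per sphere
        log_corr = (w * np.log(counts / predicted)[:, None]).sum(axis=0)
        phi *= np.exp(log_corr / w.sum(axis=0))   # multiplicative bin update
    return phi

rng = np.random.default_rng(1)
R = rng.uniform(0.1, 1.0, (7, 12))                # 7 Bonner spheres, 12 energy bins
true_phi = rng.uniform(0.5, 2.0, 12)
phi = unfold(R, R @ true_phi, np.ones(12))
print(np.abs(R @ phi - R @ true_phi).max())       # the unfolded spectrum fits the counts
```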

  3. Computer codes for level 1 probabilistic safety assessment

    International Nuclear Information System (INIS)

    1990-06-01

    Probabilistic Safety Assessment (PSA) entails several laborious tasks suitable for computer code assistance. This guide identifies these tasks, presents guidelines for selecting and utilizing computer codes in the conduct of the PSA tasks and for the use of PSA results in safety management, and provides information on available codes suggested or applied in performing PSA in nuclear power plants. The guidance is intended for use by nuclear power plant system engineers, safety and operating personnel, and regulators. Large efforts are made today to provide PC-based software systems and PSA processed information in a way that enables their use as a safety management tool by the nuclear power plant overall management. Guidelines on the characteristics of software needed for management to prepare software that meets their specific needs are also provided. Most of these computer codes are also applicable to PSA of other industrial facilities. The scope of this document is limited to computer codes used for the treatment of internal events. It does not address other codes available mainly for the analysis of external events (e.g. seismic analysis, flood and fire analysis). Codes discussed in the document are those used for probabilistic rather than for phenomenological modelling. It should also be appreciated that these guidelines are not intended to lead the user to the selection of one specific code; they simply provide criteria for the selection. Refs and tabs

  4. Code-Mixing and Code-Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as the factors influencing those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place in Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapters, code mixing and code switching in learning activities at Al Mawaddah Boarding School occur between Javanese, Arabic, English and Indonesian, in the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The deciding factors for code mixing in the learning process include: identification of the role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. The deciding factors for code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially in the Al Mawaddah boarding school, regarding the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and the effectiveness of teaching and learning strategies in boarding schools.

  5. Monte Carlo transport model comparison with 1A GeV accelerated iron experiment: heavy-ion shielding evaluation of NASA space flight-crew foodstuff

    Science.gov (United States)

    Stephens, D. L. Jr; Townsend, L. W.; Miller, J.; Zeitlin, C.; Heilbronn, L.

    2002-01-01

    Deep-space manned flight as a reality depends on a viable solution to the radiation problem. Both acute and chronic radiation health threats are known to exist, with solar particle events as an example of the former and galactic cosmic rays (GCR) of the latter. In this experiment iron ions of 1A GeV are used to simulate GCR and to determine the secondary radiation field created as the GCR-like particles interact with a thick target. A NASA-prepared food pantry locker was subjected to the iron beam and the secondary fluence recorded. A modified version of the Monte Carlo heavy ion transport code developed by Zeitlin at LBNL is compared with the experimental fluence. The foodstuff is modeled as mixed nuts as defined by the 71st edition of the Chemical Rubber Company (CRC) Handbook of Physics and Chemistry. The results indicate a good agreement between the experimental data and the model. The agreement between model and experiment is determined using a linear fit to ordered pairs of data, with the intercept forced to zero. The fitted slope is 0.825 and the R2 value is 0.429 over the resolved fluence region. The removal of an outlier, Z=14, gives values of 0.888 and 0.705 for the slope and R2, respectively. © 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.
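
    The comparison metric is easy to state exactly: a least-squares line through the origin has slope b = sum(x*y)/sum(x*x). A worked illustration on invented (model, experiment) pairs, not the paper's data:

```python
import numpy as np

def zero_intercept_fit(x, y):
    """Least-squares slope with the intercept forced to zero, plus R^2."""
    b = np.sum(x * y) / np.sum(x * x)
    ss_res = np.sum((y - b * x) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return b, 1.0 - ss_res / ss_tot

model = np.array([1.0, 2.0, 3.5, 5.0, 8.0])   # hypothetical fluence pairs
exper = np.array([0.9, 1.6, 3.1, 3.9, 6.5])
print(zero_intercept_fit(model, exper))       # slope below 1: systematic offset
```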

  6. Analysis code for large rupture accidents in ATR. SENHOR/FLOOD/HEATUP

    International Nuclear Information System (INIS)

    1997-08-01

    In the evaluation of the events classified as large-rupture loss-of-coolant accidents, which are safety evaluation events of the ATR, the analysis codes SENHOR, FLOOD and HEATUP are used to evaluate, respectively, the thermo-hydraulic transient behavior at large ruptures, the core reflooding characteristics, and the transient fuel temperatures. The analysis code system for loss-of-coolant accidents comprises, in addition to these three codes, the analysis code LOTRAC for thermo-hydraulic transients at the time of medium and small ruptures. Based on the changes with time of reactor thermal output and steam drum pressure obtained by SENHOR, the average reflooding rate is analyzed by FLOOD, and the time of the turnaround of the fuel cladding tube temperature and the heat transfer rate after the turnaround are determined. Based on these data, the detailed temperature change of the fuel elements is analyzed by HEATUP, and the highest temperature and the amount of oxidation of the fuel cladding tubes are determined. The SENHOR, FLOOD and HEATUP codes and the various models for these codes are explained. Examples of evaluation and the sensitivity analysis of the ATR plant are reported in the Appendix. (K.I.)

  7. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Ilow Jacek

    2010-01-01

    Full Text Available This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of a single erroneous packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of information packets to construct redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.
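
    The encoding side can be sketched at toy scale: redundant packet i is the XOR of the information packets, with packet j right-shifted by i*j bits, i.e. a Vandermonde pattern in the shift operator S. The padding and parameters below are assumptions for illustration:

```python
import numpy as np

def make_redundant(packets, n_redundant, pad_bits=32):
    """packets: equal-length uint8 arrays; returns redundant packets as bytes."""
    width = packets[0].size * 8 + pad_bits    # room for the shifted-out bits
    as_ints = [int.from_bytes(p.tobytes(), "big") << pad_bits for p in packets]
    out = []
    for i in range(n_redundant):
        r = 0
        for j, pj in enumerate(as_ints):
            r ^= pj >> (i * j)                # apply S^(i*j) to packet j, then XOR
        out.append(r.to_bytes((width + 7) // 8, "big"))
    return out

data = [np.frombuffer(b"packet-one!!", dtype=np.uint8),
        np.frombuffer(b"packet-two!!", dtype=np.uint8)]
print([r.hex() for r in make_redundant(data, 2)])
```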

  8. Particle-in-Cell Codes for plasma-based particle acceleration

    CERN Document Server

    Pukhov, Alexander

    2016-01-01

    Basic principles of particle-in-cell (PIC) codes, with the main application to plasma-based acceleration, are discussed. The ab initio full electromagnetic relativistic PIC codes provide the most reliable description of plasmas. Their properties are considered in detail. Representing the most fundamental model, the full PIC codes are computationally expensive. Plasma-based acceleration is a multi-scale problem with very disparate scales. The smallest scale is the laser or plasma wavelength (from one to hundred microns) and the largest scale is the acceleration distance (from a few centimeters to meters or even kilometers). The Lorentz-boost technique allows one to reduce the scale disparity at the cost of complicating the simulations and causing unphysical numerical instabilities in the code. Another possibility is to use the quasi-static approximation, where the disparate scales are separated analytically.
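
    The PIC cycle itself is compact enough to sketch: deposit charge on the grid, solve for the field, gather it back to the particles, and push. A minimal 1D electrostatic loop in normalized units follows; it is illustrative only, far from a production relativistic electromagnetic code:

```python
import numpy as np

ng, L, npart, dt = 64, 2 * np.pi, 10000, 0.1
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, npart)                      # electron positions
v = rng.normal(0, 0.1, npart) + 0.05 * np.sin(x)  # small velocity perturbation

for step in range(100):
    # 1) charge density on the grid: fixed ion background (+1) minus electrons
    ne = np.bincount((x / dx).astype(int) % ng, minlength=ng) / (npart / ng)
    rho = 1.0 - ne
    # 2) field solve: dE/dx = rho via FFT on the periodic domain
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    kk = k.copy(); kk[0] = 1.0                    # avoid division by zero at k=0
    E_k = np.fft.fft(rho) / (1j * kk)
    E_k[0] = 0.0
    E = np.real(np.fft.ifft(E_k))
    # 3) gather field at particle positions, 4) push electrons (q/m = -1)
    v -= E[(x / dx).astype(int) % ng] * dt
    x = (x + v * dt) % L

print("field energy:", 0.5 * np.sum(E ** 2) * dx)
```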

  9. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered. One of them is composed of a homogeneous water-based medium, the second is composed of bone, the third is composed of lung and the fourth is composed of a heterogeneous bone and vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first of them forces the photon to stop at every voxel frontier; the second one is the Woodcock method, where stopping at the frontier is considered depending on whether the material changes across the photon travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using a NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)

  10. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    International Nuclear Information System (INIS)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H.

    2014-08-01

As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based Monte Carlo photon transport algorithm for dose calculation under the Compute Unified Device Architecture (CUDA) platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross-section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered. One of them is composed of a homogeneous water-based medium, the second is composed of bone, the third is composed of lung, and the fourth is a heterogeneous bone-and-vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which a stop at a voxel boundary is only considered when the material changes along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)

  11. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and basis for the structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to classify and categorize the information so that operating experience could be fed back to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events; the list was based on the Human Performance Investigation Process (HPIP) that had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  12. Web- and system-code based, interactive, nuclear power plant simulators

    International Nuclear Information System (INIS)

    Kim, K. D.; Jain, P.; Rizwan, U.

    2006-01-01

Using two different approaches, on-line, web- and system-code based graphical user interfaces have been developed for reactor system analysis. Both are LabVIEW (a graphical programming language developed by National Instruments) based systems that allow local users, as well as those at remote sites, to run, interact with, and view the results of the system code in a web browser. In the first approach, only the data written by the system code in a tab-separated ASCII output file are accessed and displayed graphically. In the second approach, LabVIEW virtual instruments are coupled with the system code as dynamic link libraries (DLL). RELAP5 is used as the system code to demonstrate the capabilities of these approaches. From collaborative projects between teams in geographically remote locations to providing system-code experience to distance-education students, these tools can be very beneficial in many areas of teaching and R and D. (authors)

  13. Unfolding code for neutron spectrometry based on neural nets technology

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use, friendly, and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as entrance data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)
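At run time, an unfolding of this kind reduces to a single forward pass through the trained net. The sketch below assumes a one-hidden-layer feed-forward architecture and uses random stand-in weights purely to show the shapes involved (7 count rates in, 60 spectrum bins out); a real code would load the weights produced by the training methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(count_rates, W1, b1, W2, b2):
    """Forward pass of a trained feed-forward net: 7 Bonner-sphere count
    rates in, a 60-bin spectrum estimate out."""
    hidden = np.tanh(W1 @ count_rates + b1)
    return W2 @ hidden + b2

# stand-in weights only (assumption): 7 -> 12 -> 60
W1, b1 = rng.normal(size=(12, 7)), np.zeros(12)
W2, b2 = rng.normal(size=(60, 12)), np.zeros(60)
spectrum = unfold(np.array([1.2, 3.4, 5.1, 4.8, 3.3, 2.0, 0.9]), W1, b1, W2, b2)
print(spectrum.shape)   # (60,)
```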

  14. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) to be used on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for carrying out hardware-in-the-loop testing of components for a satellite in a convenient manner, with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, the process can be called a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  15. Simulation of thermal-neutron-induced single-event upset using particle and heavy-ion transport code system

    International Nuclear Information System (INIS)

    Arita, Yutaka; Kihara, Yuji; Mitsuhasi, Junichi; Niita, Koji; Takai, Mikio; Ogawa, Izumi; Kishimoto, Tadafumi; Yoshihara, Tsutomu

    2007-01-01

The simulation of a thermal-neutron-induced single-event upset (SEU) was performed on a 0.4-μm-design-rule 4 Mbit static random access memory (SRAM) using the Particle and Heavy-Ion Transport code System (PHITS). The SEU rates obtained by the simulation were in very good agreement with the results of experiments. PHITS is a useful tool for simulating SEUs in semiconductor devices. To further improve the accuracy of the simulation, additional methods for tallying the energy deposition are required in PHITS. (author)

  16. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample...... the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine...... problems on accurate air/fuel ratio control of a spark ignition (SI) engine....

  17. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    Science.gov (United States)

    Deng, Xiaopeng

    2015-08-01

A novel optical image encryption method based on real-valued coding and subtraction is proposed with the help of a quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, and then the QR code is encoded into two phase-only masks (POMs) using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone-collected results show that the method is feasible and has strong tolerance to noise, phase difference, and the ratio between the intensities of the two decryption light beams.
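A toy version of the phase-only-mask idea (not the paper's exact scheme) can be written by decomposing each normalized pixel value a in [0, 2] as exp(i*t1) + exp(i*t2) = a*exp(i*phi) with a random carrier phase phi; the magnitude of the sum of the two masks then recovers the pixel regardless of phi.

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(img):
    """Toy two-POM decomposition (illustrative, not the paper's scheme)."""
    a = 2.0 * img / img.max()                  # normalize pixels to [0, 2]
    alpha = np.arccos(np.clip(a / 2.0, -1, 1))
    phi = rng.uniform(0, 2 * np.pi, img.shape) # random carrier phase per pixel
    return phi + alpha, phi - alpha            # two phase-only masks

def decode(t1, t2):
    # |exp(i*t1) + exp(i*t2)| = 2*cos((t1 - t2)/2) recovers the amplitude
    return np.abs(np.exp(1j * t1) + np.exp(1j * t2))

img = rng.integers(0, 256, (8, 8)).astype(float)
t1, t2 = encode(img)
print(np.allclose(decode(t1, t2), 2.0 * img / img.max()))   # True
```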

  18. Central Decoding for Multiple Description Codes based on Domain Partitioning

    Directory of Open Access Journals (Sweden)

    M. Spiertz

    2006-01-01

Full Text Available Multiple Description Codes (MDC) can be used to trade redundancy against packet-loss resistance for transmitting data over lossy diversity networks. In this work we focus on MD transform coding based on domain partitioning. Compared to Vaishampayan's quantizer-based MDC, domain-based MD coding is a simple approach for generating different descriptions, using different quantizers for each description. Commonly, only the highest-rate quantizer is used for reconstruction. In this paper we investigate the benefit of using the lower-rate quantizers to enhance the reconstruction quality at the decoder side. The comparison is done on artificial source data and on image data.
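One simple way to let the lower-rate quantizer contribute at the central decoder is cell intersection: each received index pins the source value to a quantization cell, and the central reconstruction is the midpoint of the intersection of the two cells. A minimal sketch with hypothetical step sizes (non-negative inputs assumed):

```python
def cell(index, step):
    """Return the [lo, hi) bounds of a uniform quantizer cell."""
    lo = index * step
    return lo, lo + step

def central_reconstruct(i1, step1, i2, step2):
    """Intersect the two quantizer cells and return the midpoint.

    The intersection is never larger than the finer cell, so the coarse
    description can only tighten the central estimate."""
    lo1, hi1 = cell(i1, step1)
    lo2, hi2 = cell(i2, step2)
    lo, hi = max(lo1, lo2), min(hi1, hi2)
    return 0.5 * (lo + hi)

x = 0.73
i1, i2 = int(x / 0.10), int(x / 0.17)   # two descriptions at different rates
print(central_reconstruct(i1, 0.10, i2, 0.17))
```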

  19. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Jacek Ilow

    2010-01-01

    Full Text Available This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of an erroneous single packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of k information packets to construct r redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of k information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of n=k+r received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.

  20. Development and improvement of safety analysis code for geological disposal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

In order to confirm long-term safety concerning geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the possibility of each event and its influence on the engineered barrier and the natural barrier, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID which should be taken into consideration in the safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in analysis, and assuring the quality of the analysis environment and analysis work for safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions which should be implemented based on those requirements. (author)

  1. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

Full Text Available Hybrid video coding combines two stages: first, motion estimation and compensation predict each frame from the neighboring frames; then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the H.264/AVC standard for motion estimation and replaces the transform in the spatial domain. The prediction error is thus coded using the matching pursuit algorithm, which decomposes the signal over a purpose-designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
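Matching pursuit itself is a short greedy loop: at each step, pick the dictionary atom most correlated with the current residual and subtract its contribution. The sketch below uses a random 1D toy dictionary rather than the paper's anisotropic 2D atoms:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter):
    """Greedy MP over a redundant dictionary of unit-norm atoms.

    dictionary: (n_atoms, n_samples) array; returns indices, coefficients,
    and the final residual."""
    residual = signal.astype(float).copy()
    idx, coef = [], []
    for _ in range(n_iter):
        corr = dictionary @ residual
        k = int(np.argmax(np.abs(corr)))   # best-matching atom
        idx.append(k)
        coef.append(corr[k])
        residual -= corr[k] * dictionary[k]
    return idx, coef, residual

# toy redundant dictionary: 32 random unit-norm atoms in R^16
rng = np.random.default_rng(7)
D = rng.normal(size=(32, 16))
D /= np.linalg.norm(D, axis=1, keepdims=True)
x = 3.0 * D[5] - 1.5 * D[20]
print(matching_pursuit(x, D, n_iter=4)[0])   # atom 5 is picked first
```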

  2. Unfolding code for neutron spectrometry based on neural nets technology

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Apdo. Postal 336, 98000 Zacatecas (Mexico)

    2012-10-15

The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use, friendly, and intuitive to the user. It was designed for a Bonner Sphere System based on a {sup 6}LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as entrance data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)

  3. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. Analytic and empirical results indicate that, in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform the convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a low number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with the blocklength for a fixed number of states in the trellis.

  4. Sequence Coding and Search System Backfit Quality Assurance Program Plan

    International Nuclear Information System (INIS)

    Lovell, C.J.; Stepina, P.L.

    1985-03-01

    The Sequence Coding and Search System is a computer-based encoding system for events described in Licensee Event Reports. This data system contains LERs from 1981 to present. Backfit of the data system to include LERs prior to 1981 is required. This report documents the Quality Assurance Program Plan that EG and G Idaho, Inc. will follow while encoding 1980 LERs

  5. PERMUTATION-BASED POLYMORPHIC STEGO-WATERMARKS FOR PROGRAM CODES

    Directory of Open Access Journals (Sweden)

    Denys Samoilenko

    2016-06-01

Full Text Available Purpose: One of the most topical trends in program code protection is code marking. The problem consists in the creation of digital "watermarks" which allow distinguishing different copies of the same program code. Such marks could be useful for authorship protection, for numbering code copies, for monitoring program propagation, and for information security purposes in client-server communication processes. Methods: We used the methods of digital steganography adapted for program codes as text objects. The same-shape-symbols method was transformed into a same-semantic-element method due to features of codes which make them different from ordinary texts. We use a dynamic principle of mark forming, making codes polymorphic. Results: We examined the combinatorial capacity of permutations possible in program codes. As a result it was shown that a set of 5-7 polymorphic variables is suitable for most modern network applications. Mark creation and restoration algorithms were proposed and discussed. The main algorithm is based on full and partial permutations of variable names and their declaration order. The algorithm for partial permutation enumeration was optimized for computational complexity. PHP code fragments which realize the algorithms are listed. Discussion: The method proposed in this work allows distinguishing each client-server connection. If a clone of some network resource is found, the method can give information about the included marks and thereby data on the IP, date and time, and authentication information of the client that copied the resource. Usage of polymorphic stego-watermarks should improve information security indexes in network communications.
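One concrete way to realize such permutation marks is to encode an integer identifier as a permutation of the variable declaration order via the factorial number system (Lehmer code), giving a capacity of log2(n!) bits for n reorderable names. A sketch with hypothetical PHP variable names:

```python
def id_to_permutation(mark_id, items):
    """Encode an integer watermark as a permutation of declaration order
    (Lehmer decoding): peel off factorial-base digits of mark_id."""
    items = list(items)
    perm = []
    for i in range(len(items), 0, -1):
        mark_id, pick = divmod(mark_id, i)   # next factorial-base digit
        perm.append(items.pop(pick))
    return perm

def permutation_to_id(perm, original):
    """Recover the watermark integer from the observed declaration order."""
    original = list(original)
    mark_id, base = 0, 1
    for i, item in enumerate(perm):
        pick = original.index(item)
        mark_id += pick * base
        original.pop(pick)
        base *= len(perm) - i
    return mark_id

names = ["$user", "$host", "$port", "$path", "$mode"]  # hypothetical variables
wm = id_to_permutation(42, names)
print(wm, permutation_to_id(wm, names))   # round-trips to 42
```

With 5 variables this gives 5! = 120 distinguishable copies, consistent with the paper's observation that 5-7 polymorphic variables suffice for most applications.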

  6. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  7. Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.

    Science.gov (United States)

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

Conventional predictive video coding-based approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computational complexity. As an alternative approach, perceptual video coding (PVC) has attempted to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JNDs were modeled by adding white Gaussian noise or specific signal patterns to the original images, which was not appropriate for finding JND thresholds due to distortion with energy reduction. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models that can be applied as preprocessing for perceptual video coding. The two JNQD models can automatically adjust JND levels based on given quantization step sizes. One of the two JNQD models, called LR-JNQD, is based on linear regression and determines the model parameters for JNQD from extracted handcrafted features. The other JNQD model, called CNN-JNQD, is based on a convolutional neural network (CNN). To the best of our knowledge, this paper is the first approach to automatically adjust JND levels according to quantization step sizes when preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with the input without preprocessing applied.

  8. Edge-preserving Intra Depth Coding based on Context-coding and H.264/AVC

    DEFF Research Database (Denmark)

    Zamarin, Marco; Salmistraro, Matteo; Forchhammer, Søren

    2013-01-01

    Depth map coding plays a crucial role in 3D Video communication systems based on the “Multi-view Video plus Depth” representation as view synthesis performance is strongly affected by the accuracy of depth information, especially at edges in the depth map image. In this paper an efficient algorithm...... for edge-preserving intra depth compression based on H.264/AVC is presented. The proposed method introduces a new Intra mode specifically targeted to depth macroblocks with arbitrarily shaped edges, which are typically not efficiently represented by DCT. Edge macroblocks are partitioned into two regions...... each approximated by a flat surface. Edge information is encoded by means of contextcoding with an adaptive template. As a novel element, the proposed method allows exploiting the edge structure of previously encoded edge macroblocks during the context-coding step to further increase compression...

  9. The Monte Carlo event generator DPMJET-III

    International Nuclear Information System (INIS)

    Roesler, S.; Engel, R.

    2001-01-01

    A new version of the Monte Carlo event generator DPMJET is presented. It is a code system based on the Dual Parton Model and unifies all features of the DTUNUC-2, DPMJET-II and PHOJET1.12 event generators. DPMJET-III allows the simulation of hadron-hadron, hadron-nucleus, nucleus-nucleus, photon-hadron, photon-photon and photon-nucleus interactions from a few GeV up to the highest cosmic ray energies. (orig.)

  10. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects is becoming a more and more important issue. In this paper, a public Hamming-code-based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least-significant-bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming-code-based watermark can be verified by using the Hamming code check, without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. The results also show that the proposed method can improve security and achieve low distortion of the stego object.
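A toy version of the pipeline (not the paper's exact scheme) pairs a Hamming(7,4) encoder with LSB substitution: four watermark bits become seven code bits placed in the least significant bits of quantized vertex coordinates, and verification is a syndrome check that needs no side information. The coordinate values below are made up.

```python
import numpy as np

G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])          # Hamming(7,4) generator matrix
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])          # matching parity-check matrix

def embed(vertices, nibble):
    """Encode 4 watermark bits into 7 code bits and write them into the
    LSBs of 7 quantized coordinates (toy LSB substitution)."""
    code = (nibble @ G) % 2
    v = vertices.copy()
    v[:7] = (v[:7] & ~1) | code           # overwrite least significant bits
    return v

def verify(vertices):
    syndrome = (H @ (vertices[:7] & 1)) % 2
    return not syndrome.any()             # zero syndrome -> watermark intact

verts = np.array([103, 250, 77, 14, 92, 31, 160, 55])   # quantized coordinates
marked = embed(verts, np.array([1, 0, 1, 1]))
print(verify(marked), verify(marked ^ np.array([1,0,0,0,0,0,0,0])))  # True False
```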

  11. The structure of affective action representations: temporal binding of affective response codes.

    Science.gov (United States)

    Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard

    2012-01-01

    Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.

  12. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  13. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes. Published by the

  14. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

In this paper, we analyze the characteristics of pseudo-random codes, taking the m-sequence as a case study. Drawing on coding theory, we introduce the jamming methods and simulate the interference effect and probability model in MATLAB. Based on the length of decoding time the adversary spends, we find the optimal formula and optimal coefficients using machine learning, and thereby obtain a new optimal interference code. First, in the recognition phase, this study judges the effect of interference by simulating the decoding time of the laser seeker. Then, we use laser active deception jamming to simulate the interference process in the tracking phase. In order to improve the performance of the interference, this paper simulates the model in MATLAB. We find the least number of pulse intervals that must be received, from which we can determine the precise interval number of the laser pointer for m-sequence encoding. In order to find the shortest spacing, we use the greatest-common-divisor method. Then, combining this with the coding regularity found before, we restore the pulse intervals of the pseudo-random code that has already been received. Finally, we can control the time period of the laser interference, obtain the optimal interference code, and also increase the probability of interference.
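The m-sequence at the center of this analysis is generated by a maximal-length linear feedback shift register, and the greatest-common-divisor step for recovering the basic pulse spacing is one line. A sketch, with hypothetical observed intervals:

```python
from math import gcd
from functools import reduce

def m_sequence(taps, state, length):
    """Fibonacci LFSR output; taps give the feedback polynomial,
    e.g. [5, 3] for x^5 + x^3 + 1 (primitive, so the period is 2^5 - 1)."""
    n = max(taps)
    reg = [(state >> i) & 1 for i in range(n)]
    out = []
    for _ in range(length):
        out.append(reg[0])
        fb = reduce(lambda a, b: a ^ b, (reg[t - 1] for t in taps))
        reg = reg[1:] + [fb]
    return out

seq = m_sequence([5, 3], state=0b00001, length=62)
print(seq[:31] == seq[31:62])        # True: period 31, as expected

# recovering the basic pulse spacing from observed intervals via the GCD,
# as the abstract's method suggests (intervals are hypothetical):
intervals = [6, 9, 15, 21]
print(reduce(gcd, intervals))        # -> 3
```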

  15. Application of the ruthenium and technetium thermodynamic data bases used in the EQ3/6 geochemical codes

    Energy Technology Data Exchange (ETDEWEB)

    Isherwood, D.

    1985-04-01

    Based on a critical review of the available thermodynamic data, computerized data bases for technetium and ruthenium were created for use with the EQ3/6 geochemical computer codes. The technetium data base contains thermodynamic data for 8 aqueous species and 15 solids; 26 aqueous species and 9 solids were included in the ruthenium data base. The EQ3NR code was used to calculate solubility limits for ruthenium (8 x 10{sup -16} M) in ground water from Yucca Mountain, a potential nuclear waste repository site near the Nevada Test Site (NTS). The code confirmed the essentially unlimited solubility of technetium in oxidizing conditions, such as those that are believed to exist in the unsaturated zone at Yucca Mountain and the Cambric Nuclear event site at the NTS. Ruthenium migration observed from the Cambric site was evaluated. The solubility limit for ruthenium (as the aqueous species RuO{sub 4}{sup -}) when constrained by RuO{sub 2} is approximately equal to the concentration of ruthenium found in the cavity ground water (i.e., 2.1 x 10{sup -11} vs 4.5 x 10{sup -11} M). Differences in ruthenium solubility limits between Yucca Mountain and Cambric are primarily due to differences in ground-water pH. Technetium solubility (3 x 10{sup -8} M) for moderately reducing conditions (Eh = -0.1 V) using the metastable oxide, TcO{sub 2}.2H{sub 2}O, as the solubility constraint is within the range of experimental values recently published in a study of technetium sorption on basalt. Previously published technetium solubilities of 10{sup -12} to 10{sup -16} M were apparently based on a technetium data base that did not include aqueous species other than TcO{sub 4}{sup -}. When TcO(OH){sub 2}{sup 0} is included in the data base, the calculated values are much closer to the experimental results. Eh-pH diagrams were also generated for a variety of conditions using the SOLUPLOT code.

  16. Application of the ruthenium and technetium thermodynamic data bases used in the EQ3/6 geochemical codes

    International Nuclear Information System (INIS)

    Isherwood, D.

    1985-04-01

Based on a critical review of the available thermodynamic data, computerized data bases for technetium and ruthenium were created for use with the EQ3/6 geochemical computer codes. The technetium data base contains thermodynamic data for 8 aqueous species and 15 solids; 26 aqueous species and 9 solids were included in the ruthenium data base. The EQ3NR code was used to calculate solubility limits for ruthenium (8 x 10^-16 M) in ground water from Yucca Mountain, a potential nuclear waste repository site near the Nevada Test Site (NTS). The code confirmed the essentially unlimited solubility of technetium in oxidizing conditions, such as those that are believed to exist in the unsaturated zone at Yucca Mountain and the Cambric Nuclear event site at the NTS. Ruthenium migration observed from the Cambric site was evaluated. The solubility limit for ruthenium (as the aqueous species RuO4^-) when constrained by RuO2 is approximately equal to the concentration of ruthenium found in the cavity ground water (i.e., 2.1 x 10^-11 vs 4.5 x 10^-11 M). Differences in ruthenium solubility limits between Yucca Mountain and Cambric are primarily due to differences in ground-water pH. Technetium solubility (3 x 10^-8 M) for moderately reducing conditions (Eh = -0.1 V) using the metastable oxide, TcO2·2H2O, as the solubility constraint is within the range of experimental values recently published in a study of technetium sorption on basalt. Previously published technetium solubilities of 10^-12 to 10^-16 M were apparently based on a technetium data base that did not include aqueous species other than TcO4^-. When TcO(OH)2^0 is included in the data base, the calculated values are much closer to the experimental results. Eh-pH diagrams were also generated for a variety of conditions using the SOLUPLOT code

  17. Construction of Quasi-Cyclic LDPC Codes Based on Fundamental Theorem of Arithmetic

    Directory of Open Access Journals (Sweden)

    Hai Zhu

    2018-01-01

Full Text Available Quasi-cyclic (QC) LDPC codes play an important role in 5G communications and have been chosen as the standard codes for the 5G enhanced mobile broadband (eMBB) data channel. In this paper, we study the construction of QC LDPC codes based on an arbitrary given expansion factor (or lifting degree). First, we analyze the cycle structure of QC LDPC codes and give the necessary and sufficient condition for the existence of short cycles. Based on the fundamental theorem of arithmetic in number theory, we divide the integer factorization into three cases and present three classes of QC LDPC codes accordingly. Furthermore, a general construction method of QC LDPC codes with girth of at least 6 is proposed. Numerical results show that the constructed QC LDPC codes perform well over the AWGN channel when decoded with iterative algorithms.
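For QC LDPC codes built from circulant permutation blocks, the short-cycle condition has a well-known algebraic form: a length-4 cycle exists exactly when the alternating sum of shift exponents around a 2x2 block pattern vanishes modulo the lifting degree Z. The sketch below screens an exponent matrix for girth of at least 6; the example matrix is hypothetical.

```python
from itertools import combinations

def has_4_cycle(E, Z):
    """Check a QC-LDPC exponent matrix E (circulant shifts; None marks a
    zero block) for length-4 cycles under lifting degree Z.

    A 4-cycle exists iff
      E[i1][j1] - E[i1][j2] + E[i2][j2] - E[i2][j1] == 0 (mod Z)
    for some row pair (i1, i2) and column pair (j1, j2)."""
    rows, cols = len(E), len(E[0])
    for i1, i2 in combinations(range(rows), 2):
        for j1, j2 in combinations(range(cols), 2):
            e = (E[i1][j1], E[i1][j2], E[i2][j2], E[i2][j1])
            if None in e:
                continue
            if (e[0] - e[1] + e[2] - e[3]) % Z == 0:
                return True
    return False

E = [[0, 0, 0],
     [0, 1, 3]]              # shift exponents of circulant permutation blocks
print(has_4_cycle(E, Z=7))   # False -> girth at least 6
```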

  18. Electromagnetic Dissociation and Spacecraft Electronics Damage

    Science.gov (United States)

    Norbury, John W.

    2016-01-01

    When protons or heavy ions from galactic cosmic rays (GCR) or solar particle events (SPE) interact with target nuclei in spacecraft, there can be two different types of interactions. The more familiar strong nuclear interaction often dominates and is responsible for nuclear fragmentation in either the GCR or SPE projectile nucleus or the spacecraft target nucleus. (Of course, the proton does not break up, except possibly to produce pions or other hadrons.) The less familiar, second type of interaction is due to the very strong electromagnetic fields that exist when two charged nuclei pass very close to each other. This process is called electromagnetic dissociation (EMD) and primarily results in the emission of neutrons, protons and light ions (isotopes of hydrogen and helium). The cross section for particle production is approximately defined as the number of particles produced in nucleus-nucleus collisions or other types of reactions. (There are various kinematic and other factors which multiply the particle number to arrive at the cross section.) Strong, nuclear interactions usually dominate the nuclear reactions of most interest that occur between GCR and target nuclei. However, for heavy nuclei (near Fe and beyond) at high energy the EMD cross section can be much larger than the strong nuclear interaction cross section. This paper poses a question: Are there projectile or target nuclei combinations in the interaction of GCR or SPE where the EMD reaction cross section plays a dominant role? If the answer is affirmative, then EMD mechanisms should be an integral part of codes that are used to predict damage to spacecraft electronics. The question can become more fine-tuned and one can ask about total reaction cross sections as compared to double differential cross sections. These issues will be addressed in the present paper.

  19. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  20. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  1. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  2. A neutron spectrum unfolding computer code based on artificial neural networks

    International Nuclear Information System (INIS)

    Ortiz-Rodríguez, J.M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J.M.; Vega-Carrillo, H.R.

    2014-01-01

The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to sequentially irradiate the spheres, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This code, called the Neutron Spectrometry and Dosimetry with Artificial Neural Networks unfolding code, was designed with a graphical interface. The core of the code is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology. The code is easy to use, friendly, and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as entrance data for unfolding the neutron spectrum, only seven count rates measured with seven Bonner spheres are required; simultaneously the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes. This code generates a full report with all information of the unfolding

  3. Analysis of MSGTR events for APR1400 by means of best estimate thermal-hydraulic system code

    International Nuclear Information System (INIS)

    Jeong, Ji Hwan; Kim, Sang Jae; Chang, Keun Sun; Lee, Jae Hun

    2001-01-01

A multiple steam generator tube rupture (MSGTR) event has never occurred in the history of commercial nuclear reactor operation, while a single steam generator tube rupture (SGTR) event is reported to occur every two years. As there is no history of MSGTR events, the transients and consequences of such an event are not well understood. In this study, a postulated MSGTR event in the Advanced Power Reactor 1400 (APR1400) is analyzed using a thermal-hydraulic system code. The APR1400 is a two-loop, 1400 MWe PWR expected to be built in 2009. MARS1.4 is used in this study. The present study aims to understand the effects of the rupture location in the heat transfer tubes and of the selection of the affected steam generator following an MSGTR event. The effects of five tube rupture locations are compared with each other. The comparison shows that the response of the APR1400 allows the shortest time for operator action following a tube rupture in the vicinity of the hot-leg side tube sheet, and the longest time following a tube rupture at the tube top. The MSSV lift time for rupture at the tube top is evaluated as 24.5% larger than that for rupture at the hot-leg side tube sheet. Also, the MSSV lift times for four cases are compared in order to examine how much operator response time is allowed depending on which steam generator is affected. The comparison shows that the cases in which both steam generators are affected allow a longer time for operator action than the cases in which a single steam generator is affected. Furthermore, tube ruptures in the steam generator to which the pressurizer is connected lead to the shortest operator response time.

  4. Safety studies of plasma-wall events with AINA code for Japanese DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Rivas, J.C., E-mail: jose.carlos.rivas@upc.edu [International Fusion Energy Research Centre (IFERC) (Japan); Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia-BarcelonaTech (Spain); Nakamura, M.; Someya, Y.; Hoshino, K.; Asakura, N. [Japan Atomic Energy Agency (JAEA) (Japan); Takase, H. [International Fusion Energy Research Centre (IFERC) (Japan); Miyoshi, Y.; Utoh, H.; Tobita, K. [Japan Atomic Energy Agency (JAEA) (Japan); Dies, J.; Blas, A. de; Riego, A.; Fabbri, M. [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia-BarcelonaTech (Spain)

    2016-11-01

Highlights: • Work done on the AINA code during 2014 and 2015 at IFERC to develop a version for safety studies of a Japanese DEMO design. • A thermal model for a WCPB breeding blanket has been developed, based on parametric input data from neutronics calculations. • A breakthrough for the safety studies of plasma-divertor transients: an integrated SOL-pedestal model, using the melting time as the objective variable, and using an optimization algorithm. • The results for the case of the divertor show that both loss-of-plasma-control (LOPC) transients and the ex-vessel LOCA transient can induce severe melting. The difference is that while in the first case melting happens at the PFC surface, in the second case it happens at the copper heat sink. • The conclusions suggest that, because the minimum melting times are of the same order of magnitude as the energy confinement time, the recovery time for the plasma control system should be of a lower order. - Abstract: In this contribution, the work done on the AINA code during 2014 and 2015 at IFERC is presented. The main motivation of this work was to adapt the code and to perform safety studies for a Japanese DEMO design. For the AINA code, the work has entailed major changes in the plasma models. A significant addition is an integrated SOL-pedestal model that allows the estimation of heat loads at the divertor. Also, a thermal model for a WCPB (water cooled pebble bed) breeding blanket has been developed, based on parametric input data from neutronics calculations. For the safety studies, a major breakthrough in the study of LOPC (loss of plasma control) transients has been the use of an optimization method to determine the most severe transients in terms of the shortest melting times. The results of the safety study show that LOPC transients are not likely to be severe for the breeding blanket, but for the case of the divertor they can induce severe melting. The ex-vessel LOCA (loss of coolant accident) is severe for both blanket and divertor, but in the first case

  5. Safety studies of plasma-wall events with AINA code for Japanese DEMO

    International Nuclear Information System (INIS)

    Rivas, J.C.; Nakamura, M.; Someya, Y.; Hoshino, K.; Asakura, N.; Takase, H.; Miyoshi, Y.; Utoh, H.; Tobita, K.; Dies, J.; Blas, A. de; Riego, A.; Fabbri, M.

    2016-01-01

Highlights: • Work done on the AINA code during 2014 and 2015 at IFERC to develop a version for safety studies of a Japanese DEMO design. • A thermal model for a WCPB breeding blanket has been developed, based on parametric input data from neutronics calculations. • A breakthrough for the safety studies of plasma-divertor transients: an integrated SOL-pedestal model, using the melting time as the objective variable, and using an optimization algorithm. • The results for the case of the divertor show that both loss-of-plasma-control (LOPC) transients and the ex-vessel LOCA transient can induce severe melting. The difference is that while in the first case melting happens at the PFC surface, in the second case it happens at the copper heat sink. • The conclusions suggest that, because the minimum melting times are of the same order of magnitude as the energy confinement time, the recovery time for the plasma control system should be of a lower order. - Abstract: In this contribution, the work done on the AINA code during 2014 and 2015 at IFERC is presented. The main motivation of this work was to adapt the code and to perform safety studies for a Japanese DEMO design. For the AINA code, the work has entailed major changes in the plasma models. A significant addition is an integrated SOL-pedestal model that allows the estimation of heat loads at the divertor. Also, a thermal model for a WCPB (water cooled pebble bed) breeding blanket has been developed, based on parametric input data from neutronics calculations. For the safety studies, a major breakthrough in the study of LOPC (loss of plasma control) transients has been the use of an optimization method to determine the most severe transients in terms of the shortest melting times. The results of the safety study show that LOPC transients are not likely to be severe for the breeding blanket, but for the case of the divertor they can induce severe melting. The ex-vessel LOCA (loss of coolant accident) is severe for both blanket and divertor, but in the first case

  6. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the input in a streaming fashion with minimal memory consumption. This paper discusses challenges in creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
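For readers unfamiliar with the SAX style, the sketch below shows the event-based pattern in miniature using Python's standard xml.sax module: callbacks fire as the parser streams through the document, and no tree is ever built.

```python
import xml.sax

class CountHandler(xml.sax.ContentHandler):
    """Minimal event-based processing: count elements and collect text
    without ever materializing a DOM tree."""
    def __init__(self):
        super().__init__()
        self.elements = 0
        self.text = []

    def startElement(self, name, attrs):   # fired on each opening tag
        self.elements += 1

    def characters(self, content):         # fired on text content
        self.text.append(content)

handler = CountHandler()
xml.sax.parseString(b"<doc><a>hi</a><b/></doc>", handler)
print(handler.elements, "".join(handler.text))   # 3 hi
```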

  7. 3D Scan-Based Wavelet Transform and Quality Control for Video Coding

    Directory of Open Access Journals (Sweden)

    Parisot Christophe

    2003-01-01

Full Text Available Wavelet coding has been shown to achieve better compression than DCT coding and moreover allows scalability. The 2D DWT can be easily extended to 3D and thus applied to video coding. However, 3D subband coding of video suffers from two drawbacks. The first is the amount of memory required for coding large 3D blocks; the second is the lack of temporal quality due to the temporal splitting of the sequence. In fact, 3D block-based video coders produce jerks, which appear at the temporal borders of blocks during video playback. In this paper, we propose a new temporal scan-based wavelet transform method for video coding that combines the advantages of wavelet coding (performance, scalability) with acceptably reduced memory requirements, no additional CPU complexity, and no jerks. We also propose an efficient quality allocation procedure to ensure a constant quality over time.

  8. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged in verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, which might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  9. Event simulation for the WA80 experiment

    International Nuclear Information System (INIS)

    Sorensen, S.P.

    1986-01-01

    The HIJET and LUND event generators are compared. It is concluded that for detector construction and design of experimental setups, the differences between the two models are marginal. The coverage of the WA80 setup in pseudorapidity and energy is demonstrated. The performance of some of the WA80 detectors (zero-degree calorimeter, wall calorimeter, multiplicity array, and SAPHIR lead-glass detector) is evaluated based on calculations with the LUND or the HIJET codes combined with codes simulating the detector responses. 9 refs., 3 figs

  10. Efficient Coding of Information: Huffman Coding

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...
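The mapping the fragment alludes to is the classic Huffman construction: repeatedly merge the two least probable subtrees, so that frequent symbols receive short codewords. A minimal sketch with made-up symbol frequencies:

```python
import heapq

def huffman_code(freqs):
    """Build a prefix code from symbol frequencies (classic Huffman)."""
    # heap entries: [weight, tiebreaker, {symbol: partial codeword}]
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)    # two least probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, [w1 + w2, counter, merged])
        counter += 1
    return heap[0][2]

print(huffman_code({"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.15}))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}:
# frequent symbols get short codewords, and no codeword prefixes another
```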

  11. Warped Discrete Cosine Transform-Based Low Bit-Rate Block Coding Using Image Downsampling

    Directory of Open Access Journals (Sweden)

    Ertürk Sarp

    2007-01-01

Full Text Available This paper presents warped discrete cosine transform (WDCT)-based low bit-rate block coding using image downsampling. While the WDCT aims to improve the performance of the conventional DCT by frequency warping, it has only been applicable to high bit-rate coding applications because of the overhead required to define the parameters of the warping filter. Recently, low bit-rate block coding based on image downsampling prior to block coding, followed by upsampling after the decoding process, has been proposed to improve the compression performance of low bit-rate block coders. This paper demonstrates that a superior performance can be achieved if the WDCT is used in conjunction with image-downsampling-based block coding for low bit-rate applications.

  12. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2007-01-01

understand the factors that might be responsible for such a trend and to take corrective actions prior to escalation to a significant event. Reviewing and analyzing data based on trend codes identifies common problems, potential trends, and common contributors, and promotes a good trending program. For an effective trending program, adverse trends must be positively identified, and corrective actions that address the weaknesses that have been identified should be specified and implemented through the corrective action program. For that purpose an appropriate coding system, incorporated into the Corrective Action and Operating Experience Program, has been established at the Krsko NPP. Minor events and near misses are collected and analyzed in order to aggregate detected minor problems. The groups of codes developed include codes for direct causes and causal factors, processes and organizations, consequences, level of significance, etc. For easier trending and further analysis, different code combinations are utilized in the form of graphs. For example: organisation vs. causal factors (allows a particular department to trend human performance in its own organisation), direct cause vs. time (allows trending of equipment degradation), processes vs. organisation (allows trending of process degradation in a particular organisation), any code in question vs. time (for trend confirmation), etc. The purpose of this article is to present the coding system established at the Krsko Nuclear Power Plant and the variety of ways of trending using the system. The article deals with the codes established, the organization of the code system, trend code combinations, and the benefit of early recognition of adverse trends in low-level events. (author)

  13. Modeling Space Radiation with Bleomycin

    Data.gov (United States)

    National Aeronautics and Space Administration — Space radiation is a mixed field of solar particle events (proton) and particles of Galactic Cosmic Rays (GCR) with different energy levels. These radiation events...

  14. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. It covers convolutional, turbo, and low-density parity-check (LDPC) coding and polar codes in a unified framework, advanced research-related developments such as spatial coupling, and the algorithmic and implementation aspects of error control coding.

  15. Context based Coding of Quantized Alpha Planes for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2002-01-01

    In object-based video, each frame is a composition of objects that are coded separately. The composition is performed through the alpha plane that represents the transparency of the object. We present an alternative to MPEG-4 for coding of alpha planes that considers their specific properties. ... Comparisons in terms of rate and distortion are provided, showing that the proposed coding scheme for still alpha planes is better than the algorithms for I-frames used in MPEG-4.

  16. A neutron spectrum unfolding computer code based on artificial neural networks

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2014-02-01

    The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where the measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to irradiate the spheres sequentially, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural network technology is presented. This code, called the Neutron Spectrometry and Dosimetry with Artificial Neural Networks unfolding code, was designed with a graphical interface. The core of the code is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology. The code is easy to use, friendly and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data for unfolding the neutron spectrum, only seven count rates measured with seven Bonner spheres are required; simultaneously the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes. This code generates a full report with all information of the unfolding in
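
    The unfolding step can be sketched as a small feed-forward network that maps the seven count rates to a 60-bin spectrum. The Python sketch below uses scikit-learn with random placeholder training data; the real code trains on spectra consistent with the IAEA response matrix, and the architecture shown is not the published robust-design network.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Placeholder training set: rows of 7 sphere count rates -> 60-bin spectra.
        X = rng.random((200, 7))
        Y = rng.random((200, 60))

        net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000, random_state=0)
        net.fit(X, Y)

        measured_rates = rng.random((1, 7))     # seven Bonner-sphere readings
        spectrum = net.predict(measured_rates)  # unfolded 60-bin spectrum
        print(spectrum.shape)                    # (1, 60)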

  17. Lossy to lossless object-based coding of 3-D MRI data.

    Science.gov (United States)

    Menegaz, Gloria; Thiran, Jean-Philippe

    2002-01-01

    We propose a fully three-dimensional (3-D) object-based coding system exploiting the diagnostic relevance of the different regions of the volumetric data for rate allocation. The data are first decorrelated via a 3-D discrete wavelet transform. The implementation via the lifting steps scheme allows integer-to-integer mapping, enabling lossless coding, and facilitates the definition of the object-based inverse transform. The coding process assigns disjoint segments of the bitstream to the different objects, which can be independently accessed and reconstructed at any up-to-lossless quality. Two fully 3-D coding strategies are considered: embedded zerotree coding (EZW-3D) and multidimensional layered zero coding (MLZC), both generalized for region of interest (ROI)-based processing. In order to avoid artifacts along region boundaries, some extra coefficients must be encoded for each object. This gives rise to a bitstream overhead with respect to the case where the volume is encoded as a whole. The amount of such extra information depends on both the filter length and the decomposition depth. The system is characterized on a set of head magnetic resonance images. Results show that MLZC and EZW-3D have competitive performances. In particular, the best MLZC mode outperforms the other state-of-the-art techniques on one of the datasets for which results are available in the literature.

  18. Event-by-Event Simulation of Induced Fission

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, R; Randrup, J

    2007-12-13

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.
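
    The decay-chain sampling idea can be illustrated with a deliberately simplified Python toy: a binary split followed by sequential neutron evaporation from each fragment. All distributions and parameters below are invented for illustration and bear no quantitative relation to the code described.

        import numpy as np

        rng = np.random.default_rng(1)

        def fission_event(A=236, e_total=30.0, sep=8.0):
            """Toy event: binary split, then neutron evaporation from each fragment.
            sep stands in for a neutron separation energy (MeV); all numbers invented."""
            a_light = int(rng.normal(96, 6))             # light-fragment mass number
            frags = [a_light, A - a_light]
            energies = [e_total * f / A for f in frags]  # share excitation by mass
            neutrons = 0
            for i, e in enumerate(energies):
                while e > sep:                           # evaporate while energetically allowed
                    e -= sep + rng.exponential(1.0)      # separation + kinetic energy carried off
                    frags[i] -= 1
                    neutrons += 1
            return frags, neutrons

        events = [fission_event() for _ in range(100000)]
        print(np.mean([n for _, n in events]))  # average neutron multiplicity per event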

  19. Event-by-Event Simulation of Induced Fission

    Science.gov (United States)

    Vogt, Ramona; Randrup, Jørgen

    2008-04-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  20. Event-by-Event Simulation of Induced Fission

    International Nuclear Information System (INIS)

    Vogt, Ramona; Randrup, Joergen

    2008-01-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either de-excite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission pre-fragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  1. Event-by-Event Simulation of Induced Fission

    International Nuclear Information System (INIS)

    Vogt, R; Randrup, J

    2007-01-01

    We are developing a novel code that treats induced fission by statistical (or Monte-Carlo) simulation of individual decay chains. After its initial excitation, the fissionable compound nucleus may either deexcite by evaporation or undergo binary fission into a large number of fission channels, each with different energetics involving both energy dissipation and deformed scission prefragments. After separation and Coulomb acceleration, each fission fragment undergoes a succession of individual (neutron) evaporations, leading to two bound but still excited fission products (that may further decay electromagnetically and, ultimately, weakly), as well as typically several neutrons. (The inclusion of other possible ejectiles is planned.) This kind of approach makes it possible to study more detailed observables than could be addressed with previous treatments, which have tended to focus on average quantities. In particular, any type of correlation observable can readily be extracted from a generated set of events. With a view towards making the code practically useful in a variety of applications, emphasis is being put on making it numerically efficient so that large event samples can be generated quickly. In its present form, the code can generate one million full events in about 12 seconds on a MacBook laptop computer. The development of this qualitatively new tool is still at an early stage and quantitative reproduction of existing data should not be expected until a number of detailed refinements have been implemented.

  2. A novel construction method of QC-LDPC codes based on CRT for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher, respectively, than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all these five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.
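
    Independent of the CRT-based selection of shift values (not reproduced here), a QC-LDPC parity-check matrix is obtained by expanding a base matrix of cyclic-shift exponents into circulant permutation blocks, as in this Python sketch with an invented base matrix and lift size:

        import numpy as np

        def expand_qc_ldpc(base, z):
            """Expand a base matrix of shift exponents (-1 = all-zero block)
            into a binary quasi-cyclic parity-check matrix of z x z circulants."""
            rows = []
            for row in base:
                blocks = [np.zeros((z, z), dtype=int) if s < 0
                          else np.roll(np.eye(z, dtype=int), s, axis=1)
                          for s in row]
                rows.append(np.hstack(blocks))
            return np.vstack(rows)

        base = [[0, 1, -1, 2],
                [2, -1, 0, 1]]      # illustrative exponents only
        H = expand_qc_ldpc(base, z=4)
        print(H.shape)  # (8, 16)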

  3. Development and application of a system analysis code for liquid fueled molten salt reactors based on RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Chengbin [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Cheng, Maosong, E-mail: mscheng@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Liu, Guimin [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China)

    2016-08-15

    Highlights: • New point kinetics and thermo-hydraulics models as well as a numerical method are added to the RELAP5 code to make it suitable for liquid fueled molten salt reactors. • The extended RELAP5 code is verified against the experimental benchmarks of the MSRE. • Different transient scenarios of the MSBR are simulated to evaluate its performance during the transients. - Abstract: The molten salt reactor (MSR) is one of the six advanced reactor concepts declared by the Generation IV International Forum (GIF), and can be characterized by attractive attributes such as inherent safety, economic efficiency, natural resource protection, sustainable development and nuclear non-proliferation. It is important to perform system safety analyses for MSR nuclear power plants. In this paper, in order to develop a system analysis code suitable for liquid fueled molten salt reactors, the point kinetics and thermal-hydraulic models as well as the numerical method in the thermal-hydraulic transient code Reactor Excursion and Leak Analysis Program (RELAP5), developed at the Idaho National Engineering Laboratory (INEL) for the U.S. Nuclear Regulatory Commission (NRC), are extended and verified against Molten Salt Reactor Experiment (MSRE) experimental benchmarks. Four transient scenarios of the Molten Salt Breeder Reactor (MSBR), including a load demand change, a primary flow transient, a secondary flow transient and a reactivity transient, are then modeled and simulated with the extended RELAP5 code in order to evaluate the performance of the reactor during these anticipated transient events. The results indicate that the extended RELAP5 code is effective and well suited to the liquid fueled molten salt reactor, and that the MSBR has strong inherent safety characteristics because of its large negative reactivity coefficient. In the future, the extended RELAP5 code will be used to perform transient safety analysis for a liquid fueled thorium molten salt reactor named TMSR-LF developed by the Center

  4. Development and application of a system analysis code for liquid fueled molten salt reactors based on RELAP5 code

    International Nuclear Information System (INIS)

    Shi, Chengbin; Cheng, Maosong; Liu, Guimin

    2016-01-01

    Highlights: • New point kinetics and thermo-hydraulics models as well as a numerical method are added to the RELAP5 code to make it suitable for liquid fueled molten salt reactors. • The extended RELAP5 code is verified against the experimental benchmarks of the MSRE. • Different transient scenarios of the MSBR are simulated to evaluate its performance during the transients. - Abstract: The molten salt reactor (MSR) is one of the six advanced reactor concepts declared by the Generation IV International Forum (GIF), and can be characterized by attractive attributes such as inherent safety, economic efficiency, natural resource protection, sustainable development and nuclear non-proliferation. It is important to perform system safety analyses for MSR nuclear power plants. In this paper, in order to develop a system analysis code suitable for liquid fueled molten salt reactors, the point kinetics and thermal-hydraulic models as well as the numerical method in the thermal-hydraulic transient code Reactor Excursion and Leak Analysis Program (RELAP5), developed at the Idaho National Engineering Laboratory (INEL) for the U.S. Nuclear Regulatory Commission (NRC), are extended and verified against Molten Salt Reactor Experiment (MSRE) experimental benchmarks. Four transient scenarios of the Molten Salt Breeder Reactor (MSBR), including a load demand change, a primary flow transient, a secondary flow transient and a reactivity transient, are then modeled and simulated with the extended RELAP5 code in order to evaluate the performance of the reactor during these anticipated transient events. The results indicate that the extended RELAP5 code is effective and well suited to the liquid fueled molten salt reactor, and that the MSBR has strong inherent safety characteristics because of its large negative reactivity coefficient. In the future, the extended RELAP5 code will be used to perform transient safety analysis for a liquid fueled thorium molten salt reactor named TMSR-LF developed by the Center

  5. The sequence coding and search system: An approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1989-04-01

    The US Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data are essential for performing safety and reliability analyses, especially analyses of trends and patterns, to identify undesirable changes in plant performance at the earliest opportunity and to implement corrective measures that preclude the occurrence of a more serious event. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objectives of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system, from the initial developmental stage to the point of a mature operating system, are highlighted.
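
    The underlying idea of coded, searchable event sequences can be sketched with a toy in-memory structure in Python; the field names and codes below are invented and do not reflect the actual SCSS schema.

        # Hypothetical coded event sequences: each LER becomes an ordered list of
        # (component, failure-mode) codes, which can then be searched for patterns.
        sequences = {
            "LER-80-001": [("PUMP", "FAIL_TO_START"), ("VALVE", "STUCK_OPEN")],
            "LER-80-002": [("VALVE", "STUCK_OPEN"), ("PUMP", "TRIP")],
            "LER-81-014": [("PUMP", "FAIL_TO_START"), ("DG", "FAIL_TO_LOAD")],
        }

        def find(step):
            """Return the LER ids whose coded sequence contains the given step."""
            return [ler for ler, seq in sequences.items() if step in seq]

        print(find(("PUMP", "FAIL_TO_START")))  # ['LER-80-001', 'LER-81-014']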

  6. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events

    DEFF Research Database (Denmark)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz

    2018-01-01

    PURPOSE: It is not possible to fully assess intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of intention of suicide attempts/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK-algorithm), assessing the positive predictive value (PPV) for suicidal events overall, by gender, age groups, and calendar time. RESULTS: We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxications and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). CONCLUSIONS: The DK-algorithm can reliably identify self-harm with suicidal intention in 52...

  7. Development of best estimate auditing code for CANDU thermal-hydraulic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Won Jae; Hwang, Moon Kyu; Lim, Hong Sik [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    The main purpose of this study is to develop a thermal-hydraulic auditing code for the CANDU reactor by modifying the models of the existing PWR auditing tool, RELAP5/MOD3. The study was performed by reconsidering previous code assessment work and identifying the phenomena for the essential accident scenarios. Areas for model improvement in the auditing tool were identified based on the code comparison and PIRT results. Nine models have been improved significantly for the analysis of LOCA and non-LOCA events. Conceptual problems or separate effect assessments have been performed to verify the model improvements. A linked calculation with CONTAIN 2.0 has also been enabled to establish a unified auditing code system. Analyses of a real CANDU plant transient and a hypothetical LOCA have been performed using the improved version. It has been concluded that the developed version can be utilized for auditing analyses of LOCA and non-LOCA events for the CANDU reactor. 25 refs., 84 figs., 36 tabs. (Author)

  8. Field-based tests of geochemical modeling codes: New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1993-12-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal fields suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions.
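
    At its core, the fluid-mineral equilibrium comparison that such codes perform reduces, for each mineral, to a saturation index SI = log10(IAP/K). A minimal Python sketch, with invented numbers rather than Wairakei or Kawerau data:

        import math

        def saturation_index(ion_activity_product, log_k):
            """SI = log10(IAP) - log10(K): >0 oversaturated, <0 undersaturated."""
            return math.log10(ion_activity_product) - log_k

        # Hypothetical mineral with log K = -8.5 at the sampled temperature:
        print(saturation_index(ion_activity_product=1e-9, log_k=-8.5))  # -0.5 -> undersaturated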

  9. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures

  10. Wavelet based multicarrier code division multiple access ...

    African Journals Online (AJOL)

    This paper presents the study on Wavelet transform based Multicarrier Code Division Multiple Access (MC-CDMA) system for a downlink wireless channel. The performance of the system is studied for Additive White Gaussian Noise Channel (AWGN) and slowly varying multipath channels. The bit error rate (BER) versus ...

  11. Simulation of overpressure events with a Laguna Verde model for the RELAP code to conditions of extended power up rate

    International Nuclear Information System (INIS)

    Rodriguez H, A.; Araiza M, E.; Fuentes M, L.; Ortiz V, J.

    2012-10-01

    In this work the main results of the simulation of overpressure events are presented, using a model of the Laguna Verde nuclear power plant developed for the RELAP/SCDAPSIM code. The starting point is a Laguna Verde model that represents a steady state at conditions similar to operation of the plant with Extended Power Uprate (EPU). The simulated pressure transients are compared with those documented in the Final Safety Analysis Report (FSAR) of Laguna Verde. The results of the turbine trip transient with and without main turbine bypass are shown, as well as the event of closure of all main steam isolation valves. A preliminary simulation was made and, based on its results, some adjustments were made for operation with EPU, taking into account the Operational Technical Specifications of the plant. The results of the final simulations were compared with, and analyzed against, the content of the FSAR. The response of the plant to the transients, reflected in the RELAP model, was satisfactory. Finally, comments about improvements to the model are included, for example, the response times of the protection and mitigation systems of the plant. (Author)

  12. Lossless Image Compression Based on Multiple-Tables Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Rung-Ching Chen

    2009-01-01

    This paper presents a lossless image compression method based on multiple-tables arithmetic coding (MTAC) to encode a gray-level image f. First, the MTAC method employs a median edge detector (MED) to reduce the entropy rate of f, exploiting the fact that the gray levels of two adjacent pixels in an image are usually similar. A base-switching transformation approach is then used to reduce the spatial redundancy of the image, since the gray levels of some pixels in an image are more common than those of others. Finally, the arithmetic encoding method is applied to reduce the coding redundancy of the image. To promote high performance of the arithmetic encoding method, the MTAC method first classifies the data and then encodes each cluster of data using a distinct code table. The experimental results show that, in most cases, the MTAC method provides a higher efficiency in use of storage space than lossless JPEG2000 does.
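
    The median edge detector used in the first MTAC stage is commonly formulated as the gradient-adjusted predictor below (the same predictor used in JPEG-LS); a minimal Python version:

        def med_predict(a, b, c):
            """MED prediction of pixel x from left (a), above (b), upper-left (c)."""
            if c >= max(a, b):
                return min(a, b)   # edge detected above or to the left
            if c <= min(a, b):
                return max(a, b)
            return a + b - c       # smooth region: planar prediction

        # The residual x - med_predict(a, b, c) is what gets entropy coded.
        print(med_predict(100, 120, 90))  # 120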

  13. A seismic data compression system using subband coding

    Science.gov (United States)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
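
    A one-level subband split of the kind used in the decorrelation stage can be sketched in Python with the Haar filter pair; the quantization step applied to the detail band stands in for the controlled-distortion stage and is an arbitrary illustrative value.

        import numpy as np

        def haar_analysis(x):
            """One-level Haar subband split: low band (averages), high band (details)."""
            x = np.asarray(x, dtype=float)
            low  = (x[0::2] + x[1::2]) / np.sqrt(2.0)
            high = (x[0::2] - x[1::2]) / np.sqrt(2.0)
            return low, high

        def haar_synthesis(low, high):
            x = np.empty(2 * len(low))
            x[0::2] = (low + high) / np.sqrt(2.0)
            x[1::2] = (low - high) / np.sqrt(2.0)
            return x

        seis = np.sin(np.linspace(0, 20, 64)) + 0.1 * np.random.randn(64)
        low, high = haar_analysis(seis)
        qstep = 0.05
        high_q = np.round(high / qstep) * qstep   # controlled amount of distortion
        print(np.max(np.abs(seis - haar_synthesis(low, high_q))))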

  14. External man-induced events in relation to nuclear power plant siting

    International Nuclear Information System (INIS)

    1981-01-01

    This Safety Guide recommends procedures and provides information for use in implementing that part of the Code of Safety in Nuclear Power Plant Siting (IAEA Safety Series No. 50-C-S) which concerns man-induced events external to the plant, up to the evaluation of the corresponding design basis parameters. Like the Code, the Guide forms part of the IAEA's programme, referred to as the NUSS programme, for establishing codes of practice and safety guides relating to land-based stationary thermal neutron power plants.

  15. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenarios Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
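
    The recursive solution algorithm can be illustrated with a toy Python enumerator that walks probabilistic branches and yields every scenario with its likelihood; the two-object model below is invented.

        def scenarios(objects, prefix=(), prob=1.0):
            """Recursively enumerate all branch combinations with their probabilities."""
            if not objects:
                yield prefix, prob
                return
            name, branches = objects[0]
            for outcome, p in branches.items():
                yield from scenarios(objects[1:], prefix + ((name, outcome),), prob * p)

        # Hypothetical two-object model with probabilistic branching:
        model = [
            ("pump",  {"runs": 0.95, "fails": 0.05}),
            ("valve", {"opens": 0.99, "sticks": 0.01}),
        ]
        for path, p in scenarios(model):
            print(f"{p:.4f}", dict(path))   # includes the rare 0.0005 scenario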

  16. Coding paediatric outpatient data to provide health planners with information on children with chronic conditions and disabilities.

    Science.gov (United States)

    Craig, Elizabeth; Kerr, Neal; McDonald, Gabrielle

    2017-03-01

    In New Zealand, there is a paucity of information on children with chronic conditions and disabilities (CCD). One reason is that many are managed in hospital outpatient settings, where diagnostic coding of health-care events does not occur. This study explores the feasibility of coding paediatric outpatient data to provide health planners with information on children with CCD. Thirty-seven clinicians from six District Health Boards (DHBs) trialled coding over 12 weeks. In five DHBs, the International Classification of Diseases and Related Health Problems, 10th Edition, Australian Modification (ICD-10-AM) and the Systematised Nomenclature of Medicine Clinical Terms (SNOMED-CT) were trialled for 6 weeks each. In one DHB, ICD-10-AM was trialled for 12 weeks. A random sample (30%) of ICD-10-AM coded events was also coded by clinical coders. A mix of paper and electronic methods was used. In total, 2,604 outpatient events were coded in ICD-10-AM and 693 in SNOMED-CT. Dual coding occurred for 770 (29.6%) ICD-10-AM events. Overall, 34% of ICD-10-AM and 40% of SNOMED-CT events were for developmental and behavioural disorders. Chronic medical conditions were also common. Clinicians were concerned about the workload impacts, particularly for paper-based methods. Coders were concerned about clinicians' adherence to coding guidelines and the poor quality of documentation in some notes. Coded outpatient data could provide planners with a rich source of information on children with CCD. However, coding is also resource intensive, so its costs need to be weighed against the costs of managing a much larger health budget using very limited information. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).

  17. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes the architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.

  18. Contributions to a Brazilian Code of Conduct for Fieldwork in Geology: an approach based on Geoconservation and Geoethics.

    Science.gov (United States)

    Mansur, Kátia L; Ponciano, Luiza C M O; Castro, Aline R S F DE

    2017-05-01

    Considering the numerous events that have blocked the development of scientific projects or caused the destruction of outcrops, there is clearly a rapidly increasing need to define a Brazilian Code of Conduct for geological fieldwork. In general, this destruction is attributed to a lack of knowledge of the relevance of geological sites. The aim of this Code of Conduct is to guide geologists to adopt good practices during geoscience activities. The proposed guidelines are based on Codes of Conduct from other countries, mainly Scotland and England, on situations described in papers, and on the personal experience of the authors. In this paper 29 points are suggested in order to guarantee that fieldwork is conducted in accordance with geoethics, geoconservation and sustainability values. The proposal is structured in three parts: (1) behavior and practices with respect to local traditions and providing information to the population; (2) measures to minimize degradation of outcrops; and (3) safety. The proposal seeks to broaden the debate on the need for responsible behavior during fieldwork, in order to promote respect for geodiversity. Through this code, Brazilian geoscientists will be able to contribute to the conservation of geological heritage and of outcrops with special educational relevance.

  19. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibration on a decision theoretical basis is also considered and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown.

  20. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information on nuclear events that occur at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to INES with the 'Scale', a consistent safety significance indicator, which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety significant events of level 2 or higher, based on the INES reports. (author)

  1. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases, as well as to consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  2. Short-Block Protograph-Based LDPC Codes

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve a low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design high-speed iterative decoders that utilize belief-propagation algorithms.

  3. Evolutionary implications of genetic code deviations

    International Nuclear Information System (INIS)

    Chela Flores, J.

    1986-07-01

    By extending the standard genetic code into a temperature dependent regime, we propose a train of molecular events leading to alternative coding. The first few examples of these deviations have already been reported in some ciliated protozoans and Gram positive bacteria. A possible range of further alternative coding, still within the context of universality, is pointed out. (author)

  4. Trellis-coded CPM for satellite-based mobile communications

    Science.gov (United States)

    Abrishamkar, Farrokh; Biglieri, Ezio

    1988-01-01

    Digital transmission for satellite-based land mobile communications is discussed. To satisfy the power and bandwidth limitations imposed on such systems, a combination of trellis coding and continuous-phase modulated signals are considered. Some schemes based on this idea are presented, and their performance is analyzed by computer simulation. The results obtained show that a scheme based on directional detection and Viterbi decoding appears promising for practical applications.

  5. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Science.gov (United States)

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.
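
    The ROI/non-ROI otherness encoding can be approximated by assigning a lower quantization parameter to blocks that overlap the diagnostic ROI, as in this Python sketch; the QP values and block size are illustrative assumptions, not the paper's tuned parameters.

        import numpy as np

        def qp_map(roi_mask, block=64, qp_roi=27, qp_bg=37):
            """Per-block QP selection: blocks overlapping the ROI get the lower QP."""
            h, w = roi_mask.shape
            rows, cols = h // block, w // block
            qps = np.full((rows, cols), qp_bg)
            for r in range(rows):
                for c in range(cols):
                    tile = roi_mask[r*block:(r+1)*block, c*block:(c+1)*block]
                    if tile.any():
                        qps[r, c] = qp_roi
            return qps

        mask = np.zeros((256, 256), dtype=bool)
        mask[64:128, 64:192] = True       # hypothetical diagnostic region
        print(qp_map(mask))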

  6. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Directory of Open Access Journals (Sweden)

    Yueying Wu

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.

  7. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which comprises mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred at a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  8. Image Coding Based on Address Vector Quantization.

    Science.gov (United States)

    Feng, Yushu

    Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images. Extensions of the Vector Quantization technique to the Address Vector Quantization method have been investigated. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; the index is sent to the channel. Reconstruction of the image is done by using a table lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in chapter 1. Chapter 2 gives an overview of codebook design methods, including the Kohonen neural network for codebook design. During the encoding process, the correlation of the addresses is considered, and Address Vector Quantization is developed for color image and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in chapter 3. In order to overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in chapter 4. This approach gives the same performance as the normal VQ scheme, but at a bit rate of about 1/2 to 1/3 of that of the normal VQ method. In chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix used to select the best subcodebook to encode the image, is developed. In chapter 6, a new adaptive vector quantization scheme, suitable for color video coding, called "A Self-Organizing

  9. The histone codes for meiosis.

    Science.gov (United States)

    Wang, Lina; Xu, Zhiliang; Khawar, Muhammad Babar; Liu, Chao; Li, Wei

    2017-09-01

    Meiosis is a specialized process that produces haploid gametes from diploid cells by a single round of DNA replication followed by two successive cell divisions. It contains many special events, such as programmed DNA double-strand break (DSB) formation, homologous recombination, crossover formation and resolution. These events are associated with dynamically regulated chromosomal structures, the dynamic transcriptional regulation and chromatin remodeling are mainly modulated by histone modifications, termed 'histone codes'. The purpose of this review is to summarize the histone codes that are required for meiosis during spermatogenesis and oogenesis, involving meiosis resumption, meiotic asymmetric division and other cellular processes. We not only systematically review the functional roles of histone codes in meiosis but also discuss future trends and perspectives in this field. © 2017 Society for Reproduction and Fertility.

  10. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    Science.gov (United States)

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF store based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype is successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation is performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query services.
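
    The XML2RDF transformation step can be sketched in Python with rdflib; the namespace, predicate names and code value below are invented placeholders rather than the actual ADEpedia vocabulary.

        from rdflib import Graph, Literal, Namespace

        EX = Namespace("http://example.org/adepedia/")  # hypothetical namespace

        g = Graph()
        ade = EX["ade/0001"]
        g.add((ade, EX.drug, Literal("warfarin")))
        g.add((ade, EX.reaction, Literal("gastrointestinal haemorrhage")))
        g.add((ade, EX.reactionCode, Literal("10017955")))  # illustrative code only

        print(g.serialize(format="turtle"))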

  11. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases, as well as to consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  12. The PHREEQE Geochemical equilibrium code data base and calculations

    International Nuclear Information System (INIS)

    Andersoon, K.

    1987-01-01

    Compilation of a thermodynamic data base for actinides and fission products for use with PHREEQE has begun, and a preliminary set of actinide data has been tested with a version of the PHREEQE code run on an IBM XT computer. The work so far has shown that the PHREEQE code mostly gives satisfactory results for speciation of actinides in natural water environments. For U and Np under oxidizing conditions, however, the code has difficulty converging with pH and Eh conserved when a solubility limit is applied. For further calculations of actinide and fission product speciation and solubility in a waste repository and in the surrounding geosphere, more data are needed. It is necessary to evaluate the influence of the large uncertainties in some data. Quality assurance and a check on the consistency of the data base are also needed. Further work with the data base should include: an extension to fission products, an extension to engineering materials, an extension to ligands other than hydroxide and carbonate, inclusion of more mineral phases, inclusion of enthalpy data, a control of primary references in order to decide whether values from different compilations are taken from the same primary reference, and contacts and discussions with other groups working with actinide data bases, e.g. at the OECD/NEA and at the IAEA. (author)

  13. The assessment of containment codes by experiments simulating severe accident scenarios

    International Nuclear Information System (INIS)

    Karwat, H.

    1992-01-01

    Hitherto, a generally applicable validation matrix for codes simulating the containment behaviour under severe accident conditions did not exist. Past code applications have shown that most problems may be traced back to inaccurate thermalhydraulic parameters governing gas- or aerosol-distribution events. A provisional code-validation matrix is proposed, based on a careful selection of containment experiments performed during recent years in relevant test facilities under various operating conditions. The matrix focuses on the thermalhydraulic aspects of the containment behaviour after severe accidents as a first important step. It may be supplemented in the future by additional suitable tests

  14. A Low-Jitter Wireless Transmission Based on Buffer Management in Coding-Aware Routing

    Directory of Open Access Journals (Sweden)

    Cunbo Lu

    2015-08-01

    Reducing packet jitter is important for real-time applications in a wireless network. Existing coding-aware routing algorithms use the opportunistic network coding (ONC) scheme in the packet coding algorithm. The ONC scheme never delays packets to wait for the arrival of a future coding opportunity, so some potential coding opportunities are lost, which may degrade the contribution of network coding to jitter performance. In addition, most existing coding-aware routing algorithms assume that all flows participating in the network have equal rates. This is unrealistic, since multi-rate environments often occur. To overcome the above problems and extend coding-aware routing to multi-rate scenarios, from the viewpoint of data transmission we present a low-jitter wireless transmission algorithm based on buffer management (BLJCAR), which schedules packets at the coding node according to a queue-length-based threshold policy instead of the regular ONC policy used in existing coding-aware routing algorithms. BLJCAR is a unified framework that merges the single-rate and multi-rate cases. Simulation results show that the BLJCAR algorithm embedded in coding-aware routing outperforms the traditional ONC policy in terms of jitter, packet delivery delay, packet loss ratio and network throughput under network congestion at any traffic rate.
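
    The queue-length-based threshold policy can be sketched in Python as follows: a packet waits for a coding partner only while the output queue is short, and is sent natively once the queue exceeds a threshold. The threshold value and packet model are invented for the example.

        from collections import deque

        THRESHOLD = 5  # hypothetical queue-length threshold

        def transmit(queue, can_code_now):
            """Send the head-of-line packet, coding it if a partner is present;
            otherwise defer only while the queue is short (BLJCAR-style policy)."""
            if not queue:
                return None
            if can_code_now:
                return ("coded", queue.popleft())
            if len(queue) >= THRESHOLD:      # too long to keep waiting: send natively
                return ("native", queue.popleft())
            return None                       # defer, hoping for a coding partner

        q = deque(["p1", "p2", "p3", "p4", "p5", "p6"])
        print(transmit(q, can_code_now=False))  # ('native', 'p1') since len(q) >= 5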

  15. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in a DWT-based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard code, orthogonal Gold code and Golay complementary sequences) using Forward Error Correction (FEC) in the proposed system. The data are analyzed and compared among the different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...

  16. Diagnosis-based and external cause-based criteria to identify adverse drug reactions in hospital ICD-coded data: application to an Australian population-based study

    Directory of Open Access Journals (Sweden)

    Wei Du

    2017-04-01

    Objectives: External cause International Classification of Diseases (ICD) codes are commonly used to ascertain adverse drug reactions (ADRs) related to hospitalisation. We quantified ascertainment of ADR-related hospitalisation using external cause codes and additional ICD-based hospital diagnosis codes. Methods: We reviewed the scientific literature to identify different ICD-based criteria for ADR-related hospitalisations, developed algorithms to capture ADRs based on candidate hospital ICD-10 diagnoses and external cause codes (Y40–Y59), and incorporated previously published causality ratings estimating the probability that a specific diagnosis was ADR related. We applied the algorithms to the NSW Admitted Patient Data Collection records of 45 and Up Study participants (2011–2013). Results: Of 493 442 hospitalisations among 267 153 study participants during 2011–2013, 18.8% (n = 92 953) had hospital diagnosis codes that were potentially ADR related; 1.1% (n = 5305) had high/very high–probability ADR-related diagnosis codes (causality ratings: A1 and A2); and 2.0% (n = 10 039) had ADR-related external cause codes. Overall, 2.2% (n = 11 082) of cases were classified as including an ADR-related hospitalisation on either external cause codes or high/very high–probability ADR-related diagnosis codes. Hence, adding high/very high–probability ADR-related hospitalisation codes to the standard external cause codes alone (Y40–Y59) increased the number of hospitalisations classified as having an ADR-related diagnosis by 10.4%. Only 6.7% of cases with high-probability ADR-related mental symptoms were captured by external cause codes. Conclusion: Selective use of high-probability ADR-related hospital diagnosis codes in addition to external cause codes yielded a modest increase in hospitalised ADR incidence, which is of potential clinical significance. Clinically validated combinations of diagnosis codes could potentially further enhance capture.
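
    A toy illustration of the combined ascertainment rule described above follows; the diagnosis codes and causality ratings used here are placeholders, not the study's actual list.

      # Hypothetical sketch: flag a hospitalisation as ADR-related if it carries
      # an external cause code in Y40-Y59 or a diagnosis code with a high/very
      # high ADR causality rating (A1/A2). Example codes are placeholders.
      HIGH_PROBABILITY = {"L27.0": "A1", "D61.1": "A2"}  # assumed example ratings

      def is_adr_related(diagnosis_codes, external_cause_codes):
          external_hit = any("Y40" <= c[:3] <= "Y59" for c in external_cause_codes)
          diagnosis_hit = any(HIGH_PROBABILITY.get(c) in ("A1", "A2")
                              for c in diagnosis_codes)
          return external_hit or diagnosis_hit

      print(is_adr_related(["L27.0"], []))         # True via A1 diagnosis code
      print(is_adr_related(["K52.1"], ["Y43.3"]))  # True via external cause code
      print(is_adr_related(["K52.1"], []))         # False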

  17. LSB-Based Steganography Using Reflected Gray Code

    Science.gov (United States)

    Chen, Chang-Chu; Chang, Chin-Chen

    Steganography aims to hide secret data in an innocuous cover-medium for transmission, so that an attacker cannot easily recognize the presence of the secret data. Even if the stego-medium is captured by an eavesdropper, the slight distortion is hard to detect. LSB-based data hiding is one of the steganographic methods used to embed secret data into the least significant bits of the pixel values of a cover image. In this paper, we propose an LSB-based scheme using reflected Gray code, which is applied to determine the embedded bit from the secret information. Following the transforming rule, the LSBs of the stego-image are not always equal to the secret bits, and experiments show that they differ in up to almost 50% of cases. According to the mathematical deduction and experimental results, the proposed scheme has the same image quality and payload as the simple LSB substitution scheme. In fact, our proposed data hiding scheme in the case of the G1 (one-bit Gray code) system is equivalent to the simple LSB substitution scheme.
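
    A minimal sketch of the idea, assuming 8-bit grey-scale pixels, is shown below; it illustrates the general principle of embedding in the reflected-Gray-code domain, not the authors' exact algorithm.

      def to_gray(n):
          return n ^ (n >> 1)

      def from_gray(g):
          n = 0
          while g:
              n ^= g
              g >>= 1
          return n

      def embed_bit(pixel, secret_bit):
          g = to_gray(pixel)
          g = (g & ~1) | secret_bit      # set the LSB of the reflected Gray code
          return from_gray(g)

      def extract_bit(pixel):
          return to_gray(pixel) & 1      # read the bit back from the Gray domain

      stego = embed_bit(154, 0)          # -> 155: the pixel changes by only 1
      assert extract_bit(stego) == 0     # note: the stego pixel's own LSB is 1,
                                         # yet the embedded secret bit is 0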

  18. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and was strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
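
    As a rough illustration of threshold-and-window event detection of the kind described, the sketch below flags upward threshold crossings whose local peak falls inside an amplitude window; all parameter names and values are assumptions, not g-PRIME's actual interface.

      import numpy as np

      def detect_events(signal, fs, threshold, amp_lo, amp_hi, peak_win=0.002):
          """Return (time, peak) pairs for upward threshold crossings whose
          local peak lies inside the amplitude window [amp_lo, amp_hi]."""
          w = max(1, int(peak_win * fs))
          crossings = np.flatnonzero((signal[1:] >= threshold)
                                     & (signal[:-1] < threshold))
          events = []
          for i in crossings:
              peak = signal[i:i + w].max()
              if amp_lo <= peak <= amp_hi:
                  events.append((i / fs, peak))
          return events

      fs = 10_000.0
      rng = np.random.default_rng(0)
      sig = 0.05 * rng.standard_normal(1000)
      sig[200] = 1.0                                   # inject one spike
      print(detect_events(sig, fs, 0.5, 0.8, 1.2))     # -> [(0.0199, 1.0)]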

  19. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for the optical code-division multiple-access (OCDMA) system is proposed. Based on the ability of the complete complementary (CC) code, multiple-access interference (MAI) can be suppressed and eliminated via a spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of the encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide gratings (AWG) and fiber Bragg gratings (FBG). The system shows superior performance compared to previous bipolar-bipolar coding OCDMA systems.

  20. Single-Trial Evoked Potential Estimating Based on Sparse Coding under Impulsive Noise Environment

    Directory of Open Access Journals (Sweden)

    Nannan Yu

    2018-01-01

    Estimating single-trial evoked potentials (EPs) corrupted by the spontaneous electroencephalogram (EEG) can be regarded as a signal denoising problem. Sparse coding has had significant success in signal denoising, and EPs have been shown to have strong sparsity over an appropriate dictionary. In sparse coding, the noise is generally assumed to be a Gaussian random process. However, some studies have shown that the background noise in EPs may have an impulsive character that is far from Gaussian but well modeled by the α-stable distribution (1 < α ≤ 2). Consequently, the performance of general sparse coding degrades or even fails. In view of this, we present a new sparse coding algorithm using p-norm optimization for single-trial EP estimation. The algorithm can track the underlying EPs corrupted by α-stable distribution noise, trial by trial, without the need to estimate the α value. Simulations and experiments on human visual evoked potentials and event-related potentials are carried out to examine the performance of the proposed approach. Experimental results show that the proposed method is effective in estimating single-trial EPs in an impulsive noise environment.
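
    In symbols, the approach replaces the usual least-squares data-fidelity term with a p-norm term; a hedged reconstruction of the optimization problem (notation assumed here, not copied from the paper) is:

      \hat{\mathbf{s}} = \arg\min_{\mathbf{s}}
          \left\| \mathbf{x} - \mathbf{D}\mathbf{s} \right\|_p^p
          + \lambda \left\| \mathbf{s} \right\|_1,
      \qquad 1 \le p < \alpha \le 2,

    where x is the recorded single-trial signal, D is the dictionary over which EPs are sparse, s is the sparse code, and the p-norm data term damps the impulsive (α-stable) outliers that would dominate a squared-error fit.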

  1. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest High Efficiency Video Coding (HEVC) standard significantly increases encoding complexity in order to improve coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, an optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR are observed for 18 sequences when the target complexity is around 40%.

  2. GPU-accelerated 3D neutron diffusion code based on finite difference method

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Q.; Yu, G.; Wang, K. [Dept. of Engineering Physics, Tsinghua Univ. (China)

    2012-07-01

    The finite difference method, as a traditional numerical solution to the neutron diffusion equation, is considered simpler and more precise than coarse-mesh nodal methods, but its wide application has been limited by the huge memory and computation time it requires. In recent years, the concept of general-purpose computing on GPUs has provided a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, a HYPRE (High Performance Preconditioners)-based diffusion code and CITATION, were used as reference points to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using NVIDIA's GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was itself sped up by the SOR method and Chebyshev extrapolation. (authors)

  3. GPU-accelerated 3D neutron diffusion code based on finite difference method

    International Nuclear Information System (INIS)

    Xu, Q.; Yu, G.; Wang, K.

    2012-01-01

    The finite difference method, as a traditional numerical solution to the neutron diffusion equation, is considered simpler and more precise than coarse-mesh nodal methods, but its wide application has been limited by the huge memory and computation time it requires. In recent years, the concept of general-purpose computing on GPUs has provided a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, a HYPRE (High Performance Preconditioners)-based diffusion code and CITATION, were used as reference points to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using NVIDIA's GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was itself sped up by the SOR method and Chebyshev extrapolation. (authors)
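
    For orientation, a CPU reference sketch of one Jacobi sweep for a one-group, 2-D finite-difference diffusion problem is shown below; this is the kind of regular stencil update that maps naturally onto a CUDA kernel. The grid size and cross-section values are illustrative, not benchmark data.

      import numpy as np

      def jacobi_sweep(phi, src, D, siga, h):
          """One Jacobi update of -D*laplacian(phi) + siga*phi = src
          on a uniform grid with spacing h (interior points only)."""
          new = phi.copy()
          new[1:-1, 1:-1] = (src[1:-1, 1:-1] * h * h
                             + D * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                                    + phi[1:-1, :-2] + phi[1:-1, 2:])) \
                            / (4.0 * D + siga * h * h)
          return new

      n, h, D, siga = 64, 1.0, 1.4, 0.02        # illustrative values
      phi = np.zeros((n, n))
      src = np.ones((n, n))                     # flat fixed source
      for _ in range(200):
          phi = jacobi_sweep(phi, src, D, siga, h)
      print(round(float(phi.max()), 3))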

  4. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)

  5. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)
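
    The decomposition step can be reproduced with the open-source PyEMD package (an assumption for illustration; it is not the authors' code), using synthetic stand-in data:

      import numpy as np
      from PyEMD import EMD   # pip install EMD-signal

      t = np.linspace(0, 1, 500)
      # synthetic "price" series: fast fluctuation + slow cycle + rising trend
      price = np.sin(40 * t) + 0.5 * np.sin(8 * t) + 2.0 * t
      imfs = EMD().emd(price, t)   # intrinsic mode functions, fine to coarse
      print(len(imfs), "modes; the last (coarsest) mode approximates the trend")
      # Event analysis then asks which dominant mode(s) change around the event
      # window, and attributes the price impact to those modes.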

  6. An event database for rotational seismology

    Science.gov (United States)

    Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner

    2016-04-01

    The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, the first to provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (a JSON dictionary) which is easily readable and accessible on the website. The database contains >10000 events starting in 2007 (Mw>4.5). It is updated daily and therefore provides recent events with a time lag of at most 24 hours. The user interface allows events to be filtered by epoch, magnitude, and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with expectations from the surface wave magnitude scale. In addition, the website offers some Python source code examples for downloading and processing the openly accessible waveforms.
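
    The code examples on the site are ObsPy-based; a generic ObsPy request of the same flavour is sketched below. The FDSN service, network, station and channel given here are placeholders, not the database's actual endpoint.

      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("IRIS")                    # assumed public FDSN service
      t0 = UTCDateTime("2015-09-16T22:54:32")    # example event origin time
      # placeholder network/station/location/channel codes:
      st = client.get_waveforms("GR", "WET", "", "BHZ", t0, t0 + 3600)
      st.filter("bandpass", freqmin=0.01, freqmax=0.2)
      print(st)                                  # one hour of filtered waveform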

  7. Finger Vein Recognition Based on Local Directional Code

    Science.gov (United States)

    Meng, Xianjing; Yang, Gongping; Yin, Yilong; Xiao, Rongyang

    2012-01-01

    Finger vein patterns are considered one of the most promising biometric authentication methods because of their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP. PMID:23202194

  8. Finger Vein Recognition Based on Local Directional Code

    Directory of Open Access Journals (Sweden)

    Rongyang Xiao

    2012-11-01

    Finger vein patterns are considered one of the most promising biometric authentication methods because of their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP.
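
    The following rough sketch illustrates the descriptor's core principle, quantizing local gradient orientation into one of eight (octonary) codes; it is an illustration of the idea, not the authors' exact operator.

      import numpy as np

      def local_directional_code(img):
          gy, gx = np.gradient(img.astype(float))      # local gradient field
          theta = np.arctan2(gy, gx)                   # orientation in (-pi, pi]
          # quantize orientation into 8 directional bins -> codes 0..7
          return np.floor((theta + np.pi) / (2 * np.pi) * 8).astype(int) % 8

      img = np.random.default_rng(0).integers(0, 256, (64, 64))
      codes = local_directional_code(img)
      print(codes.min(), codes.max())                  # values span 0..7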

  9. Scintillator Based Coded-Aperture Imaging for Neutron Detection

    International Nuclear Information System (INIS)

    Hayes, Sean-C.; Gamage, Kelum-A-A.

    2013-06-01

    In this paper we assess the variations of neutron images using a series of Monte Carlo simulations. We study neutron images of the same neutron source at different source locations, using a scintillator-based coded-aperture system. The Monte Carlo simulations make use of the EJ-426 neutron scintillator detector. This type of detector has a low sensitivity to gamma rays and is therefore of particular use in a system whose source emits a mixed radiation field. Using different source locations, several neutron images have been produced and compared both qualitatively and quantitatively for each case. This allows conclusions to be drawn on how well suited the scintillator-based coded-aperture neutron imaging system is to detecting various neutron source locations. This type of neutron imaging system can easily be used to identify and locate nuclear materials precisely. (authors)

  10. Triboelectric-Based Transparent Secret Code.

    Science.gov (United States)

    Yuan, Zuqing; Du, Xinyu; Li, Nianwu; Yin, Yingying; Cao, Ran; Zhang, Xiuling; Zhao, Shuyu; Niu, Huidan; Jiang, Tao; Xu, Weihua; Wang, Zhong Lin; Li, Congju

    2018-04-01

    Private and security information for personal identification requires an encrypted tool to extend communication channels between human and machine through a convenient and secure method. Here, a triboelectric-based transparent secret code (TSC) that enables self-powered sensing and information identification simultaneously in a rapid process is reported. The transparent and hydrophobic TSC can conform to any cambered surface due to its high flexibility, which greatly extends its application scenarios. Independent of a power source, the TSC can induce obvious electric signals simply by surface contact. The TSC is velocity-dependent and capable of achieving a peak voltage of ≈4 V at a resistance load of 10 MΩ and a sliding speed of 0.1 m s⁻¹, for a 2 mm × 20 mm rectangular stripe. The fabricated TSC maintains its performance after about 5000 reciprocating rolling cycles. Applications of the TSC as a self-powered code device are demonstrated: the ordered signals can be recognized through the height of the electric peaks and further translated into specific information by the processing program. The designed TSC has great potential in personal identification, commodity circulation, valuables management, and security defense applications.

  11. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events.

    Science.gov (United States)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz; Pedersen, Carsten Bøcker; Mors, Ole; Christensen, Jakob

    2018-04-17

    It is not possible to fully assess the intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of the intention of suicide attempt/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK-algorithm) based on hospital discharge diagnoses and emergency room contacts. Of all 101 530 people identified with an incident suicide attempt/self-harm contact at Danish hospitals between 1995 and 2012 using the DK-algorithm, we selected a random sample of 475 people. We validated the DK-algorithm against medical records, applying the definitions and terminology of the Columbia Classification Algorithm of Suicide Assessment for suicidal events, nonsuicidal events, and indeterminate or potentially suicidal events. We calculated positive predictive values (PPVs) of the DK-algorithm to identify suicidal events overall, by gender, age group, and calendar time. We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxication and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). The DK-algorithm can reliably identify self-harm with suicidal intention in 52% of the identified cases of suicide attempts/self-harm. The PPVs could be used for quantitative bias analysis and implemented as weights in future studies to estimate the proportion of suicidal events among cases identified via the DK-algorithm. Copyright © 2018 John Wiley & Sons, Ltd.

  12. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulation of the interactions between reactor core neutron kinetics and plant thermal-hydraulics, by incorporating the 3-D reactor core kinetics analysis code MASTER into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes

  13. Variable disparity-motion estimation based fast three-view video coding

    Science.gov (United States)

    Bae, Kyung-Hoon; Kim, Seung-Cheol; Hwang, Yong Seok; Kim, Eun-Soo

    2009-02-01

    In this paper, variable disparity-motion estimation (VDME) based 3-view video coding is proposed. In the encoding, key-frame coding (KFC) based motion estimation and variable disparity estimation (VDE) are processed for effectively fast three-view video encoding. These proposed algorithms enhance the performance of the 3-D video encoding/decoding system in terms of disparity estimation accuracy and computational overhead. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm's PSNRs are 37.66 and 40.55 dB, and its processing times are 0.139 and 0.124 sec/frame, respectively.

  14. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus-nucleus reactions

    International Nuclear Information System (INIS)

    Sihver, L; Mancusi, D; Matthiae, D; Koi, T

    2008-01-01

    Radiation exposure of aircrew is increasingly recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which neutrons give a major contribution to the dose equivalent. Accurate estimates of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) can also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. for flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft, or aboard a spacecraft traveling outside the Earth's protective magnetosphere, is much harsher than that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, and of total dose equivalents and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  15. Dose calculations at high altitudes and in deep space with GEANT4 using BIC and JQMD models for nucleus-nucleus reactions

    Science.gov (United States)

    Sihver, L.; Matthiä, D.; Koi, T.; Mancusi, D.

    2008-10-01

    Radiation exposure of aircrew is increasingly recognized as an occupational hazard. The ionizing environment at standard commercial aircraft flight altitudes consists mainly of secondary particles, of which neutrons give a major contribution to the dose equivalent. Accurate estimates of neutron spectra in the atmosphere are therefore essential for correct calculations of aircrew doses. Energetic solar particle events (SPE) can also lead to significantly increased dose rates, especially on routes close to the North Pole, e.g. for flights between Europe and the USA. It is also well known that the radiation environment encountered by personnel aboard low Earth orbit (LEO) spacecraft, or aboard a spacecraft traveling outside the Earth's protective magnetosphere, is much harsher than that within the atmosphere, since the personnel are exposed to radiation from both galactic cosmic rays (GCR) and SPE. The relative contribution to the dose from GCR when traveling outside the Earth's magnetosphere, e.g. to the Moon or Mars, is even greater, and reliable and accurate particle and heavy ion transport codes are essential to calculate the radiation risks for both aircrew and personnel on spacecraft. We have therefore performed calculations of neutron distributions in the atmosphere, and of total dose equivalents and quality factors at different depths in a water sphere in an imaginary spacecraft during solar minimum in a geosynchronous orbit. The calculations were performed with the GEANT4 Monte Carlo (MC) code using both the binary cascade (BIC) model, which is part of the standard GEANT4 package, and the JQMD model, which is used in the particle and heavy ion transport code PHITS.

  16. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs (PP-CPNs), a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs cater for a four-phase, structure-based automatic code generation process directed by the control flow of processes. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  17. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    In order to increase the efficiency and security of file sharing in next-generation networks, this paper proposes a large-scale file sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all intermediate nodes perform a secure network coding operation before forwarding the received data. This process continues until all mobile devices in the network have successfully recovered the original file. The experimental results show that secure network coding is feasible and well suited to such file sharing. Moreover, the sharing efficiency and security outperform those of a traditional replication-based sharing scheme.
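
    At the coding layer, each forwarding node re-codes what it has received. The toy sketch below shows random linear network coding over GF(2); the paper's secure key handling is omitted, and all names and values are illustrative.

      import random

      def xor_combine(blocks, coeffs):
          """Return the GF(2) linear combination of the given blocks."""
          out = bytearray(len(blocks[0]))
          for c, b in zip(coeffs, blocks):
              if c:
                  out = bytearray(x ^ y for x, y in zip(out, b))
          return bytes(out)

      k = 3
      source_blocks = [bytes([i]) * 8 for i in (1, 2, 3)]   # original pieces
      coded_packets = []
      while len(coded_packets) < k + 1:                     # a little redundancy
          coeffs = [random.randint(0, 1) for _ in range(k)]
          if any(coeffs):
              coded_packets.append((coeffs, xor_combine(source_blocks, coeffs)))
      # A receiver recovers the file from any k packets whose coefficient
      # vectors are linearly independent, via Gaussian elimination over GF(2).
      print(coded_packets[0])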

  18. Field-based tests of geochemical modeling codes using New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1994-06-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand, are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal fields suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions.

  19. Observation of galactic cosmic ray spallation events from the SoHO mission: 20-year operation of LASCO

    Science.gov (United States)

    Koutchmy, S.; Tavabi, E.; Urtado, O.

    2018-05-01

    A shower of secondary cosmic ray (CR) particles is produced at high altitudes in the Earth's atmosphere, so primordial Galactic Cosmic Rays (GCRs) can only be measured directly outside the Earth's magnetosphere and atmosphere. They approach the Earth and other planets in a complex rigidity-dependent pattern, and are generally excluded by the magnetosphere. GCRs revealed by images of single nuclear reactions, also called spallation events, are described here. Such an event was seen on Nov. 29, 2015 in a routine image from the LASCO C3 space coronagraph, taken during the Solar and Heliospheric Observatory (SoHO) mission observing uninterruptedly at the Lagrangian L1 point. The spallation signature of a GCR identified well outside the Earth's magnetosphere is obtained for the first time. The resulting image includes several diverging linear "tracks" of varying intensity leading to a single pixel; this frame identifies the site of the event on the silicon CCD chip of the coronagraph camera. There was no solar flare reported at that time, no Coronal Mass Ejection (CME), and no evidence of optical debris around the spacecraft. More examples of smaller CR events have been discovered throughout the 20 years of continuous observations from SoHO. This is the first spallation event from a CR recorded outside the Earth's magnetosphere. We evaluate the probable energy of these events, suggesting a plausible galactic source.

  20. Wired World-Wide Web Interactive Remote Event Display

    Energy Technology Data Exchange (ETDEWEB)

    De Groot, Nicolo

    2003-05-07

    WIRED (World-Wide Web Interactive Remote Event Display) is a framework, written in the Java™ language, for building High Energy Physics event displays. An event display based on the WIRED framework enables users of a HEP collaboration to visualize and analyze events remotely using ordinary WWW browsers, on any type of machine. In addition, event displays using WIRED may provide the general public with access to high energy physics research. The recent introduction of the object-oriented Java™ language enables the transfer of machine-independent code across the Internet, to be safely executed by a Java-enhanced WWW browser. We have employed this technology to create a remote event display in WWW. The combined Java-WWW technology hence ensures world-wide availability of such an event display, an always up-to-date program, and a platform-independent implementation, which is easy to use and to install.

  1. Nine-year-old children use norm-based coding to visually represent facial expression.

    Science.gov (United States)

    Burton, Nichola; Jeffery, Linda; Skinner, Andrew L; Benton, Christopher P; Rhodes, Gillian

    2013-10-01

    Children are less skilled than adults at making judgments about facial expression. This could be because they have not yet developed adult-like mechanisms for visually representing faces. Adults are thought to represent faces in a multidimensional face-space, and have been shown to code the expression of a face relative to the norm, or average, face in face-space. Norm-based coding is economical and adaptive, and may be what makes adults more sensitive to facial expression than children. This study investigated the coding system that children use to represent facial expression. An adaptation aftereffect paradigm was used to test 24 adults and 18 children (9 years 2 months to 9 years 11 months old). Participants adapted to weak and strong antiexpressions. They then judged the expression of an average face. Adaptation created aftereffects that made the test face look like the expression opposite that of the adaptor. Consistent with the predictions of norm-based but not exemplar-based coding, aftereffects were larger for strong than weak adaptors for both age groups. Results indicate that, like adults, children's coding of facial expressions is norm-based. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  2. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance in documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting, and implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before the intervention (FY2012) with prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management (E&M) codes as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased significantly, from 2.14 to 3.05, reflecting improved coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher-complexity E&M coding in a stable patient population, based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  3. McBits: fast constant-time code-based cryptography

    NARCIS (Netherlands)

    Bernstein, D.J.; Chou, T.; Schwabe, P.

    2015-01-01

    This paper presents extremely fast algorithms for code-based public-key cryptography, including full protection against timing attacks. For example, at a 2^128 security level, this paper achieves a reciprocal decryption throughput of just 60493 cycles (plus cipher cost etc.) on a single Ivy Bridge

  4. nRC: non-coding RNA Classifier based on structural features.

    Science.gov (United States)

    Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso

    2017-01-01

    Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation in many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to developing methods able to discriminate between the different ncRNA classes. Moreover, incomplete knowledge of the mechanisms underlying regulative processes, together with the development of high-throughput technologies, has made bioinformatics tools necessary to provide biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure, together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach on the classification of 13 different ncRNA classes and obtained classification scores, using the most common statistical measures. In particular, we reach accuracy and sensitivity scores of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a Docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.

  5. Researching on knowledge architecture of design by analysis based on ASME code

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2003-01-01

    The quality of a knowledge-based system's knowledge architecture is one of the decisive factors in the system's validity and rationality. For designing the ASME code knowledge-based system, this paper presents a knowledge acquisition method that extracts knowledge through document analysis informed by domain experts' knowledge. The paper then describes a knowledge architecture for design by analysis based on the related rules in the ASME code. The knowledge in this architecture is divided into two categories: empirical knowledge and ASME code knowledge. As the basis of the knowledge architecture, a general procedural process for design by analysis, which meets engineering design requirements and designers' conventional practice, is generalized and explained in detail in the paper. For the sake of improving inference efficiency and concurrent computation in the KBS, a knowledge Petri net (KPN) model is proposed and adopted to express the knowledge architecture. Furthermore, for validating and verifying the empirical rules, five knowledge validation and verification theorems are given in the paper. The results of this research are also applicable to designing the knowledge architecture of other ASME codes or other engineering standards. (author)

  6. Novel power saving architecture for FBG based OCDMA code generation

    Science.gov (United States)

    Osadola, Tolulope B.; Idris, Siti K.; Glesk, Ivan

    2013-10-01

    A novel architecture for generating incoherent, two-dimensional wavelength-hopping time-spreading optical CDMA codes is presented. The architecture is designed to facilitate the reuse of the optical source signal left unused after an OCDMA code has been generated using fiber Bragg grating based encoders. Effective utilization of the available optical power is achieved by cascading several OCDMA encoders, enabling 3 dB savings in optical power.

  7. CAMAC data acquisition system based on micro VAXII

    International Nuclear Information System (INIS)

    Yin Xijin; Shen Cuihua; Bai Xiaowei; Li Weisheng

    1993-01-01

    The CAMAC data acquisition system based on a MicroVAX II computer provides high-speed, zero-suppressed, 256-parameter CAMAC acquisition. It consists of three parts: a control logic unit, a CAMAC readout system and a host computer system. When the control logic unit is triggered by an external electronic selection signal, it produces a pilot signal to keep all of the parameters of a particular event together. Event-mode data are collected using a CAMAC fast crate controller. The host computer system is equipped with a number of peripheral devices, including the following: 1. at least two M990 GCR, 6250 B/inch, magnetic tape drives operating at 75 inches per second or faster; 2. a Tektronix 4014 storage scope; 3. a laser printer, LND3-AE, or copier capable of making hard copies of the Tektronix 4014 screen; 4. a control console device and a line printer; 5. an x-press color graphics terminal; 6. DEC network. During real-time acquisition, the system is able to handle and analyse the data stream on-line, to monitor and control the experiment, and to display spectra dynamically on the Tektronix 4014.

  8. Iterative channel decoding of FEC-based multiple-description codes.

    Science.gov (United States)

    Chang, Seok-Ho; Cosman, Pamela C; Milstein, Laurence B

    2012-03-01

    Multiple description coding has been receiving attention as a robust transmission framework for multimedia services. This paper studies the iterative decoding of FEC-based multiple description codes. The proposed decoding algorithms take advantage of the error detection capability of Reed-Solomon (RS) erasure codes. The information of correctly decoded RS codewords is exploited to enhance the error correction capability of the Viterbi algorithm at the next iteration of decoding. In the proposed algorithm, an intradescription interleaver is synergistically combined with the iterative decoder. The interleaver does not affect the performance of noniterative decoding but greatly enhances the performance when the system is iteratively decoded. We also address the optimal allocation of RS parity symbols for unequal error protection. For the optimal allocation in iterative decoding, we derive mathematical equations from which the probability distributions of description erasures can be generated in a simple way. The performance of the algorithm is evaluated over an orthogonal frequency-division multiplexing system. The results show that the performance of the multiple description codes is significantly enhanced.

  9. Probabilistic Decision Based Block Partitioning for Future Video Coding

    KAUST Repository

    Wang, Zhao

    2017-11-29

    In the latest Joint Video Exploration Team development, the quadtree plus binary tree (QTBT) block partitioning structure has been proposed for future video coding. Compared to the traditional quadtree structure of the High Efficiency Video Coding (HEVC) standard, QTBT provides more flexible patterns for splitting blocks, which results in dramatically more combinations of block partitions and high computational complexity. In view of this, a confidence interval based early termination (CIET) scheme is proposed for QTBT to identify partition modes that are unnecessary in the sense of rate-distortion (RD) optimization. In particular, an RD model is established to predict the RD cost of each partition pattern without the full encoding process. The mode decision problem is then cast into a probabilistic framework to select the final partition based on a confidence interval decision strategy. Experimental results show that the proposed CIET algorithm speeds up QTBT block partitioning, reducing encoding time by 54.7% with only a 1.12% increase in bit rate. Moreover, the proposed scheme performs consistently well for high-resolution sequences, for which video coding efficiency is crucial in real applications.

  10. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  11. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  12. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved... solutions. Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification applications may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, detection...

  13. ESP-TIMOC code manual

    International Nuclear Information System (INIS)

    Jaarsma, R.; Perlado, J.M.; Rief, H.

    1978-01-01

    ESP-TIMOC is an 'Event Scanning Program' to analyse the events (collision or boundary crossing parameters) of Monte Carlo particle transport problems. It is a modular program and belongs to the TIMOC code system. ESP-TIMOC is primarily designed to calculate time-dependent response functions such as energy-dependent fluxes and currents at interfaces. An eventual extension to other quantities is simple and straightforward

  14. A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems

    Science.gov (United States)

    Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge

    Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.

  15. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross-correlation (CC) and practical code lengths that support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are increasingly attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase-induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed by simple algebraic means. Four sets of DEU code families based on the code weight W and number of users N are constructed for the combinations (even, even), (even, odd), (odd, odd) and (odd, even). This combination gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms previously reported codes. In addition, simulation results from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and can support long spans with high data rates.

  16. Demonstration of Emulator-Based Bayesian Calibration of Safety Analysis Codes: Theory and Formulation

    Directory of Open Access Journals (Sweden)

    Joseph P. Yurko

    2015-01-01

    System codes for simulation of the safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from the development of code surrogate model (code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here, with Markov Chain Monte Carlo (MCMC) sampling, feasible. This work uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both the accuracy and the speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solutions is performed to illustrate key properties of the FFGP-based process.
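
    A minimal sketch of the emulator-based calibration loop is given below, with an ordinary GP regressor standing in for the FFGP model and a toy function standing in for the safety code; every name and number is illustrative.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      def expensive_code(theta):              # stand-in for the safety code
          return np.sin(3.0 * theta) + 0.1 * theta

      # 1. Run the code at a few design points and fit the emulator.
      X = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
      gp = GaussianProcessRegressor().fit(X, expensive_code(X.ravel()))

      # 2. Metropolis MCMC against the fast emulator instead of the code.
      y_obs, sigma = expensive_code(1.3), 0.05        # one "measured" datum

      def log_post(theta):
          if not 0.0 <= theta <= 2.0:                 # uniform prior on [0, 2]
              return -np.inf
          mu = gp.predict(np.array([[theta]]))[0]     # emulator prediction
          return -0.5 * ((y_obs - mu) / sigma) ** 2

      rng = np.random.default_rng(0)
      theta, chain = 1.0, []
      for _ in range(5000):
          prop = theta + 0.1 * rng.standard_normal()
          if np.log(rng.random()) < log_post(prop) - log_post(theta):
              theta = prop
          chain.append(theta)
      print("posterior mean:", round(float(np.mean(chain[1000:])), 2))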

  17. GRADSPMHD: A parallel MHD code based on the SPH formalism

    Science.gov (United States)

    Vanaverbeke, S.; Keppens, R.; Poedts, S.

    2014-03-01

    We present GRADSPMHD, a completely Lagrangian parallel magnetohydrodynamics code based on the SPH formalism. The implementation of the equations of SPMHD in the “GRAD-h” formalism assembles known results, including the derivation of the discretized MHD equations from a variational principle, the inclusion of time-dependent artificial viscosity, resistivity and conductivity terms, as well as the inclusion of a mixed hyperbolic/parabolic correction scheme for satisfying the ∇·B = 0 constraint on the magnetic field. The code uses a tree-based formalism for neighbor finding and can optionally use the tree code for computing the self-gravity of the plasma. The structure of the code closely follows the framework of our parallel GRADSPH FORTRAN 90 code, which we previously added to the CPC program library. We demonstrate the capabilities of GRADSPMHD by running 1-, 2-, and 3-dimensional standard benchmark tests, and we find good agreement with previous work done by other researchers. The code is also applied to the problem of simulating the magnetorotational instability in 2.5D shearing box tests as well as in global simulations of magnetized accretion disks. We find good agreement with available results on this subject in the literature. Finally, we discuss the performance of the code on a parallel supercomputer with distributed memory architecture. Catalogue identifier: AERP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 620503 No. of bytes in distributed program, including test data, etc.: 19837671 Distribution format: tar.gz Programming language: FORTRAN 90/MPI. Computer: HPC cluster. Operating system: Unix. Has the code been vectorized or parallelized?: Yes, parallelized using MPI. RAM: ~30 MB for a

  18. Energy-Efficient Cluster Based Routing Protocol in Mobile Ad Hoc Networks Using Network Coding

    Directory of Open Access Journals (Sweden)

    Srinivas Kanakala

    2014-01-01

    Full Text Available In mobile ad hoc networks, all nodes are energy constrained. In such situations, it is important to reduce energy consumption. In this paper, we consider the issues of energy-efficient communication in MANETs using network coding. Network coding is an effective method to improve the performance of wireless networks. The COPE protocol implements the network coding concept to reduce the number of transmissions by mixing packets at intermediate nodes. We incorporate COPE into a cluster-based routing protocol to further reduce energy consumption. The proposed energy-efficient coding-aware cluster-based routing protocol (ECCRP) scheme applies network coding at cluster heads to reduce the number of transmissions. We also modify the queue management procedure of the COPE protocol to further improve coding opportunities, and use an energy-efficient scheme when selecting the cluster head, which helps to increase the lifetime of the network. We evaluate the performance of the proposed energy-efficient cluster-based protocol using simulation. Simulation results show that the proposed ECCRP algorithm reduces energy consumption and increases the lifetime of the network.
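
    The transmission saving exploited by COPE is easy to see in a toy form: a relay that owes one packet to each of two neighbors can XOR them into a single broadcast, and each neighbor recovers its packet using the one it already overheard. A minimal sketch (illustrative only, not the ECCRP implementation):

        def xor_bytes(a: bytes, b: bytes) -> bytes:
            return bytes(x ^ y for x, y in zip(a, b))

        # Packets the relay owes to neighbors A and B.
        p_for_A = b"hello-A!"
        p_for_B = b"hello-B!"

        # A already overheard p_for_B, and B overheard p_for_A (typical COPE situation).
        coded = xor_bytes(p_for_A, p_for_B)          # one broadcast instead of two unicasts

        recovered_at_A = xor_bytes(coded, p_for_B)   # A cancels what it already knows
        recovered_at_B = xor_bytes(coded, p_for_A)

        assert recovered_at_A == p_for_A and recovered_at_B == p_for_B
        print("two packets delivered with one coded transmission")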

  19. GPU's for event reconstruction in the FairRoot framework

    International Nuclear Information System (INIS)

    Al-Turany, M; Uhlig, F; Karabowicz, R

    2010-01-01

    FairRoot is the simulation and analysis framework used by the CBM and PANDA experiments at FAIR/GSI. The use of graphics processing units (GPUs) for event reconstruction in FairRoot will be presented. The fact that CUDA (Nvidia's Compute Unified Device Architecture) development tools work alongside the conventional C/C++ compiler makes it possible to mix GPU code with general-purpose code for the host CPU; based on this, some of the reconstruction tasks can be sent to the graphics cards. Moreover, tasks that run on the GPUs can also run in emulation mode on the host CPU, which has the advantage that the same code is used on both CPU and GPU.

  20. Supporting Situated Learning Based on QR Codes with Etiquetar App: A Pilot Study

    Science.gov (United States)

    Camacho, Miguel Olmedo; Pérez-Sanagustín, Mar; Alario-Hoyos, Carlos; Soldani, Xavier; Kloos, Carlos Delgado; Sayago, Sergio

    2014-01-01

    EtiquetAR is an authoring tool for supporting the design and enactment of situated learning experiences based on QR tags. Practitioners use etiquetAR for creating, managing and personalizing collections of QR codes with special properties: (1) codes can have more than one link pointing at different multimedia resources, (2) codes can be updated…

  1. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  2. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards, which were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., the relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely the background reference prediction (BRP), which uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP), which predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency by using the higher-quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, at only a slight additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
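
    A toy version of the two predictions described in this abstract, assuming a simple running-average background model (the actual BMAP modeling is more sophisticated, and motion search is omitted here):

        import numpy as np

        rng = np.random.default_rng(1)
        H = W = 64
        B = 8                                    # block size

        def grab_frame(t):
            # Hypothetical surveillance frame: static gradient background plus a moving square.
            f = np.tile(np.linspace(0, 200, W), (H, 1))
            c = (4 * t) % (W - 10)
            f[20:30, c:c + 10] = 255.0
            return f + rng.normal(0, 1, (H, W))

        # Background modeled from the original input frames (running average).
        bg = grab_frame(0)
        for t in range(1, 30):
            bg = 0.95 * bg + 0.05 * grab_frame(t)

        prev, cur = grab_frame(29), grab_frame(30)
        sse = 0.0
        for by in range(0, H, B):
            for bx in range(0, W, B):
                blk, ref = cur[by:by+B, bx:bx+B], bg[by:by+B, bx:bx+B]
                if np.abs(blk - ref).mean() < 4.0:
                    # Background block: background reference prediction (BRP),
                    # the modeled background itself is the long-term reference.
                    residual = blk - ref
                else:
                    # Hybrid/foreground block: background difference prediction (BDP),
                    # predict in the background-subtracted domain (no motion search here).
                    residual = (blk - ref) - (prev[by:by+B, bx:bx+B] - ref)
                # The residual would now go on to transform, quantization and entropy coding.
                sse += (residual**2).sum()
        print("total residual energy:", sse)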

  3. QR code based noise-free optical encryption and decryption of a gray scale image

    Science.gov (United States)

    Jiao, Shuming; Zou, Wenbin; Li, Xia

    2017-03-01

    In optical encryption systems, speckle noise is a major challenge in obtaining high-quality decrypted images. This problem can be addressed by employing a QR code based noise-free scheme. Previous works have optically encrypted a few characters or a short expression employing QR codes. This paper proposes a practical scheme for optically encrypting and decrypting a gray-scale image based on QR codes for the first time. The proposed scheme is compatible with common QR code generators and readers. Numerical simulation results reveal that the proposed method can encrypt and decrypt an input image correctly.

  4. An Examination of the Performance Based Building Code on the Design of a Commercial Building

    Directory of Open Access Journals (Sweden)

    John Greenwood

    2012-11-01

    Full Text Available The Building Code of Australia (BCA) is the principal code under which building approvals in Australia are assessed. The BCA adopted performance-based solutions for building approvals in 1996. Performance-based codes are based upon a set of explicit objectives, stated in terms of a hierarchy of requirements beginning with key general objectives. With this in mind, the research presented in this paper aims to analyse the impact of the introduction of the performance-based code within Western Australia, to gauge the effect and usefulness of alternative design solutions in commercial construction, using a case study project. The research revealed that there are several advantages to the use of alternative designs and that all parties, in general, are in favour of the performance-based building code of Australia. It is suggested that a change in the assessment process to streamline the alternative design path is needed for greater use of the performance-based alternative. With appropriate quality control measures, minor variations to the deemed-to-satisfy provisions could easily be managed by the current and future building surveying profession.

  5. Modelling of blackout sequence at Atucha-1 using the MARCH3 code

    International Nuclear Information System (INIS)

    Baron, J.; Bastianelli, B.

    1997-01-01

    This paper presents the modelling of a complete blackout at the Atucha-1 NPP as a preliminary phase for a Level II probabilistic safety analysis. The MARCH3 code of the STCP (Source Term Code Package) is used, based on a plant model made in accordance with the particularities of the plant design. The analysis covers all the severe accident phases. The results allow viewing the time sequence of the events and provide the basis for source term studies. (author). 6 refs., 2 figs

  6. On Rational Interpolation-Based List-Decoding and List-Decoding Binary Goppa Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Høholdt, Tom; Nielsen, Johan Sebastian Rosenkilde

    2013-01-01

    We derive the Wu list-decoding algorithm for generalized Reed–Solomon (GRS) codes by using Gröbner bases over modules and the Euclidean algorithm as the initial algorithm instead of the Berlekamp–Massey algorithm. We present a novel method for constructing the interpolation polynomial fast. We gi...... and a duality in the choice of parameters needed for decoding, both in the case of GRS codes and in the case of Goppa codes....
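
    The role of the Euclidean algorithm as the initial algorithm can be illustrated with the classic key-equation step it plays in Sugiyama-style GRS decoding: run the extended Euclidean algorithm on x^(2t) and the syndrome polynomial, stopping when the remainder degree drops below t. The sketch below works over a prime field GF(P) for simplicity; the codes discussed in the paper live over extension fields, and the Wu list decoder itself involves further machinery (Gröbner bases over modules) not shown here.

        P = 929  # a prime; all arithmetic below is in GF(P)

        def poly_trim(a):
            while a and a[-1] == 0:
                a.pop()
            return a

        def poly_divmod(a, b):
            # Polynomial division in GF(P); coefficients in ascending order.
            a, q = a[:], [0] * max(1, len(a) - len(b) + 1)
            inv = pow(b[-1], P - 2, P)
            for i in range(len(a) - len(b), -1, -1):
                c = a[i + len(b) - 1] * inv % P
                q[i] = c
                for j, bj in enumerate(b):
                    a[i + j] = (a[i + j] - c * bj) % P
            return poly_trim(q), poly_trim(a)

        def poly_mul(a, b):
            r = [0] * (len(a) + len(b) - 1)
            for i, ai in enumerate(a):
                for j, bj in enumerate(b):
                    r[i + j] = (r[i + j] + ai * bj) % P
            return poly_trim(r)

        def poly_sub(a, b):
            r = [((a[i] if i < len(a) else 0) - (b[i] if i < len(b) else 0)) % P
                 for i in range(max(len(a), len(b)))]
            return poly_trim(r)

        def key_equation(syndromes, t):
            # Extended Euclid on x^(2t) and S(x), stopped when deg(remainder) < t.
            r0, r1 = [0] * (2 * t) + [1], poly_trim(syndromes[:])
            u0, u1 = [0], [1]
            while r1 and len(r1) - 1 >= t:
                q, r = poly_divmod(r0, r1)
                r0, r1 = r1, r
                u0, u1 = u1, poly_sub(u0, poly_mul(q, u1))
            return u1, r1   # error locator and error evaluator (up to scaling)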

  7. RADIATION PROTECTION FOR HUMAN SPACEFLIGHT

    OpenAIRE

    Hellweg, C.E.; Baumstark-Khan, C.; Berger, T.

    2017-01-01

    Space is a special workplace not only because of microgravity and the dependency on life support systems, but also owing to a constant considerable exposure to a natural radiation source, the cosmic radiation. Galactic cosmic rays (GCR) and solar cosmic radiation (SCR) are the primary sources of the radiation field in space. Whereas the GCR component comprises all particles from protons to heavy ions with energies up to 10¹¹ GeV, the SCR component ejected in Solar Energetic Particle events (S...

  8. MIDAS/PK code development using point kinetics model

    International Nuclear Information System (INIS)

    Song, Y. M.; Park, S. H.

    1999-01-01

    In this study, the MIDAS/PK code has been developed for analyzing ATWS (Anticipated Transients Without Scram) events, which can be among the initiating events of severe accidents. MIDAS is an integrated computer code based on the MELCOR code, developed by the Korea Atomic Energy Research Institute to support severe accident risk reduction strategies. Meanwhile, the Chexal-Layman correlation in the current MELCOR, which was developed under BWR conditions, appears to be inappropriate for a PWR. To provide ATWS analysis capability to the MIDAS code, a point kinetics module, PKINETIC, was first developed as a stand-alone code whose reference model was selected from the current accident analysis codes. In the next step, the MIDAS/PK code was developed by coupling PKINETIC with the MIDAS code, inter-connecting several thermal-hydraulic parameters between the two codes. Since the major concern in the ATWS analysis is the primary peak pressure during the first few minutes of the accident, the peak pressures from the PKINETIC module and from MIDAS/PK are compared with RETRAN calculations, showing good agreement between them. The MIDAS/PK code is considered valuable for deterministically analyzing the plant response during ATWS, especially for the early domestic Westinghouse plants which rely on operator procedures instead of an AMSAC (ATWS Mitigating System Actuation Circuitry) against ATWS. This capability of ATWS analysis is also important from the viewpoint of accident management and mitigation.
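
    Point kinetics reduces the core to a small set of ODEs for the neutron population and the delayed-neutron precursor concentrations, which a module like PKINETIC integrates together with reactivity feedback from the thermal hydraulics. A minimal one-delayed-group sketch with illustrative PWR-like constants (not the PKINETIC model itself):

        beta, Lam, lam = 0.0065, 2e-5, 0.08   # delayed fraction, generation time (s), decay const (1/s)

        def step(n, C, rho, dt):
            # dn/dt = ((rho - beta)/Lambda) n + lambda C
            # dC/dt = (beta/Lambda) n - lambda C
            dn = ((rho - beta) / Lam) * n + lam * C
            dC = (beta / Lam) * n - lam * C
            return n + dt * dn, C + dt * dC

        n, C = 1.0, beta / (Lam * lam)        # steady-state precursor level for n = 1
        dt, t = 1e-4, 0.0
        while t < 5.0:
            rho = 0.001 if t > 1.0 else 0.0   # +100 pcm step reactivity insertion at t = 1 s
            n, C = step(n, C, rho, dt)
            t += dt
        print("relative power at t = 5 s:", n)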

  9. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  10. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO facility, located at JRC Ispra, is used to simulate the consequences of severe accidents in nuclear power plants under a variety of conditions. The COMETA code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  11. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events.

    Science.gov (United States)

    Stekelenburg, Jeroen J; Vroomen, Jean

    2012-01-01

    In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have reported that there are distinct neural correlates of temporal (when) versus phonetic/semantic (which) content on audiovisual integration. Here we examined the effect of visual prediction of auditory location (where) in audiovisual biological motion stimuli by varying the spatial congruency between the auditory and visual parts. Visual stimuli were presented centrally, whereas auditory stimuli were presented either centrally or at 90° azimuth. Typical sub-additive amplitude reductions (AV - V audiovisual interaction was also found at 40-60 ms (P50) in the spatially congruent condition, while no effect of congruency was found on the suppression of the P2. This indicates that visual prediction of auditory location can be coded very early in auditory processing.

  12. Spatial gradients of GCR protons in the inner heliosphere derived from Ulysses COSPIN/KET and PAMELA measurements

    Science.gov (United States)

    Gieseler, J.; Heber, B.

    2016-05-01

    Context. During the transition from solar cycle 23 to 24, from 2006 to 2009, the Sun was in an unusual solar minimum with very low activity over a long period. These exceptional conditions included a very low interplanetary magnetic field (IMF) strength and a high tilt angle, both of which play an important role in the modulation of galactic cosmic rays (GCR) in the heliosphere. Thus, the radial and latitudinal gradients of GCRs are expected to depend not only on the solar magnetic epoch, but also on the overall modulation level. Aims: We determine the non-local radial and the latitudinal gradients of protons in the rigidity range from ~0.45 to 2 GV. Methods: This was accomplished by using data from the satellite-borne experiment Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) at Earth and the Kiel Electron Telescope (KET) onboard Ulysses on its highly inclined Keplerian orbit around the Sun with the aphelion at Jupiter's orbit. Results: In comparison to the previous A > 0 solar magnetic epoch, we find that the absolute value of the latitudinal gradient is lower at higher rigidities and higher at lower rigidities. This energy dependence is therefore a crucial test for models that describe the cosmic ray transport in the inner heliosphere.

  13. Fault tree analysis. Implementation of the WAM-codes

    International Nuclear Information System (INIS)

    Bento, J.P.; Poern, K.

    1979-07-01

    The report describes ongoing work at Studsvik on the implementation of the WAM code package for fault tree analysis. These codes, originally developed under EPRI contract by Sciences Applications Inc, allow, in contrast with other fault tree codes, all Boolean operations, thus permitting the modelling of ''NOT'' conditions and dependent components. To make the implementation of these codes concrete, the auxiliary feed-water system of the Swedish BWR Oskarshamn 2 was chosen for the reliability analysis. For this system, both the mean unavailability and the probability density function of the top event (the undesired event) of the system fault tree were calculated, the latter using a Monte Carlo simulation technique. The present study is the first part of a work performed under contract with the Swedish Nuclear Power Inspectorate. (author)
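
    Allowing NOT gates takes a fault tree out of the purely coherent class, but Monte Carlo estimation of the top-event probability still works by sampling the basic events directly. A small sketch with a hypothetical three-component tree (all probabilities illustrative):

        import random

        random.seed(42)
        p = {"pump_fails": 0.02, "valve_fails": 0.05, "backup_on": 0.90}

        def top_event(s):
            # TOP = pump_fails AND (valve_fails OR NOT backup_on)
            return s["pump_fails"] and (s["valve_fails"] or not s["backup_on"])

        N, hits = 200_000, 0
        for _ in range(N):
            sample = {e: random.random() < q for e, q in p.items()}
            hits += top_event(sample)

        print("estimated top-event probability:", hits / N)
        # Exact value for comparison: 0.02 * (1 - 0.95 * 0.90) = 0.0029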

  14. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  15. A comparative simulation of feed and bleed operation during the total loss of feedwater event by RELAP5/MOD3 and CEFLASH-4AS/REM computer codes

    International Nuclear Information System (INIS)

    Kwon, Y.M.; Ro, T.S.; Song, J.H.

    1995-01-01

    The Ulchin 3 and 4 nuclear power plants, which are two-loop 2,825 MW(thermal) pressurized water reactors designed by the Korea Atomic Energy Research Institute, adopted a safety depressurization system (SDS) to mitigate the beyond-design-basis event of a total loss of feedwater (TLOFW). A comparative simulation by the CEFLASH-4AS/REM and RELAP5/MOD3 computer codes for the TLOFW event without operator recovery and the TLOFW event with feed and bleed (F and B) operation is performed for Ulchin 3 and 4. In the analyses, the SDS bleed paths are modeled by orifices located on the top of the pressurizer, where the analytical area of the bleed path is based on the Ulchin 3 and 4 SDS design flow capacity. An additional case, where the SDS piping and valves are modeled explicitly, is considered for the RELAP5 analysis. The predictions by the CEFLASH-4AS/REM of the transient two-phase system behavior show good qualitative and quantitative agreement with those by the RELAP5 simulation. The RELAP5 case with explicit piping results in less repressurization and lower reactor coolant system pressure than that of the case without explicit SDS modeling. However, the two cases of RELAP5 analyses result in essentially the same transient scenarios for TLOFW with F and B operation. The results of the simulation demonstrate the validity of the Ulchin 3 and 4 design approach, which employs CEFLASH-4AS/REM computer code and SDS bleed paths modeled by orifices located on the top of the pressurizer. The results also indicate that the decay heat removal and core inventory makeup function can be successfully accomplished by F and B operation by using the SDS for Ulchin 3 and 4

  16. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: (1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; (2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, and therefore the proposed scheme has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive compared with existing methods, with the unique strength of recovering fine details and sharp edges at low bit-rates.
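
    The encoder front end described above is simple to mimic: replace the anti-alias low-pass filter with a small random binary convolution kernel and then polyphase-downsample. A rough sketch follows; the kernel size and sampling factor are illustrative, and the sparsity-based soft decoder is omitted:

        import numpy as np

        rng = np.random.default_rng(7)

        def encode(img, factor=2, ksize=3):
            # Local random binary convolution kernel, normalized to preserve brightness.
            k = rng.integers(0, 2, (ksize, ksize)).astype(float)
            k /= max(k.sum(), 1.0)
            H, W = img.shape
            pad = ksize // 2
            padded = np.pad(img, pad, mode="edge")
            filtered = np.zeros_like(img, dtype=float)
            for dy in range(ksize):
                for dx in range(ksize):
                    filtered += k[dy, dx] * padded[dy:dy+H, dx:dx+W]
            # Polyphase down-sampling: the result is still an ordinary (smaller) image,
            # so any standard codec can compress it further.
            return filtered[::factor, ::factor]

        img = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in for an input image
        measurements = encode(img)
        print(measurements.shape)                             # (32, 32) local random measurements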

  17. Analysis code for medium and small rupture accidents in ATR. LOTRAC/HEATUP

    International Nuclear Information System (INIS)

    1997-08-01

    In the safety evaluation of the ATR, transient changes in thermo-hydraulics and fuel temperature during events classified as medium and small ruptures with loss of reactor coolant are evaluated with two codes: LOTRAC, the synthetic thermo-hydraulic transient analysis code for medium and small ruptures, and HEATUP, the detailed fuel temperature analysis code. LOTRAC is used to analyse the thermo-hydraulic behavior of the reactor cooling facility and the temperature behavior of the fuel during blowdown, as well as the change in reactor thermal output, taking into account the operating characteristics of the emergency core cooling system. Based on the thermo-hydraulic behavior obtained with LOTRAC, the time at which the fuel cladding temperature turns around (obtained from the ECCS injection characteristics), the heat transfer rate after turnaround, and related data, HEATUP analyses the detailed temperature evolution of the fuel elements and determines the peak temperature and the amount of oxidation of the fuel cladding tubes. The LOTRAC code, the HEATUP code, the various analysis models, and a rupture simulation experiment are reported. (K.I.)

  18. Protograph based LDPC codes with minimum distance linearly growing with block size

    Science.gov (United States)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Sam; Thorpe, Jeremy

    2005-01-01

    We propose several LDPC code constructions that simultaneously achieve good threshold and error floor performance. Minimum distance is shown to grow linearly with block size (similar to regular codes of variable degree at least 3) by considering ensemble average weight enumerators. Our constructions are based on projected graph, or protograph, structures that support high-speed decoder implementations. As with irregular ensembles, our constructions are sensitive to the proportion of degree-2 variable nodes. A code with too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code with too many such nodes tends not to exhibit a minimum distance that grows linearly in block length. In this paper we also show that precoding can be used to lower the threshold of regular LDPC codes. The decoding thresholds of the proposed codes, which have minimum distance increasing linearly in block size, outperform those of regular LDPC codes. Furthermore, a family of low to high rate codes, with thresholds that adhere closely to their respective channel capacity thresholds, is presented. Simulation results for a few example codes show that the proposed codes have low error floors as well as good threshold SNR performance.
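
    A protograph code is obtained by lifting a small base graph: each base edge is replaced by a bundle of Z edges wired through a permutation, commonly a circulant shift, which is what supports high-speed decoder implementations. The sketch below builds a lifted parity-check matrix from a hypothetical base matrix; the shifts are chosen at random here, whereas real designs choose them to avoid short cycles:

        import numpy as np

        rng = np.random.default_rng(3)

        def lift(base, Z):
            # Expand a protograph base matrix into a binary parity-check matrix:
            # each base edge becomes a Z x Z circulant permutation (random, distinct
            # shifts per edge bundle so that parallel edges do not cancel).
            m, n = base.shape
            H = np.zeros((m * Z, n * Z), dtype=np.uint8)
            for i in range(m):
                for j in range(n):
                    for s in rng.choice(Z, size=base[i, j], replace=False):
                        H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] ^= np.roll(
                            np.eye(Z, dtype=np.uint8), int(s), axis=1)
            return H

        # Hypothetical 2x4 protograph; columns 2 and 3 are degree-2 variable nodes,
        # whose proportion governs the threshold/minimum-distance trade-off above.
        base = np.array([[2, 1, 1, 1],
                         [1, 2, 1, 1]])
        H = lift(base, Z=16)
        print("H:", H.shape, "column weights:", H.sum(axis=0)[:8])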

  19. Simultaneity and Temporal Order Judgments Are Coded Differently and Change With Age: An Event-Related Potential Study

    Directory of Open Access Journals (Sweden)

    Aysha Basharat

    2018-04-01

    Full Text Available Multisensory integration is required for a number of daily living tasks where the inability to accurately identify simultaneity and temporality of multisensory events results in errors in judgment leading to poor decision-making and dangerous behavior. Previously, our lab discovered that older adults exhibited impaired timing of audiovisual events, particularly when making temporal order judgments (TOJs. Simultaneity judgments (SJs, however, were preserved across the lifespan. Here, we investigate the difference between the TOJ and SJ tasks in younger and older adults to assess neural processing differences between these two tasks and across the lifespan. Event-related potentials (ERPs were studied to determine between-task and between-age differences. Results revealed task specific differences in perceiving simultaneity and temporal order, suggesting that each task may be subserved via different neural mechanisms. Here, auditory N1 and visual P1 ERP amplitudes confirmed that unisensory processing of audiovisual stimuli did not differ between the two tasks within both younger and older groups, indicating that performance differences between tasks arise either from multisensory integration or higher-level decision-making. Compared to younger adults, older adults showed a sustained higher auditory N1 ERP amplitude response across SOAs, suggestive of broader response properties from an extended temporal binding window. Our work provides compelling evidence that different neural mechanisms subserve the SJ and TOJ tasks and that simultaneity and temporal order perception are coded differently and change with age.

  20. Status of the ASTEC integral code

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Jacq, F.; Allelein, H.J.

    2000-01-01

    The ASTEC (Accident Source Term Evaluation Code) integral code has been developed since 1997 in close collaboration between IPSN and GRS to predict an entire LWR severe accident sequence, from the initiating event up to fission product (FP) release out of the containment. The applications of such a code are source term determination studies, scenario evaluations, accident management studies and Probabilistic Safety Assessment level 2 (PSA-2) studies. Version V0 of ASTEC is based on the RCS modules of the ESCADRE integrated code (IPSN) and on the upgraded RALOC and FIPLOC codes (GRS) for containment thermal-hydraulics and aerosol behaviour. The latest version, V0.2, includes the general feedback from the overall validation performed in 1998 (25 separate-effect experiments, PHEBUS.FP FPT1 integrated experiment), some modelling improvements (e.g. silver-iodine reactions in the containment sump), and the implementation of the main safety systems for severe accident management. Several reactor applications are under way for French and German PWRs and for the VVER-1000, all with a multi-compartment configuration of the containment. The total IPSN-GRS manpower involved in the ASTEC project is today about 20 person-years per year. The main evolution of the next version, V1, foreseen at the end of 2001, concerns the integration of the front-end phase and the improvement of the in-vessel late-phase degradation modelling. (author)

  1. A code for simulation of human failure events in nuclear power plants: SIMPROC

    International Nuclear Information System (INIS)

    Gil, Jesus; Fernandez, Ivan; Murcia, Santiago; Gomez, Javier; Marrao, Hugo; Queral, Cesar; Exposito, Antonio; Rodriguez, Gabriel; Ibanez, Luisa; Hortal, Javier; Izquierdo, Jose M.; Sanchez, Miguel; Melendez, Enrique

    2011-01-01

    Over the past years, many nuclear power plant organizations have performed Probabilistic Safety Assessments (PSAs) to identify and understand key plant vulnerabilities. As part of enhancing PSA quality, Human Reliability Analysis (HRA) is essential to make a realistic evaluation of safety and of a facility's potential weaknesses. Moreover, it has to be noted that HRA continues to be a large source of uncertainty in PSAs. Within their current joint collaborative activities, Indizen, Universidad Politecnica de Madrid and Consejo de Seguridad Nuclear have developed the so-called SIMulator of PROCedures (SIMPROC), a tool aimed at simulating events related to human actions and able to interact with a plant simulation model. The tool helps the analyst quantify the importance of human actions in the final plant state. Among other goals, the main goal of SIMPROC is to check the Emergency Operating Procedures being used by the operating crew in order to lead the plant to a safe shutdown state. Currently SIMPROC is coupled with the SCAIS software package, but the tool is flexible enough to be linked to other plant simulation codes. SIMPROC-SCAIS applications are shown in the present article to illustrate the tool's performance. The applications were developed in the framework of the Nuclear Energy Agency project on Safety Margin Assessment and Applications (SM2A). First, an introductory example was performed to obtain the damage domain boundary of a selected sequence from a SBLOCA. Secondly, the damage domain area of a selected sequence from a loss of Component Cooling Water with a subsequent seal LOCA was calculated. SIMPROC simulates the corresponding human actions in both cases. The results achieved show how the system can be adapted to a wide range of purposes, such as Dynamic Event Tree delineation, Emergency Operating Procedures analysis and damage domain search.

  2. High-Capacity Quantum Secure Direct Communication Based on Quantum Hyperdense Coding with Hyperentanglement

    International Nuclear Information System (INIS)

    Wang Tie-Jun; Li Tao; Du Fang-Fang; Deng Fu-Guo

    2011-01-01

    We first present a quantum hyperdense coding protocol with hyperentanglement in the polarization and spatial-mode degrees of freedom of photons, and then give the details of a quantum secure direct communication (QSDC) protocol based on this quantum hyperdense coding protocol. This QSDC protocol has the advantage of a higher capacity than quantum communication protocols based on a qubit system. Compared with the QSDC protocol based on superdense coding with d-dimensional systems, this QSDC protocol is more feasible, as the preparation of a high-dimensional quantum system is at present more difficult than that of a two-level quantum system. (general)
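
    Ordinary superdense coding, the building block that hyperdense coding doubles up across two degrees of freedom, can be sketched with plain state vectors: one of four Pauli operations on half of a Bell pair encodes two classical bits, which the receiver recovers with a Bell measurement. A minimal illustration for a single degree of freedom:

        import numpy as np

        I2 = np.eye(2)
        X = np.array([[0, 1], [1, 0]])
        Z = np.array([[1, 0], [0, -1]])

        bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>)/sqrt(2)
        paulis = {"00": I2, "01": X, "10": Z, "11": X @ Z}

        # Bell basis used for the joint measurement at the receiver.
        basis = {
            "00": np.array([1, 0, 0, 1]) / np.sqrt(2),
            "01": np.array([0, 1, 1, 0]) / np.sqrt(2),
            "10": np.array([1, 0, 0, -1]) / np.sqrt(2),
            "11": np.array([0, 1, -1, 0]) / np.sqrt(2),
        }

        for bits, U in paulis.items():
            sent = np.kron(U, I2) @ bell                # sender acts on her photon only
            decoded = max(basis, key=lambda b: abs(basis[b] @ sent))
            print(bits, "->", decoded)                  # two bits carried by one photon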

  3. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrices (LDGM codes) for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
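
    Encoding with a systematic LDGM code is just a sparse matrix-vector product: the codeword is the message followed by a few parity bits, each the XOR of a handful of message bits. A small sketch (dimensions and column weight illustrative; the iterative decoder is omitted):

        import numpy as np

        rng = np.random.default_rng(5)
        k, m, wt = 16, 8, 3        # message bits, parity bits, ones per parity column

        # Systematic generator G = [I_k | P] with a sparse random P.
        P = np.zeros((k, m), dtype=np.uint8)
        for j in range(m):
            P[rng.choice(k, size=wt, replace=False), j] = 1
        G = np.hstack([np.eye(k, dtype=np.uint8), P])

        u = rng.integers(0, 2, k, dtype=np.uint8)   # message
        codeword = (u @ G) % 2                      # low-complexity systematic encoding
        print("message:", u)
        print("parity :", codeword[k:])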

  4. Financial impact of inaccurate Adverse Event recording post Hip Fracture surgery: Addendum to 'Adverse event recording post hip fracture surgery'.

    Science.gov (United States)

    Lee, Matthew J; Doody, Kevin; Mohamed, Khalid M S; Butler, Audrey; Street, John; Lenehan, Brian

    2018-02-15

    A 2011 study (Doody et al., Ir Med J 106(10):300-302, 2013) compared inpatient adverse events recorded prospectively at the point of care with adverse events recorded by the national Hospital In-Patient Enquiry (HIPE) system. In the study, a single-centre university hospital in Ireland treating acute hip fractures in an orthopaedic unit recorded 39 patients over a 2-month period (August-September 2011), with 55 adverse events recorded prospectively, in contrast to the HIPE record of 13 (23.6%) adverse events. With the recent change in the Irish hospital funding model from a block grant to 'activity-based funding' on the basis of case load and case complexity, the hospital financial allocation is dependent on accurate case-complexity coding. A retrospective assessment of the financial implications of the two methods of adverse incident recording was carried out. A total of €39,899 in 'missed funding' over 2 months was calculated when the ward-based, prospectively collected data were compared to the national HIPE data. Accurate data collection is paramount in facilitating activity-based funding, to improve patient care and ensure the appropriate allocation of resources.

  5. Development of a potential based code for MHD analysis of LLCB TBM

    International Nuclear Information System (INIS)

    Bhuyan, P.J.; Goswami, K.S.

    2010-01-01

    A two-dimensional solver is developed for MHD flows with low magnetic Reynolds number, based on the electrostatic potential formulation for the Lorentz forces and current densities, which will be used to calculate the MHD pressure drop inside the channels of liquid-breeder based Test Blanket Modules (TBMs). The flow cross-section is assumed to be rectangular and perpendicular to the flow direction, with flow and electrostatic potential variations along the flow direction neglected. A structured, non-uniform, collocated grid is used in the calculation, where the velocity (u), pressure (p) and electrostatic potential (φ) are calculated at the cell centers, whereas the current densities are calculated at the cell faces. Special relaxation techniques are employed in calculating the electrostatic potential to ensure the divergence-free condition for the current density. The code is benchmarked on a square channel for various Hartmann numbers up to 25,000, with and without insulation coatings, by (i) comparing the pressure drop with the approximate analytical results found in the literature and (ii) comparing the pressure drop with the ones obtained in our previous calculations based on the induction formulation for the electromagnetic part. Finally, the code is used to determine the MHD pressure drop for the LLCB TBM. The code gives results similar to those obtained in our previous calculations based on the induction formulation; however, the convergence is much faster with the potential based code.
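
    In the electrostatic potential formulation, one solves a Poisson equation for φ with source ∇·(u×B), so that the resulting current density J = σ(−∇φ + u×B) is divergence-free. The following compact 2D illustration uses a uniform grid, uniform conductivity and plain Jacobi iteration; the grid, fields and boundary treatment are illustrative and much simpler than the relaxation techniques described in the paper:

        import numpy as np

        N, sigma = 64, 1.0
        h = 1.0 / (N - 1)
        y, z = np.meshgrid(np.linspace(0, 1, N), np.linspace(0, 1, N), indexing="ij")

        u = np.sin(np.pi * y) * np.sin(np.pi * z)   # axial velocity profile (illustrative)
        B = 1.0                                     # transverse magnetic field strength

        # For this orientation u x B has only a y-component u*B, so the
        # source term div(u x B) reduces to B * du/dy.
        rhs = B * np.gradient(u, h, axis=0)

        phi = np.zeros((N, N))                      # phi = 0 on the boundary (Dirichlet)
        for _ in range(5000):                       # Jacobi iterations for laplacian(phi) = rhs
            phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                                      phi[1:-1, 2:] + phi[1:-1, :-2] -
                                      h * h * rhs[1:-1, 1:-1])

        # Current density J = sigma * (-grad(phi) + u x B); should be near divergence-free.
        Jy = sigma * (-np.gradient(phi, h, axis=0) + u * B)
        Jz = sigma * (-np.gradient(phi, h, axis=1))
        divJ = np.gradient(Jy, h, axis=0) + np.gradient(Jz, h, axis=1)
        print("max |div J| in the interior:", np.abs(divJ[2:-2, 2:-2]).max())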

  6. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the analysis of external events in the PSA of the Jose Cabrera NPP. In the fire analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The flood analysis methodology is based on the construction of event trees to represent flood propagation, dependent on the condition of the communication paths between areas, and of trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in each area, the CAINZO-EA code has been developed, adapted to the specific plant characteristics. In both the fire and flood analyses a quantification methodology has been adopted, which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the internal events models, the gates, basic events or headers to which certain failure (probability 1) due to damage is assigned. (Author)

  7. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of the existing practices, identifying and removing the existing Human and Organizational Factors (HOF) and management related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical for the nuclear industry of 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, owing to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigation are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, appear to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also help to improve the quality of event investigations. (orig.)

  9. Fractal Image Coding Based on a Fitting Surface

    Directory of Open Access Journals (Sweden)

    Sheng Bi

    2014-01-01

    Full Text Available A no-search fractal image coding method based on a fitting surface is proposed. In our research, an improved gray-level transform with a fitting surface is introduced. One advantage of this method is that the fitting surface is used for both the range and domain blocks, so that one set of parameters can be saved. Another advantage is that the fitting surface can approximate the range and domain blocks better than the previously used fitting planes; this results in smaller block-matching errors and better decoded image quality. Since the no-search and quadtree techniques are adopted, smaller matching errors also imply fewer block matches, which results in a faster encoding process. Moreover, by combining all the fitting surfaces, a fitting surface image (FSI) is also proposed to speed up the fractal decoding. Experiments show that the proposed method yields superior performance over the other three methods considered. Relative to the range-averaged image, the FSI provides a faster fractal decoding process. Finally, by combining the proposed fractal coding method with JPEG, a hybrid coding method is designed which provides higher PSNR than JPEG while maintaining the same Bpp.
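
    The gray-level transform in such schemes subtracts a low-order surface fitted to each block before matching. The sketch below fits a bilinear surface z ≈ a·x + b·y + c·x·y + d to a block by least squares; the exact surface model used in the paper may differ:

        import numpy as np

        def fit_surface(block):
            # Least-squares fit of z = a*x + b*y + c*x*y + d over a square block.
            n = block.shape[0]
            y, x = np.mgrid[0:n, 0:n]
            A = np.column_stack([x.ravel(), y.ravel(), (x * y).ravel(), np.ones(n * n)])
            coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
            return coef, (A @ coef).reshape(n, n)

        rng = np.random.default_rng(9)
        block = np.outer(np.arange(8), np.ones(8)) * 10 + rng.normal(0, 2, (8, 8))
        coef, surf = fit_surface(block)
        residual = block - surf    # what remains to be matched against a domain block
        print("rms error of the fit:", np.sqrt((residual**2).mean()))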

  10. Extreme Energy Events Monitoring report

    CERN Document Server

    Baimukhamedova, Nigina

    2015-01-01

    The following paper reflects the progress I made during the Summer Student Programme within the Extreme Energy Events Monitor project I was working on. During the 8-week period I managed to build a simple detector system that is capable of triggering on events similar to explosions (sudden changes in sound levels) and of measuring the approximate location of the event. Source codes are available upon request, and the settings are described further on.

  12. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  13. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  14. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  15. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with a regulatory auditing code have been accomplished for the establishment of a self-reliable, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant application. An education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  16. Gröbner Bases, Coding, and Cryptography

    CERN Document Server

    Sala, Massimiliano; Perret, Ludovic

    2009-01-01

    Coding theory and cryptography allow secure and reliable data transmission, which is at the heart of modern communication. This book offers a comprehensive overview on the application of commutative algebra to coding theory and cryptography. It analyzes important properties of algebraic/geometric coding systems individually.

  17. Code accuracy evaluation of ISP 35 calculations based on NUPEC M-7-1 test

    International Nuclear Information System (INIS)

    Auria, F.D.; Oriolo, F.; Leonardi, M.; Paci, S.

    1995-01-01

    Quantitative evaluation of code uncertainties is a necessary step in the code assessment process, above all if best-estimate codes are utilised for licensing purposes. Aiming at quantifying code accuracy, an integral methodology based on the Fast Fourier Transform (FFT) has been developed at the University of Pisa (DCMN) and has already been applied to several calculations related to primary system test analyses. This paper deals with the first application of the FFT-based methodology to containment code calculations, based on a hydrogen mixing and distribution test performed in the NUPEC (Nuclear Power Engineering Corporation) facility. It refers to pre-test and post-test calculations submitted for International Standard Problem (ISP) no. 35. This is a blind exercise, simulating the effects of steam injection and spray behaviour on gas distribution and mixing. The results of the application of this methodology to nineteen selected variables calculated by ten participants are summarized here, and a comparison (where possible) of the accuracy evaluated for the pre-test and post-test calculations of the same user is also presented. (author)
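
    The core of the FFT-based methodology is a dimensionless accuracy figure: the spectrum of the code-experiment discrepancy is compared with the spectrum of the experimental signal itself. The sketch below implements the average amplitude (AA) figure of merit as it is usually defined for this methodology, applied to synthetic signals:

        import numpy as np

        def average_amplitude(exp, calc):
            # AA = sum|FFT(calc - exp)| / sum|FFT(exp)|; smaller means more accurate.
            err = np.abs(np.fft.rfft(calc - exp))
            ref = np.abs(np.fft.rfft(exp))
            return err.sum() / ref.sum()

        t = np.linspace(0, 100, 512)
        exp = 1.0 + 0.3 * np.exp(-t / 30)           # "experimental" pressure trace
        good = exp + 0.01 * np.sin(0.5 * t)         # a close calculation
        poor = 1.0 + 0.45 * np.exp(-t / 15)         # a less accurate calculation

        print("AA good:", round(average_amplitude(exp, good), 3))
        print("AA poor:", round(average_amplitude(exp, poor), 3))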

  18. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...

  19. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  20. Simulated Response of a Tissue-equivalent Proportional Counter on the Surface of Mars.

    Science.gov (United States)

    Northum, Jeremy D; Guetersloh, Stephen B; Braby, Leslie A; Ford, John R

    2015-10-01

    Uncertainties persist regarding the assessment of the carcinogenic risk associated with galactic cosmic ray (GCR) exposure during a mission to Mars. The GCR spectrum peaks in the range of 300 MeV n(-1) to 700 MeV n(-1) and is comprised of elemental ions from H to Ni. While Fe ions represent only 0.03% of the GCR spectrum in terms of particle abundance, they are responsible for nearly 30% of the dose equivalent in free space. Because of this, radiation biology studies focusing on understanding the biological effects of GCR exposure generally use Fe ions. Acting as a thin shield, the Martian atmosphere alters the GCR spectrum in a manner that significantly reduces the importance of Fe ions. Additionally, albedo particles emanating from the regolith complicate the radiation environment. The present study uses the Monte Carlo code FLUKA to simulate the response of a tissue-equivalent proportional counter on the surface of Mars to produce dosimetry quantities and microdosimetry distributions. The dose equivalent rate on the surface of Mars was found to be 0.18 Sv y(-1) with an average quality factor of 2.9 and a dose mean lineal energy of 18.4 keV μm(-1). Additionally, albedo neutrons were found to account for 25% of the dose equivalent. It is anticipated that these data will provide relevant starting points for use in future risk assessment and mission planning studies.

  1. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  2. A new 3D maser code applied to flaring events

    Science.gov (United States)

    Gray, M. D.; Mason, L.; Etoka, S.

    2018-06-01

    We set out the theory and discretization scheme for a new finite-element computer code, written specifically for the simulation of maser sources. The code was used to compute fractional inversions at each node of a 3D domain for a range of optical thicknesses. Saturation behaviour of the nodes with regard to location and optical depth was broadly as expected. We have demonstrated via formal solutions of the radiative transfer equation that the apparent size of the model maser cloud decreases as expected with optical depth as viewed by a distant observer. Simulations of rotation of the cloud allowed the construction of light curves for a number of observable quantities. Rotation of the model cloud may be a reasonable model for quasi-periodic variability, but cannot explain periodic flaring.
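
    The formal solution used to produce observer-frame maps integrates the transfer equation along rays through the computed inversion field; for a maser the absorption coefficient is negative, so rays are amplified rather than attenuated. A toy ray-integration sketch with a uniform source function and a Gaussian gain profile, purely illustrative:

        import numpy as np

        def formal_solution(I0, kappa, S, ds):
            # Piecewise-constant formal solution of dI/ds = -kappa (I - S).
            # For a maser the opacity kappa is negative, so the ray is amplified.
            I = I0
            for k in kappa:
                att = np.exp(-k * ds)
                I = I * att + S * (1.0 - att)
            return I

        s = np.linspace(-1, 1, 200)                  # path through the cloud (arbitrary units)
        ds = s[1] - s[0]
        kappa = -5.0 * np.exp(-(s / 0.4)**2)         # negative (inverted) Gaussian gain profile
        I_out = formal_solution(I0=1e-6, kappa=kappa, S=1e-6, ds=ds)
        print("amplification factor:", I_out / 1e-6)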

  3. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  4. A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Directory of Open Access Journals (Sweden)

    Shailesh Kamble

    2016-12-01

    Full Text Available Fractal compression is a lossy compression technique in the field of gray/color image and video compression. It gives a high compression ratio and better image quality with fast decoding time, but improvement in encoding time remains a challenge. This review paper presents an analysis of the most significant existing approaches in the field of fractal-based gray/color image and video compression: different block-matching motion estimation approaches for finding the motion vectors in a frame, based on inter-frame coding and intra-frame coding (i.e., individual frame coding), and automata theory based coding approaches to represent an image or a sequence of images. Though different review papers exist related to fractal coding, this paper is different in many respects. One can develop new shape patterns for motion estimation and modify the existing block-matching motion estimation with automata coding to explore the fractal compression technique, with specific focus on reducing the encoding time and achieving better image/video reconstruction quality. This paper is useful for beginners in the domain of video compression.

  5. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    Science.gov (United States)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of high-speed optical communication systems, a construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of a code constructed by this method has no cycles of length 4, which ensures that the obtained code has good distance properties. Simulation results show that at a bit error rate (BER) of 10^-6, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3780, 3540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB, respectively, compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32640, 30592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3780, 3540) code is 0.2 dB and 0.4 dB higher, respectively, than those of the SG-QC-LDPC(3780, 3540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3780, 3540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3780, 3540) code can be well applied in optical communication systems.
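
    For circulant-based QC-LDPC codes there is a standard algebraic test for 4-cycles: a cycle of length 4 exists exactly when s(i1,j1) - s(i1,j2) + s(i2,j2) - s(i2,j1) = 0 (mod L) for some pair of rows and pair of columns of the shift matrix. The sketch below derives shifts from powers of a multiplicative-group element and checks this condition; the shift assignment is illustrative and not the paper's exact construction:

        import numpy as np
        from itertools import combinations

        def shifts_from_group(rows, cols, g, p):
            # Illustrative assignment s(i, j) = (g**i * j) mod p, using the
            # multiplicative structure of GF(p).
            return np.array([[(pow(g, i, p) * j) % p for j in range(1, cols + 1)]
                             for i in range(rows)])

        def has_4cycle(S, L):
            m, n = S.shape
            for i1, i2 in combinations(range(m), 2):
                for j1, j2 in combinations(range(n), 2):
                    if (S[i1, j1] - S[i1, j2] + S[i2, j2] - S[i2, j1]) % L == 0:
                        return True
            return False

        L = 211                        # circulant size (prime, illustrative)
        S = shifts_from_group(4, 24, g=2, p=L)
        print("girth-4 free:", not has_4cycle(S, L))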

  6. Event-by-Event Simulations of Early Gluon Fields in High Energy Nuclear Collisions

    Science.gov (United States)

    Nickel, Matthew; Rose, Steven; Fries, Rainer

    2017-09-01

    Collisions of heavy ions are carried out at ultrarelativistic speeds at the Relativistic Heavy Ion Collider and the Large Hadron Collider to create Quark Gluon Plasma. The earliest stages of such collisions are dominated by the dynamics of classical gluon fields. The McLerran-Venugopalan (MV) model of color glass condensate provides a model for this process. Previous research has provided an analytic solution for event-averaged observables in the MV model. Using the High Performance Research Computing (HPRC) facility at Texas A&M, we have developed a C++ code to explicitly calculate the initial gluon fields and the energy momentum tensor event by event using the analytic recursive solution. The code has been tested against previously known analytic results up to fourth order. We have also been able to test the convergence of the recursive solution at high orders in time and have studied the time evolution of the color glass condensate.

  7. RNA editing differently affects protein-coding genes in D. melanogaster and H. sapiens.

    Science.gov (United States)

    Grassi, Luigi; Leoni, Guido; Tramontano, Anna

    2015-07-14

    When an RNA editing event occurs within a coding sequence it can lead to a different encoded amino acid. The biological significance of these events remains an open question: they can modulate protein functionality, increase the complexity of transcriptomes or arise from a loose specificity of the involved enzymes. We analysed the editing events in coding regions that do or do not produce a change in the encoded amino acid (nonsynonymous and synonymous events, respectively) in D. melanogaster and in H. sapiens and compared them with the appropriate random models. Interestingly, our results show that the phenomenon has rather different characteristics in the two organisms. For example, we confirm the observation that editing events occur more frequently in non-coding than in coding regions, and report that this effect is much more evident in H. sapiens. Additionally, in the latter organism, editing events tend to affect less conserved residues. The less frequently occurring editing events in Drosophila tend to avoid drastic amino acid changes. Interestingly, we find that, in Drosophila, changes from less frequently used codons to more frequently used ones are favoured, while this is not the case in H. sapiens.
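
    For a single editing event, the synonymous/nonsynonymous distinction reduces to translating the codon before and after the edit, as in the illustrative sketch below for A-to-I editing (inosine read as G). The codon table here is deliberately partial; a real analysis would use a full table (e.g., Biopython's translation).

      # Partial codon table, covering only the codons used in the example below.
      CODON_TABLE = {"AAA": "K", "AAG": "K", "AGA": "R", "GAA": "E"}

      def classify_edit(codon, pos):
          """Classify an A->G edit at position pos of a codon as synonymous or not."""
          edited = codon[:pos] + "G" + codon[pos + 1:]
          before, after = CODON_TABLE[codon], CODON_TABLE[edited]
          return "synonymous" if before == after else f"nonsynonymous ({before}->{after})"

      print(classify_edit("AAA", 2))  # AAA (Lys) -> AAG (Lys): synonymous
      print(classify_edit("AAA", 1))  # AAA (Lys) -> AGA (Arg): nonsynonymous (K->R)
      print(classify_edit("AAA", 0))  # AAA (Lys) -> GAA (Glu): nonsynonymous (K->E)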

  8. A Secure Network Coding Based on Broadcast Encryption in SDN

    Directory of Open Access Journals (Sweden)

    Yue Chen

    2016-01-01

    Full Text Available By allowing intermediate nodes to encode received packets before sending them out, network coding improves the capacity and robustness of multicast applications. But it is vulnerable to pollution attacks. Some signature schemes have been proposed to thwart such attacks, but most of them need to be homomorphic, so the keys cannot be generated and managed easily. In this paper, we propose a novel fast and secure switch network coding multicast (SSNC) scheme for software defined networks (SDN). In our scheme, the complicated secure multicast management is separated from the fast data transmission based on the SDN. Multiple multicasts are aggregated into one multicast group according to the requirements of the services and the network status. Then, the controller routes the aggregated multicast group with network coding; only trusted switches are allowed to join the network coding, enforced by broadcast encryption. The proposed scheme can use traditional cryptography without homomorphy, which greatly reduces the computational complexity and improves the efficiency of transmission.
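
    The underlying network coding idea - an intermediate node forwards a combination of packets, and each receiver recovers what it is missing - can be shown with the classic XOR example below. The packet contents are invented; the paper's SSNC scheme additionally restricts which switches may code via broadcast encryption, which is not modeled here.

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          """Bytewise XOR of two equal-length packets."""
          return bytes(x ^ y for x, y in zip(a, b))

      p1 = b"PKT-A\x00"
      p2 = b"PKT-B\x01"

      coded = xor_bytes(p1, p2)  # an intermediate switch encodes p1 XOR p2

      # Receiver 1 already holds p1 from its native path and hears the coded packet:
      assert xor_bytes(coded, p1) == p2
      # Receiver 2 already holds p2:
      assert xor_bytes(coded, p2) == p1
      print("both receivers decoded their missing packet")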

  9. The sequence coding and search system: an approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1990-01-01

    The U.S. Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data are essential for performing safety and reliability analyses, especially analyses of trends and patterns to identify undesirable changes in plant performance at the earliest opportunity, so that corrective measures can be implemented before a more serious event occurs. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objectives of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system from the initial developmental stage to the point of a mature operating system are highlighted. Considerable experience has been gained in the areas of evolving and changing data requirements, staffing requirements, quality control and quality assurance procedures for addressing consistency, software/hardware considerations for developing and maintaining a complex system, documentation requirements, and end-user needs. Two other approaches for constructing and evaluating event sequences are examined, including the Accident Sequence Precursor (ASP) Program, in which sequences having the potential for core damage are identified and analyzed, and the Significant Event Compilation Tree

  10. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used legacy code is the fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. Nevertheless, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including the use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion. For the past thirty-plus years

  11. A Secure Network Coding-based Data Gathering Model and Its Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Qian Xiao

    2012-09-01

    Full Text Available To provide security for data gathering based on network coding in wireless sensor networks (WSNs), a secure network coding-based data gathering model is proposed, and a data-privacy preserving and pollution preventing (DPP&PP) protocol using network coding is designed. DPP&PP makes use of a newly proposed pollution symbol selection and pollution (PSSP) scheme based on a new obfuscation idea to pollute existing symbols. Analyses of DPP&PP show that it not only requires low computation and communication overhead, but also provides high security against brute-force attacks.

  12. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.
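
    A minimal sketch of such a classifier, combining text content with network features, might look as follows with scikit-learn. The posts, the two network features (log10 follower count and a verified-account flag) and the labels are toy placeholders, not the paper's feature set.

      import numpy as np
      from scipy.sparse import csr_matrix, hstack
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression

      posts = [
          "premiere of my new film tonight, thank you fans",
          "concert tour dates announced, love you all",
          "breaking: ministry publishes new transport policy",
          "live updates on the city marathon road closures",
          "my weekly tech tips thread, follow for more",
          "unboxing review of the new phone, link below",
          "had noodles for lunch, weather is nice",
          "finally finished my homework, time to sleep",
      ]
      net = [[7.5, 1], [7.2, 1], [6.8, 1], [6.5, 1], [5.0, 0], [4.7, 0], [2.1, 0], [1.8, 0]]
      labels = (["celebrity"] * 2 + ["organization/media"] * 2 +
                ["grassroots star"] * 2 + ["ordinary individual"] * 2)

      vec = TfidfVectorizer()
      X = hstack([vec.fit_transform(posts), csr_matrix(np.array(net))])
      clf = LogisticRegression(max_iter=1000).fit(X, labels)

      new = hstack([vec.transform(["signed copies for fans at tonight's premiere"]),
                    csr_matrix(np.array([[7.4, 1.0]]))])
      print(clf.predict(new))  # likely 'celebrity' given the text and network features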

  13. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.

  14. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge and intra-judge reliability and accuracy (i.e., the correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
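
    The difference between the two metrics is easy to see on a toy judged sample: event-based measurement counts stuttered syllables, while interval-based measurement marks fixed-length intervals as stuttered or fluent. The data and the interval length below are invented for illustration.

      # 1 = stuttered syllable, 0 = fluent syllable (event-based judgements)
      syllable_stuttered = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0]
      # True = interval judged to contain stuttering (e.g., 5 s intervals)
      interval_judgements = [False, True, True, False]

      percent_syllables = 100 * sum(syllable_stuttered) / len(syllable_stuttered)
      percent_intervals = 100 * sum(interval_judgements) / len(interval_judgements)

      print(f"event-based: {percent_syllables:.1f}% syllables stuttered")     # 25.0%
      print(f"interval-based: {percent_intervals:.1f}% intervals stuttered")  # 50.0%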

  15. Monte Carlo event generator MCMHA for high energy hadron-nucleus collisions and intranuclear cascade interactions

    International Nuclear Information System (INIS)

    Iga, Y.; Hamatsu, R.; Yamazaki, S.

    1988-01-01

    A Monte Carlo event generator for high energy hadron-nucleus (h-A) collisions has been developed based on the multi-chain model. The concept of the formation zone and the cascade interactions of secondary particles are properly taken into account in this Monte Carlo code. When the results of this code are compared with experimental data, the importance of intranuclear cascade interactions becomes very clear. (orig.)

  16. Thermal-hydraulic analysis code development and application to passive safety reactor at JAERI

    International Nuclear Information System (INIS)

    Araya, F.

    1995-01-01

    After a brief overview of the safety assessment process, the author describes the LOCA analysis code system developed at JAERI. It comprises audit calculation codes (WREM, WREM-J2, Japan's own code) and BE codes (2D/3D, ICAP, ROSA). The codes are applied to the development of the Japanese passive safety reactor concept JPSR. Special attention is paid to the passive heat removal system and to phenomena considered to occur under a loss-of-heat-sink event. Examples of LOCA analyses based on the operation of JPSR are given for the cases of heat removal by the upper RHR and heat removal from the core to the atmosphere. Experiments on multi-dimensional flow fields in the RPV and on steam condensation in a water pool are used to understand the phenomena in passive safety reactors. The report is in a poster form only. 1 tab., 13 figs

  17. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. The steady-state and unscrammed-event calculations have been performed for the conceptual design of KLFR using the SSC-K code. A UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through the analysis of a ULOHS event. Although the analysis of ULOF involves many uncertainties in the pump design, the results show inherent safety characteristics. In the case of ULOF, natural circulation of 6% of rated flow is established. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling.

  18. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
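
    For reference, plain historical VaR is just a quantile of past returns, as in the sketch below; the simulated returns, the injected event-day shocks and the crude outlier filter are illustrative stand-ins for the news-event sensitivity studied in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      returns = rng.normal(0.0005, 0.01, 500)  # simulated daily returns
      returns[::50] -= 0.08                    # inject ten news-event shocks

      def historical_var(r, alpha=0.01):
          """Loss threshold exceeded on only a fraction alpha of days (positive number)."""
          return -np.quantile(r, alpha)

      print(f"VaR(99%) with event days:    {historical_var(returns):.4f}")
      filtered = returns[returns > -0.05]      # crude filter for event-day outliers
      print(f"VaR(99%) excluding outliers: {historical_var(filtered):.4f}")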

  19. Code development of the national hemovigilance system and expansion strategies for hospital blood banks

    Directory of Open Access Journals (Sweden)

    Kim Jeongeun

    2012-01-01

    Full Text Available Objectives: The aims of this study were to develop reportable event codes that are applicable to the national hemovigilance system for hospital blood banks, and to present expansion strategies for the blood banks. Materials and Methods: The data were obtained from a literature review and expert consultation, followed by additions to and revisions of the established hemovigilance code system and guidelines, to develop reportable event codes for hospital blood banks. The Medical Error Reporting System-Transfusion Medicine developed in the US and other codes of reportable events were added to the Korean version of the Biologic Products Deviation Report (BPDR) developed by the Korean Red Cross Blood Safety Administration; then, using these codes, mapping work was conducted. We deduced outcomes suitable for practice, referred to the results of the advisory councils, and conducted a survey with experts and blood bank practitioners. Results: We developed reportable event codes that were applicable to hospital blood banks and could cover blood safety - from blood product safety to blood transfusion safety - and also presented expansion strategies for hospital blood banks. Conclusion: It was necessary to add 10 major categories to the blood transfusion safety stage and 97 reportable event codes to the blood safety stage. Contextualized solutions were presented for 9 categories of expansion strategies of the hemovigilance system for hospital blood banks.

  20. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  1. Validation and applicability of the 3D core kinetics and thermal hydraulics coupled code SPARKLE

    International Nuclear Information System (INIS)

    Miyata, Manabu; Maruyama, Manabu; Ogawa, Junto; Otake, Yukihiko; Miyake, Shuhei; Tabuse, Shigehiko; Tanaka, Hirohisa

    2009-01-01

    The SPARKLE code is a coupled code system based on three individual codes whose physical models have already been verified and validated. Mitsubishi Heavy Industries (MHI) confirmed the coupling calculation, including data transfer and the total reactor coolant system (RCS) behavior of the SPARKLE code. The confirmation uses the OECD/NEA MSLB benchmark problem, which is based on Three Mile Island Unit 1 (TMI-1) nuclear power plant data. This benchmark problem has been used to verify coupled codes developed and used by many organizations. Objectives of the benchmark program are as follows. Phase 1 is to compare the results of the system transient code using point kinetics. Phase 2 is to compare the results of the coupled three-dimensional (3D) core kinetics code and 3D core thermal-hydraulics (T/H) code, and Phase 3 is to compare the results of the combined coupled system transient code, 3D core kinetics code, and 3D core T/H code as a total validation of the coupled calculation. The calculation results of the SPARKLE code indicate good agreement with other benchmark participants' results. Therefore, the SPARKLE code is validated through these benchmark problems. In anticipation of applying the SPARKLE code to licensing analyses, MHI and Japanese PWR utilities have established a safety analysis method regarding the calculation conditions such as power distributions, reactivity coefficients, and event-specific features. (author)

  2. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism; it can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  3. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
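
    As background, static Shannon coding assigns symbol i a codeword consisting of the first ceil(-log2 p_i) bits of the binary expansion of the cumulative probability of all more probable symbols. The sketch below implements this static base case, not the paper's dynamic variant.

      import math

      def shannon_code(probs):
          """Static Shannon code: sort by decreasing probability, then take the
          first ceil(-log2 p) bits of the running cumulative probability."""
          items = sorted(probs.items(), key=lambda kv: -kv[1])
          codes, cum = {}, 0.0
          for sym, p in items:
              length = math.ceil(-math.log2(p))
              frac, bits = cum, ""
              for _ in range(length):
                  frac *= 2
                  bit = int(frac)
                  frac -= bit
                  bits += str(bit)
              codes[sym] = bits
              cum += p
          return codes

      print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
      # {'a': '0', 'b': '10', 'c': '110', 'd': '111'} -- a prefix-free code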

  4. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

    A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division-multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the numbers of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. Expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, with 130 simultaneous users at a bit-error rate (BER) of 10⁻⁹. An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented, so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated and the results presented show the improvement in performance with the use of loss compensation.

  5. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulations are presented and compared with two other methods, the OTFR and the LCEC. The proposed method shows improved performance at a reasonable computational cost. (author)
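
    A rough Python rendering of this pipeline - least-squares removal of the fundamental component, simple residual statistics as features, and an SVM classifier - is given below. The sampling parameters, synthetic sag signals and the two features are invented placeholders, not the paper's setup.

      import numpy as np
      from sklearn.svm import SVC

      FS, F0, N = 3200, 60, 320  # sampling rate (Hz), fundamental (Hz), window length
      t = np.arange(N) / FS

      def residual(v):
          """Subtract the best-fit fundamental sinusoid from voltage signal v."""
          A = np.column_stack([np.sin(2 * np.pi * F0 * t), np.cos(2 * np.pi * F0 * t)])
          coef, *_ = np.linalg.lstsq(A, v, rcond=None)
          return v - A @ coef

      def features(v):
          r = residual(v)
          return [np.sqrt(np.mean(r ** 2)), np.max(np.abs(r))]  # residual RMS and peak

      rng = np.random.default_rng(0)
      def make(event):
          v = np.sin(2 * np.pi * F0 * t) + rng.normal(0, 0.01, N)
          if event:
              v[N // 2:] *= 0.6  # a voltage sag in the second half of the window
          return features(v)

      X = [make(k % 2) for k in range(40)]
      y = [k % 2 for k in range(40)]
      clf = SVC(kernel="rbf").fit(X[:30], y[:30])
      print("accuracy on held-out windows:", clf.score(X[30:], y[30:]))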

  6. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  7. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  8. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  9. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  10. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based basically on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA

  11. Biases in detection of apparent "weekend effect" on outcome with administrative coding data: population based study of stroke.

    Science.gov (United States)

    Li, Linxin; Rothwell, Peter M

    2016-05-16

     Objective: To determine the accuracy of coding of admissions for stroke on weekdays versus weekends and any impact on apparent outcome. Design: Prospective population based stroke incidence study and a scoping review of previous studies of weekend effects in stroke. Setting: Primary and secondary care of all individuals registered with nine general practices in Oxfordshire, United Kingdom (OXVASC, the Oxford Vascular Study). Participants: All patients with clinically confirmed acute stroke in OXVASC identified with multiple overlapping methods of ascertainment in 2002-14 versus all acute stroke admissions identified by hospital diagnostic and mortality coding alone during the same period. Main outcome measures: Accuracy of administrative coding data for all patients with confirmed stroke admitted to hospital in OXVASC; difference between rates of "false positive" or "false negative" coding for weekday and weekend admissions; impact of inaccurate coding on apparent case fatality at 30 days in weekday versus weekend admissions; weekend effects on outcomes in patients with confirmed stroke admitted to hospital in OXVASC and impacts of other potential biases compared with those in the scoping review. Results: Among the study population of 92 728, 2373 episodes of acute stroke were ascertained in OXVASC, of which 826 (34.8%), mainly minor events, were managed without hospital admission, 60 (2.5%) occurred out of the area or abroad, and 195 (8.2%) occurred in hospital during an admission for a different reason. Of 1292 local hospital admissions for acute stroke, 973 (75.3%) were correctly identified by administrative coding. There was no bias in the distribution of weekend versus weekday admission of the 319 strokes missed by coding. Of 1693 admissions for stroke identified by coding, 1055 (62.3%) were confirmed to be acute strokes after case adjudication. Among the 638 false positive coded cases, patients were more likely to be admitted on weekdays than at weekends (536 (41.0%) v 102 (26.5%); P<0.001), with consequent bias in the apparent 30 day case fatality of coded acute stroke admissions and false positive

  12. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    Directory of Open Access Journals (Sweden)

    Kai Lin

    2016-07-01

    Full Text Available With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to their high complexity and bandwidth bottlenecks, millimeter-wave sensor networks still face numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which uses data fusion, multiple channels and network coding to improve data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. Using the result of this classification, the CMNC algorithm also provides a channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared with other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and performs better than the compared methods.
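
    The fusion step rests on Dempster's rule of combination, sketched below for two mass functions over a toy frame of discernment; the class names and mass values are invented for illustration.

      from itertools import product

      def combine(m1, m2):
          """Dempster's rule: m(A) is proportional to the sum of m1(B)*m2(C) over
          all pairs with B intersect C = A (A nonempty); conflicting mass is discarded."""
          combined, conflict = {}, 0.0
          for (b, mb), (c, mc) in product(m1.items(), m2.items()):
              inter = b & c
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + mb * mc
              else:
                  conflict += mb * mc
          return {a: v / (1.0 - conflict) for a, v in combined.items()}

      A = frozenset({"class1"})
      B = frozenset({"class2"})
      AB = frozenset({"class1", "class2"})
      m_source1 = {A: 0.7, AB: 0.3}          # evidence from one data source
      m_source2 = {A: 0.5, B: 0.2, AB: 0.3}  # evidence from another
      print(combine(m_source1, m_source2))   # masses renormalized after removing conflict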

  13. Quantum BCH Codes Based on Spectral Techniques

    International Nuclear Information System (INIS)

    Guo Ying; Zeng Guihua

    2006-01-01

    When the time variable in quantum signal processing is discrete, the Fourier transform exists on the vector space of n-tuples over the Galois field F₂, which plays an important role in the investigation of quantum signals. By using Fourier transforms, the ideas of quantum coding theory can be described in a setting that is much different from that seen thus far. Quantum BCH codes can be defined as codes whose quantum states have certain specified consecutive spectral components equal to zero, and the error-correcting ability is also described by the number of consecutive zeros. Moreover, the decoding of quantum codes can be described spectrally with greater efficiency.

  14. A Novel Error Resilient Scheme for Wavelet-based Image Coding Over Packet Networks

    OpenAIRE

    WenZhu Sun; HongYu Wang; DaXing Qian

    2012-01-01

    This paper presents a robust transmission strategy for wavelet based scalable bit streams over packet erasure channels. By taking advantage of bit plane coding and multiple description coding, the proposed strategy adopts layered multiple description coding (LMDC) for embedded wavelet coders to improve the error resistance of the important bit planes in the sense of the D(R) function. Then, the post-compression rate-distortion (PCRD) optimization process is used to impro...

  15. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  16. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

    Full Text Available This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Words (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which the local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution, which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines the advantages of both BoV and kernel based metrics, and achieves linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including the 15-Scenes, Caltech101/256, and PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  17. Modeling radiation belt dynamics using a 3-D layer method code

    Science.gov (United States)

    Wang, C.; Ma, Q.; Tao, X.; Zhang, Y.; Teng, S.; Albert, J. M.; Chan, A. A.; Li, W.; Ni, B.; Lu, Q.; Wang, S.

    2017-08-01

    A new 3-D diffusion code using a recently published layer method has been developed to analyze radiation belt electron dynamics. The code guarantees the positivity of the solution even when mixed diffusion terms are included. Unlike most of the previous codes, our 3-D code is developed directly in equatorial pitch angle (α0), momentum (p), and L shell coordinates; this eliminates the need to transform back and forth between (α0,p) coordinates and adiabatic invariant coordinates. Using (α0,p,L) is also convenient for direct comparison with satellite data. The new code has been validated by various numerical tests, and we apply the 3-D code to model the rapid electron flux enhancement following the geomagnetic storm on 17 March 2013, which is one of the Geospace Environment Modeling Focus Group challenge events. An event-specific global chorus wave model, an AL-dependent statistical plasmaspheric hiss wave model, and a recently published radial diffusion coefficient formula from Time History of Events and Macroscale Interactions during Substorms (THEMIS) statistics are used. The simulation results show good agreement with satellite observations, in general, supporting the scenario that the rapid enhancement of radiation belt electron flux for this event results from an increased level of the seed population by radial diffusion, with subsequent acceleration by chorus waves. Our results prove that the layer method can be readily used to model global radiation belt dynamics in three dimensions.

  18. Event Reconstruction in the PandaRoot framework

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2012-01-01

    The PANDA experiment will study the collisions of beams of anti-protons, with momenta ranging from 2 to 15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle identification algorithms are currently implemented using a Bayesian approach and compared to multivariate analysis methods. Moreover, the PANDA data acquisition foresees a triggerless operation in which events are not defined by a hardware first-level trigger decision; instead, all signals are stored with time stamps, requiring deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the PANDA spectrometer will be reported, focusing on the performance of the tracking system and the results of the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.

  19. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  20. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

    Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist in the interpretation of bioassay data, provide bioassay projections and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ doses and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, and consequently its operation requires a relatively long procedure that includes a lot of manual typing, which can lead to human error. A new code has been developed at the NRCN, the CINEX (Code for Internal Exposure), which is an Excel application and leads to a significant reduction in calculation time (on the order of 5-10 times) and in the risk of human error. The code uses a database containing tables which were constructed with CINDY and contain the bioassay values predicted by the ICRP 30 model after an intake of a unit activity of each isotope. Using the database, the code then calculates the appropriate intake and consequently the committed effective dose and organ doses. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and CINDY codes (for class Y uranium). CINEX is now used at the NRCN to calculate occupational intakes and doses for workers handling radioactive materials

  1. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method in which a real-coded quantum-inspired genetic algorithm (RQGA) optimizes the weights and thresholds of a BP neural network is proposed to overcome the defect that the gradient descent method easily makes the algorithm fall into local optima during learning. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the speed of calculation is reduced by the encoding and decoding processes. So, RQGA is introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm converges rapidly to solutions that satisfy the constraint conditions.
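
    As a simplified stand-in for the RQGA (the quantum-inspired operators are omitted), the sketch below evolves the nine weights of a tiny 2-2-1 network on the XOR task with a plain real-coded GA; the population size, operators and mutation rate are arbitrary illustrative choices.

      import numpy as np

      rng = np.random.default_rng(0)
      X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
      Y = np.array([0, 1, 1, 0], float)

      def forward(w, x):
          """2-2-1 network; w packs 9 parameters: 2x2 + 2 hidden, 2 + 1 output."""
          W1, b1 = w[:4].reshape(2, 2), w[4:6]
          W2, b2 = w[6:8], w[8]
          h = np.tanh(x @ W1 + b1)
          return 1 / (1 + np.exp(-(h @ W2 + b2)))

      def fitness(w):
          return -np.mean((forward(w, X) - Y) ** 2)  # negative MSE (higher is better)

      pop = rng.normal(0, 1, (60, 9))
      for gen in range(300):
          scores = np.array([fitness(w) for w in pop])
          elite = pop[np.argsort(scores)[-20:]]                      # selection
          parents = elite[rng.integers(0, 20, (60, 2))]
          alpha = rng.random((60, 1))
          pop = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
          pop += rng.normal(0, 0.1, pop.shape)                       # Gaussian mutation
          pop[0] = elite[-1]                                         # elitism

      best = pop[0]
      print(np.round(forward(best, X)), "MSE:", -fitness(best))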

  2. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    This thesis consists of six chapters. The first chapter contains a short introduction to coding theory, in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four...

  3. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that EDW gives much better performance than the Hadamard and MFH codes.

  4. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  5. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not easy. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  6. VLSI Architectures for Sliding-Window-Based Space-Time Turbo Trellis Code Decoders

    Directory of Open Access Journals (Sweden)

    Georgios Passas

    2012-01-01

    Full Text Available The VLSI implementation of SISO-MAP decoders used for traditional iterative turbo coding has been investigated in the literature. In this paper, a complete architectural model of a space-time turbo code receiver that includes elementary decoders is presented. These architectures are based on newly proposed building blocks such as a recursive add-compare-select-offset (ACSO) unit, and α-, β-, Γ-, and LLR output calculation modules. Measurements of the complexity and decoding delay of several sliding-window-technique-based MAP decoder architectures, together with a proposed parameter set, lead to defining equations and a comparison between those architectures.

  7. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    The analysis and evaluation of initiating events is the starting point of nuclear safety analysis and probabilistic safety analysis, and it is a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience focus on water reactors; no methods or theories exist for the thorium-based molten salt reactor (TMSR). With TMSR research and development under way in China, initiating event analysis and evaluation is increasingly important. The research can build on PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of generation II and III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis. The analysis of TMSR initiating events is preliminarily studied and described. This research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  8. DEVELOPMENT OF SALES APPLICATION OF PREPAID ELECTRICITY VOUCHER BASED ON ANDROID PLATFORM USING QUICK RESPONSE CODE (QR CODE)

    Directory of Open Access Journals (Sweden)

    Ricky Akbar

    2017-09-01

    Full Text Available Perusahaan Listrik Negara (PLN) has implemented a smart electricity system, or prepaid electricity: customers pay for an electricity voucher before using the electricity. The token contained in the electricity voucher purchased by the customer is entered into the Meter Prabayar (MPB) installed at the customer's location. When a customer purchases a voucher, they receive a receipt that contains the customer's identity and the 20-digit voucher code (token) to be entered into the MPB as electrical energy credit. The receipt obtained by the customer is, of course, vulnerable to loss or to capture by unauthorized parties. In this study, the authors designed and developed an Android-based application that uses QR code technology as a replacement for the prepaid electricity receipt, encoding the identity of the customer and the 20-digit voucher code. The application was developed following the waterfall methodology. The implemented steps of the waterfall method are: (1) analysis of the functional requirements of the system, through a preliminary study and data collection based on field studies and the literature; (2) system design, using UML diagrams, Business Process Model and Notation (BPMN) and entity relationship diagrams (ERD); (3) implementation of the design, using object-oriented programming (OOP) techniques, with the web application developed using the Laravel PHP framework and a MySQL database, and the mobile application developed using B4A; and (4) testing of the developed system using the black-box method. The final result of this research is a web and mobile application for the sale of electricity vouchers using QR code technology.
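
    Producing the voucher QR code itself is short with a library such as the Python qrcode package (assumed installed, e.g. via pip install qrcode[pil]); the payload layout below is an invented illustration, not the application's actual format.

      import qrcode

      def voucher_qr(customer_id: str, token: str, path: str) -> None:
          """Render the customer identity and 20-digit token as a QR code image."""
          assert len(token) == 20 and token.isdigit(), "token must be 20 digits"
          payload = f"PLN|{customer_id}|{token}"   # illustrative payload layout
          qrcode.make(payload).save(path)

      voucher_qr("14012345678", "01234567890123456789", "voucher.png")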

  9. Experimental annotation of post-translational features and translated coding regions in the pathogen Salmonella Typhimurium

    Energy Technology Data Exchange (ETDEWEB)

    Ansong, Charles; Tolic, Nikola; Purvine, Samuel O.; Porwollik, Steffen; Jones, Marcus B.; Yoon, Hyunjin; Payne, Samuel H.; Martin, Jessica L.; Burnet, Meagan C.; Monroe, Matthew E.; Venepally, Pratap; Smith, Richard D.; Peterson, Scott; Heffron, Fred; Mcclelland, Michael; Adkins, Joshua N.

    2011-08-25

    Complete and accurate genome annotation is crucial for comprehensive and systematic studies of biological systems. For example systems biology-oriented genome scale modeling efforts greatly benefit from accurate annotation of protein-coding genes to develop proper functioning models. However, determining protein-coding genes for most new genomes is almost completely performed by inference, using computational predictions with significant documented error rates (> 15%). Furthermore, gene prediction programs provide no information on biologically important post-translational processing events critical for protein function. With the ability to directly measure peptides arising from expressed proteins, mass spectrometry-based proteomics approaches can be used to augment and verify coding regions of a genomic sequence and importantly detect post-translational processing events. In this study we utilized “shotgun” proteomics to guide accurate primary genome annotation of the bacterial pathogen Salmonella Typhimurium 14028 to facilitate a systems-level understanding of Salmonella biology. The data provides protein-level experimental confirmation for 44% of predicted protein-coding genes, suggests revisions to 48 genes assigned incorrect translational start sites, and uncovers 13 non-annotated genes missed by gene prediction programs. We also present a comprehensive analysis of post-translational processing events in Salmonella, revealing a wide range of complex chemical modifications (70 distinct modifications) and confirming more than 130 signal peptide and N-terminal methionine cleavage events in Salmonella. This study highlights several ways in which proteomics data applied during the primary stages of annotation can improve the quality of genome annotations, especially with regards to the annotation of mature protein products.

  10. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
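
    The tuned PIDPlus and event generator of the paper are not reproduced here; the Python sketch below only illustrates the generic event-based idea, recomputing the control law when the measurement moves by more than a threshold or a maximum waiting time elapses, with all gains and thresholds as made-up placeholders.

        class EventBasedPID:
            def __init__(self, kp, ki, kd, delta=1.0, t_max=30.0):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.delta, self.t_max = delta, t_max      # event thresholds
                self.integral = 0.0
                self.prev_y = None
                self.last_event_t = None
                self.u = 0.0                               # held control value

            def update(self, t, y, setpoint):
                if self.prev_y is None:                    # first sample
                    self.prev_y, self.last_event_t = y, t
                triggered = (abs(y - self.prev_y) > self.delta
                             or t - self.last_event_t >= self.t_max)
                if triggered:                              # recompute on event only
                    dt = max(t - self.last_event_t, 1e-9)
                    e = setpoint - y
                    self.integral += e * dt
                    dy = (y - self.prev_y) / dt            # derivative on measurement
                    self.u = self.kp * e + self.ki * self.integral - self.kd * dy
                    self.prev_y, self.last_event_t = y, t
                return self.u                              # held between events

        pid = EventBasedPID(kp=2.0, ki=0.1, kd=0.5)        # illustrative gains
        u = pid.update(t=0.0, y=40.0, setpoint=50.0)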

  11. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  12. Frequencies and trends of significant characteristics of reported events in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Farber, G.; Matthes, H. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koln (Germany)

    2001-07-01

    In the frame of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany which can be used for the determination of plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system with respect to technical and safety relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the time period 1996 to 1999, are meanwhile available. By increasing the amount of data it will become possible to herewith improve the statements concerning trends of safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  13. Frequencies and trends of significant characteristics of reported events in Germany

    International Nuclear Information System (INIS)

    Farber, G.; Matthes, H.

    2001-01-01

    In the frame of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany which can be used for the determination of plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system with respect to technical and safety relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the time period 1996 to 1999, are meanwhile available. By increasing the amount of data it will become possible to herewith improve the statements concerning trends of safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  14. Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics

    Science.gov (United States)

    Ciappina, M. F.; Kirchner, T.; Schulz, M.

    2010-04-01

    We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e., they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross section, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently.
    Program summary:
    Program title: MCEG
    Catalogue identifier: AEFV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2695
    No. of bytes in distributed program, including test data, etc.: 18 501
    Distribution format: tar.gz
    Programming language: FORTRAN 77 with parallelization directives using scripting
    Computer: Single machines using Linux and Linux servers/clusters (with cores of any clock speed, cache memory and bits in a word)
    Operating system: Linux (any version and flavor) with FORTRAN 77 compilers
    Has the code been vectorised or parallelized?: Yes
    RAM: 64-128 kBytes (the codes are very CPU intensive)
    Classification: 2.6
    Nature of problem: The code deals with single and double
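
    In the same spirit, though far simpler than the published FORTRAN routines, a Monte Carlo event generator can be sketched in Python as rejection sampling from a differential cross section; the functional form below is invented purely for illustration.

        import math, random

        def dsigma(E, theta):               # illustrative shape, arbitrary units
            return math.exp(-E / 10.0) * (1.0 + math.cos(theta) ** 2)

        def generate_events(n, e_max=50.0, f_max=2.0):
            """Fill an 'event file' of (energy, angle) pairs by accept/reject."""
            events = []
            while len(events) < n:
                E = random.uniform(0.0, e_max)
                theta = random.uniform(0.0, math.pi)
                if random.uniform(0.0, f_max) < dsigma(E, theta):  # accept
                    events.append((E, theta))
            return events                   # analyze like measured event data

        for E, theta in generate_events(5):
            print(f"E = {E:6.2f}  theta = {theta:5.3f}")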

  15. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  16. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  17. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    Science.gov (United States)

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problems with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploiting the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
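
    The paper's MD-templates are not reproduced in this record, but the scan-order idea can be illustrated with a toy Python sketch: flatten a 4x4 residual block with a per-mode permutation chosen so that positions likely to be zero land at the end of the coefficient sequence. The scan tables below are hypothetical.

        import numpy as np

        # Hypothetical scan tables: one permutation of the 16 positions per mode.
        SCANS = {
            "vertical":   [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15],
            "horizontal": list(range(16)),          # plain raster scan
        }

        def scan_residual(block: np.ndarray, mode: str) -> np.ndarray:
            """Flatten a 4x4 residual block in the scan order chosen for `mode`."""
            return block.reshape(-1)[SCANS[mode]]

        resid = np.array([[3, 0, 0, 0],
                          [2, 0, 0, 0],
                          [1, 0, 0, 0],
                          [1, 0, 0, 0]])
        print(scan_residual(resid, "vertical"))     # nonzeros first: [3 2 1 1 0 ...]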

  18. Coding response to a case-mix measurement system based on multiple diagnoses.

    Science.gov (United States)

    Preyra, Colin

    2004-08-01

    To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.

  19. Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses

    Science.gov (United States)

    Preyra, Colin

    2004-01-01

    Objective To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Study Design Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. PMID:15230940

  20. International Common Cause Failure Data Exchange (ICDE). General Coding Guidelines - Updated Version, October 2011

    International Nuclear Information System (INIS)

    Johanson, Gunnar; Werner, Wolfgang; Capote, Marina Concepcion; Kreuser, Albert

    2012-01-01

    Several OECD/NEA member countries have established the International Common-Cause Failure Data Exchange Project ('ICDE Project') to encourage multilateral cooperation in the collection and analysis of data relating to Common-Cause Failure (CCF) events. The objectives of the ICDE Project are to: a) Collect and analyse CCF events over the long term so as to better understand such events, their causes, and their prevention. b) Generate qualitative insights into the root causes of CCF events which can then be used to derive approaches or mechanisms for their prevention or for mitigating their consequences. c) Establish a mechanism for the efficient feedback of experience gained in connection with CCF phenomena, including the development of defenses against their occurrence, such as indicators for risk based inspections. d) Record event attributes to facilitate quantification of CCF frequencies when so decided by the Project Working Group. The ICDE Project is envisaged to comprise all possible events of interest, including both complete and partial ICDE events. The ICDE Project will cover the key components of the main safety systems. Presently, the components listed below are included in the ICDE Project, and data have been collected for the first six components in the list: Centrifugal pumps, Diesel generators, Motor operated valves, Safety relief valves/power operated relief valves, Check valves, Batteries, Level measurement, Breakers, Control rod drive assemblies. Others will be added to this list later on. In these coding guidelines, explanations of the ICDE general coding format are given. The guide reflects present experience with the data format and with the collected data. Further interpretations and clarifications will be added, should they become necessary. For each component analysed in the ICDE project, separate coding guidance is provided in the appendices ICDECG 01-06, specifying details relevant to the respective components.

  1. International common-cause failure data exchange. ICDE general coding guidelines - Technical note

    International Nuclear Information System (INIS)

    Johanson, Gunnar; Werner, Wolfgang; Concepcion Capote, Marina; Kreuser, Albert; Rasmuson, Dale; Jonsson, Esther; Pereira Pagan, Begona; Tirira, Jorge; Morris, Ian; Morales, Rosa; Oxberry, Anna; Kreuser, Albert

    2004-01-01

    Several Member countries of the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) have established the International Common-Cause Failure Data Exchange Project (ICDE Project) to encourage multilateral co-operation in the collection and analysis of data relating to Common-Cause Failure (CCF) events. The objectives of the ICDE Project are to: a) Collect and analyse CCF events over the long term so as to better understand such events, their causes, and their prevention; b) Generate qualitative insights into the root causes of CCF events which can then be used to derive approaches or mechanisms for their prevention or for mitigating their consequences; c) Establish a mechanism for the efficient feedback of experience gained in connection with CCF phenomena, including the development of defences against their occurrence, such as indicators for risk based inspections; and d) Record event attributes to facilitate quantification of CCF frequencies when so decided by the Project Working Group. The ICDE Project is envisaged to comprise all possible events of interest, including both complete and partial ICDE events. The ICDE Project will cover the key components of the main safety systems. Presently, the components listed below are included in the ICDE Project, and data have been collected for the first six components in the list: Centrifugal pumps, Diesel generators, Motor operated valves, Safety relief valves/power operated relief valves, Check valves, Batteries, Level measurement, Breakers, Control rod drive assemblies. Others will be added to this list later on. In these coding guidelines, explanations of the ICDE general coding format are given. The guide reflects present experience with the data format and with the collected data. Further interpretations and clarifications will be added, should they become necessary. For each component analysed in the ICDE project, separate coding guidance is provided in the appendices

  2. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    Science.gov (United States)

    Fang, Yi; Huang, Yahong

    2017-12-01

    Estimating sand liquefaction based on codes is an important part of geotechnical design. However, the results sometimes fail to conform to actual earthquake damage. Based on the damage from the Tangshan earthquake and engineering geological conditions, three typical sites were chosen. The sand liquefaction probability was evaluated at each of the three sites using the method in the Code for Seismic Design of Buildings, and the results were compared with the sand liquefaction phenomena observed in the earthquake. The results show that the difference between sand liquefaction estimated based on codes and actual earthquake damage is mainly attributable to two aspects. The primary reasons include the disparity between the seismic fortification intensity and the actual seismic oscillation, changes in groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. Meanwhile, although the judgment methods in the codes exhibit a certain universality, they are another source of the above difference, owing to the limitations of the basic data and qualitative anomalies in the judgment formulas.

  3. Spike-based population coding and working memory.

    Directory of Open Access Journals (Sweden)

    Martin Boerlin

    2011-02-01

    Full Text Available Compelling behavioral evidence suggests that humans can make optimal decisions despite the uncertainty inherent in perceptual or motor tasks. A key question in neuroscience is how populations of spiking neurons can implement such probabilistic computations. In this article, we develop a comprehensive framework for optimal, spike-based sensory integration and working memory in a dynamic environment. We propose that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons. As a result, these networks can combine sensory cues optimally, track the state of a time-varying stimulus and memorize accumulated evidence over periods much longer than the time constant of single neurons. Importantly, we propose that population responses and persistent working memory states represent entire probability distributions and not only single stimulus values. These memories are reflected by sustained, asynchronous patterns of activity which make relevant information available to downstream neurons within their short time window of integration. Model neurons act as predictive encoders, only firing spikes which account for new information that has not yet been signaled. Thus, spike times signal deterministically a prediction error, contrary to rate codes in which spike times are considered to be random samples of an underlying firing rate. As a consequence of this coding scheme, a multitude of spike patterns can reliably encode the same information. This results in weakly correlated, Poisson-like spike trains that are sensitive to initial conditions but robust to even high levels of external neural noise. This spike train variability reproduces the one observed in cortical sensory spike trains, but cannot be equated to noise. On the contrary, it is a consequence of optimal spike-based inference. In contrast, we show that rate-based models perform poorly when implemented with stochastically spiking neurons.

  4. Visual and intelligent transients and accidents analyzer based on thermal-hydraulic system code

    International Nuclear Information System (INIS)

    Meng Lin; Rui Hu; Yun Su; Ronghua Zhang; Yanhua Yang

    2005-01-01

    Full text of publication follows: Many thermal-hydraulic system codes were developed in the past twenty years, such as RELAP5, RETRAN, ATHLET, etc. Because of their general and advanced features for thermal-hydraulic computation, they are widely used around the world to analyze transients and accidents. But most of these original thermal-hydraulic system codes have the following disadvantages. Firstly, because models are built through input decks, the input files are complex and non-figurative, and the style of the input decks varies among users and models. Secondly, results are provided as off-line data files, which is not convenient for analysts, who may pay more attention to the trends and changes of dynamic parameters. Thirdly, these original thermal-hydraulic system codes offer few interfaces to other programs, which restricts their extensibility. The subject of this paper is to develop a powerful analyzer based on these thermal-hydraulic system codes to analyze transients and accidents more simply, accurately, and quickly. Firstly, modeling is visual and intelligent: users build the thermal-hydraulic system model using component objects according to their needs, and it is not necessary for them to face bare input decks. The style of the input decks created automatically by the analyzer is unified and can be easily understood by other people. Secondly, parameters of interest to the analyst can be displayed, or even changed, dynamically. Thirdly, the analyzer provides interfaces to other programs for the thermal-hydraulic system code, so that parallel computation between the thermal-hydraulic system code and other programs becomes possible. In conclusion, through visual and intelligent methods, the analyzer based on general and advanced thermal-hydraulic system codes can be used to analyze transients and accidents more effectively. The main purpose of this paper is to present developmental activities, assessment and application results of the visual and intelligent

  5. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    OpenAIRE

    Wang, Lei; Wang, Qing

    2017-01-01

    In order to increase the efficiency and security of file sharing in next-generation networks, this paper proposes a large-scale file sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all the intermediate nodes need to perform a secure network coding operation before forwarding the received data. This process continues until all the mobile devices in the netw...
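
    The scheme's security layer and D2D signalling are beyond this record, but the underlying network-coding operation can be sketched over GF(2): nodes forward random XOR combinations of packets, and a receiver decodes by Gaussian elimination once enough independent combinations arrive. A minimal numpy illustration, with all sizes arbitrary:

        import numpy as np

        rng = np.random.default_rng(7)

        def encode(packets):
            """One random GF(2) combination: (coefficients, coded packet)."""
            coeffs = rng.integers(0, 2, size=packets.shape[0], dtype=np.uint8)
            return coeffs, (coeffs @ packets) % 2

        def decode(coeffs, coded, k):
            """Gauss-eliminate [C | P] mod 2; rows 0..k-1 give the originals."""
            A = np.concatenate([np.array(coeffs), np.array(coded)], axis=1)
            row = 0
            for col in range(k):
                piv = next((r for r in range(row, len(A)) if A[r, col]), None)
                if piv is None:
                    continue
                A[[row, piv]] = A[[piv, row]]        # bring pivot up
                for r in range(len(A)):
                    if r != row and A[r, col]:
                        A[r] ^= A[row]               # XOR-eliminate the column
                row += 1
            return A[:k, k:]

        packets = rng.integers(0, 2, size=(3, 8), dtype=np.uint8)  # 3 packets
        cs, ps = [], []
        while True:  # collect mixes until the system becomes solvable
            c, p = encode(packets)
            cs.append(c); ps.append(p)
            if len(cs) >= 3 and np.array_equal(decode(cs, ps, 3), packets):
                break
        print(f"recovered all packets from {len(cs)} coded transmissions")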

  6. Enhancing pediatric safety: assessing and improving resident competency in life-threatening events with a computer-based interactive resuscitation tool

    International Nuclear Information System (INIS)

    Lerner, Catherine; Gaca, Ana M.; Frush, Donald P.; Ancarana, Anjanette; Hohenhaus, Sue; Seelinger, Terry A.; Frush, Karen

    2009-01-01

    Though rare, allergic reactions occur as a result of administration of low osmolality nonionic iodinated contrast material to pediatric patients. Currently available resuscitation aids are inadequate in guiding radiologists' initial management of such reactions. To compare radiology resident competency with and without a computer-based interactive resuscitation tool in the management of life-threatening events in pediatric patients. The study was approved by the IRB. Radiology residents (n=19; 14 male, 5 female; 19 certified in basic life support/advanced cardiac life support; 1 certified in pediatric advanced life support) were videotaped during two simulated 5-min anaphylaxis scenarios involving 18-month-old and 8-year-old mannequins (order randomized). No advance warning was given. In half of the scenarios, a computer-based interactive resuscitation tool with a response-driven decision tree was available to residents (order randomized). Competency measures included: calling a code, administering oxygen and epinephrine, and correctly dosing epinephrine. Residents performed significantly more essential interventions with the computer-based resuscitation tool than without (72/76 vs. 49/76, P<0.001). Significantly more residents appropriately dosed epinephrine with the tool than without (17/19 vs. 1/19; P<0.001). More residents called a code with the tool than without (17/19 vs. 14/19; P = 0.08). A learning effect was present: average times to call a code, request oxygen, and administer epinephrine were shorter in the second scenario (129 vs. 93 s, P=0.24; 52 vs. 30 s, P<0.001; 152 vs. 82 s, P=0.025, respectively). All the trainees found the resuscitation tool helpful and potentially useful in a true pediatric emergency. A computer-based interactive resuscitation tool significantly improved resident performance in managing pediatric emergencies in the radiology department. (orig.)

  7. DATA-POOL : a direct-access data base for large-scale nuclear codes

    International Nuclear Information System (INIS)

    Yamano, Naoki; Koyama, Kinji; Naito, Yoshitaka; Minami, Kazuyoshi.

    1991-12-01

    A direct-access data base, DATA-POOL, has been developed for large-scale nuclear codes. Data can be stored and retrieved by specifying simple node names, using the DATA-POOL access package written in FORTRAN 77. A management utility, POOL, for the DATA-POOL is also provided. A typical application of the DATA-POOL to the RADHEAT-V4 code system, developed for performing safety analyses of radiation shielding, is shown. Numerous examples and error messages are also described to help apply the DATA-POOL to other code systems. This report serves as a manual for the DATA-POOL. (author)

  8. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    Full Text Available A new paradigm in the field of access control systems with fuzzy authorization is proposed. Assume there is a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correct presentation of events (news) that occurred earlier in the network. We propose a mathematical method that compactly stores the history of events, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It has also been established that the largest absolute components are recovered first. An access control system with the proposed dynamic authorization method enables the computation of fuzzy confidence coefficients in networks with frequently changing sets of participants, mesh networks, and multi-agent systems.
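
    The paper's exact formulation is not given in this record; assuming the standard basis-pursuit form of the ℓ1 problem, the recovery step can be sketched with scipy by rewriting min ||x||_1 subject to Ax = b as a linear program (all dimensions below are arbitrary):

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n, m, k = 60, 25, 4                      # vector size, projections, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 5, k)
        A = rng.normal(size=(m, n))              # stochastic measurement vectors
        b = A @ x_true

        # min sum(t) s.t. -t <= x <= t and A x = b, with variables z = [x, t]
        c = np.concatenate([np.zeros(n), np.ones(n)])
        I = np.eye(n)
        A_ub = np.block([[I, -I], [-I, -I]])     # x - t <= 0 and -x - t <= 0
        b_ub = np.zeros(2 * n)
        A_eq = np.hstack([A, np.zeros((m, n))])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                      bounds=[(None, None)] * n + [(0, None)] * n)
        x_hat = res.x[:n]
        print("max reconstruction error:", np.abs(x_hat - x_true).max())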

  9. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  10. Error Concealment using Neural Networks for Block-Based Image Coding

    Directory of Open Access Journals (Sweden)

    M. Mokos

    2006-06-01

    Full Text Available In this paper, a novel adaptive error concealment (EC) algorithm, which lowers the requirements for channel coding, is proposed. It conceals errors in block-based image coding systems by using a neural network. In the proposed algorithm, only intra-frame information is used for reconstruction of an image with separated damaged blocks. The information from the pixels surrounding a damaged block is used to recover the errors using neural network models. Computer simulation results show that the visual quality and the MSE evaluation of a reconstructed image are significantly improved by using the proposed EC algorithm. We also propose a simple non-neural approach for comparison.

  11. File compression and encryption based on LLS and arithmetic coding

    Science.gov (United States)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

    We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one; we produce a set of chaotic sequences by using the logistic and sine chaos system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, we modify the upper and lower limits of all character probabilities when encoding each symbol. Experimental results show that the proposed model achieves data encryption while attaining almost the same compression efficiency as arithmetic coding.
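
    A toy, float-precision sketch of the idea in Python (using a plain logistic map rather than the paper's combined logistic-sine system, and a keyed rotation of the symbol interval layout as the perturbation, both assumptions): the same compressed value decodes correctly only with the right key.

        def logistic(x0, r=3.99):
            x = x0
            while True:
                x = r * x * (1 - x)
                yield x

        def intervals(probs, shift):
            """Cumulative intervals with the symbol order rotated by `shift`."""
            order = list(probs)
            order = order[shift:] + order[:shift]
            low, out = 0.0, {}
            for s in order:
                out[s] = (low, low + probs[s])
                low += probs[s]
            return out

        def encode(msg, probs, key=0.613):
            lo, hi, ks = 0.0, 1.0, logistic(key)
            for s in msg:
                table = intervals(probs, int(next(ks) * len(probs)))
                a, b = table[s]
                lo, hi = lo + (hi - lo) * a, lo + (hi - lo) * b
            return (lo + hi) / 2

        def decode(code, n, probs, key=0.613):
            out, lo, hi, ks = [], 0.0, 1.0, logistic(key)
            for _ in range(n):
                table = intervals(probs, int(next(ks) * len(probs)))
                for s, (a, b) in table.items():
                    s_lo, s_hi = lo + (hi - lo) * a, lo + (hi - lo) * b
                    if s_lo <= code < s_hi:
                        out.append(s); lo, hi = s_lo, s_hi
                        break
            return "".join(out)

        probs = {"a": 0.5, "b": 0.3, "c": 0.2}
        c = encode("abacab", probs)
        print(decode(c, 6, probs), decode(c, 6, probs, key=0.614))  # right vs wrong key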

  12. ACTINIDE AND ULTRA-HEAVY ABUNDANCES IN THE LOCAL GALACTIC COSMIC RAYS: AN ANALYSIS OF THE RESULTS FROM THE LDEF ULTRA-HEAVY COSMIC-RAY EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Donnelly, J. [Dublin Institute of Technology (DIT), School of Physics, Kevin Street, Dublin 8 (Ireland); Thompson, A.; O'Sullivan, D.; Daly, J.; Drury, L. [School of Cosmic Physics, Dublin Institute for Advanced Studies, 31 Fitzwilliam Place, Dublin 2 (Ireland); Domingo, V.; Wenzel, K.-P. [European Space Research and Technology Centre (ESTEC), Keplerlaan 1, Postbus 299, 2200 AG Noordwijk (Netherlands)

    2012-03-01

    The LDEF Ultra-Heavy Cosmic-Ray Experiment (UHCRE) detected Galactic cosmic rays (GCRs) of charge Z ≥ 70 in Earth orbit with an exposure factor of 170 m² sr yr, much larger than any other experiment. The major results include the first statistically significant uniform sample of GCR actinides with 35 events passing quality cuts, evidence for the existence of transuranic nuclei in the GCR with one ₉₆Cm candidate event, and a low ₈₂Pb/₇₈Pt ratio consistent with other experiments. The probability of the existence of a transuranic component is estimated as 96%, while the most likely ₉₂U/₉₀Th ratio is found to be 0.4 within a wide 70% confidence interval ranging from 0 to 0.96. Overall, the results are consistent with a volatility-based acceleration bias and source material which is mainly ordinary interstellar medium material with some recent contamination by freshly synthesized material. Uncertainty in the key ₉₂U/₉₀Th ratio is dominated by statistical errors resulting from the small sample size, and any improved determination will thus require an experiment with a substantially larger exposure factor than the UHCRE.

  13. Block-based wavelet transform coding of mammograms with region-adaptive quantization

    Science.gov (United States)

    Moon, Nam Su; Song, Jun S.; Kwon, Musik; Kim, JongHyo; Lee, ChoongWoong

    1998-06-01

    To achieve both a high compression ratio and information preservation, it is efficient to combine segmentation with a lossy compression scheme. Microcalcification in mammograms is one of the most significant signs of early-stage breast cancer. Therefore, in coding, detection and segmentation of microcalcifications enable us to preserve them well by allocating more bits to them than to other regions. Segmentation of microcalcifications is performed both in the spatial domain and in the wavelet transform domain. A peak-error-controllable quantization step, which is designed off-line, is suitable for medical image compression. For region-adaptive quantization, block-based wavelet transform coding is adopted and different peak-error-constrained quantizers are applied to blocks according to the segmentation result. In view of the preservation of microcalcifications, the proposed coding scheme shows better performance than JPEG.

  14. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    Energy Technology Data Exchange (ETDEWEB)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay gamma-quanta by the residuals in the activated structures and scoring the prompt doses of these gamma-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and showed a good agreement. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.

  15. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    Science.gov (United States)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new family of two-dimensional spectral/spatial codes, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise, and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW), and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  16. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network-based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of the absolute amplitudes, the variance, the energy ratio, and the polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
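
    The network itself is standard; the more instructive part is the moving-window feature extraction, sketched below in numpy for a single-channel trace (window length, noise level, and the synthetic event are all illustrative; polarization rectilinearity needs three components and is omitted).

        import numpy as np

        def window_features(trace, win=50):
            feats = []
            for i in range(0, len(trace) - 2 * win):
                w_short = trace[i + win:i + 2 * win]       # leading window
                w_long = trace[i:i + 2 * win]              # trailing + leading
                feats.append((
                    np.mean(np.abs(w_short)),              # average |amplitude|
                    np.var(w_short),                       # variance
                    np.sum(w_short**2) / (np.sum(w_long**2) + 1e-12),  # energy ratio
                ))
            return np.array(feats)                         # rows feed the network

        rng = np.random.default_rng(1)
        trace = rng.normal(0, 0.1, 2000)
        trace[1000:1100] += np.sin(np.linspace(0, 20 * np.pi, 100))  # weak "event"
        f = window_features(trace)
        print("peak energy-ratio index:", int(f[:, 2].argmax()))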

  17. Responding to the Effects of Extreme Heat: Baltimore City's Code Red Program.

    Science.gov (United States)

    Martin, Jennifer L

    2016-01-01

    Heat response plans are becoming increasingly more common as US cities prepare for heat waves and other effects of climate change. Standard elements of heat response plans exist, but plans vary depending on geographic location and distribution of vulnerable populations. Because heat events vary over time and affect populations differently based on vulnerability, it is difficult to compare heat response plans and evaluate responses to heat events. This article provides an overview of the Baltimore City heat response plan, the Code Red program, and discusses the city's response to the 2012 Ohio Valley/Mid Atlantic Derecho, a complex heat event. Challenges with and strategies for evaluating the program are reviewed and shared.

  18. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods for an approximation of review expert’s performance evaluation function. Due to limitations in ...

  19. A UML profile for code generation of component based distributed systems

    International Nuclear Information System (INIS)

    Chiozzi, G.; Karban, R.; Andolfato, L.; Tejeda, A.

    2012-01-01

    A consistent and unambiguous implementation of code generation (model-to-text transformation) from UML must rely on a well-defined UML (Unified Modelling Language) profile, customizing UML for a particular application domain. Such a profile must have a solid foundation in a formally correct ontology, formalizing the concepts and their relations in the specific domain, in order to avoid a maze of wildly created stereotypes. The paper describes a generic profile for the code generation of component-based distributed systems for control applications, the process to distill the ontology and define the profile, and the strategy followed to implement the code generator. The main steps, which take place iteratively, include: defining the terms and relations with an ontology, mapping the ontology to the appropriate UML meta-classes, testing the profile by creating modelling examples, and generating the code. This has allowed us to work on the modelling of the E-ELT (European Extremely Large Telescope) control system and instrumentation without knowing what infrastructure will finally be used

  20. Context-based coding of bilevel images enhanced by digital straight line analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    , or segmentation maps are also encoded efficiently. The algorithm is not targeted at document images with text, which can be coded efficiently with dictionary-based techniques as in JBIG2. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used... in the context definition for arithmetic encoding. Tested on individual images of standard TV resolution binary shapes and the binary layers of a digital map, the proposed algorithm outperforms PWC, JBIG, JBIG2, and MPEG-4 CAE. On the binary shapes, the code lengths are reduced by 21%, 27%, 28%, and 41...

  1. Review of Rateless-Network-Coding-Based Packet Protection in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    A. S. Abdullah

    2015-01-01

    Full Text Available In recent times, there have been many developments in wireless sensor network (WSN technologies using coding theory. Fast and efficient protection schemes for data transfer over the WSN are some of the issues in coding theory. This paper reviews the issues related to the application of the joint rateless-network coding (RNC within the WSN in the context of packet protection. The RNC is a method in which any node in the network is allowed to encode and decode the transmitted data in order to construct a robust network, improve network throughput, and decrease delays. To the best of our knowledge, there has been no comprehensive discussion about RNC. To begin with, this paper briefly describes the concept of packet protection using network coding and rateless codes. We therefore discuss the applications of RNC for improving the capability of packet protection. Several works related to this issue are discussed. Finally, the paper concludes that the RNC-based packet protection scheme is able to improve the packet reception rate and suggests future studies to enhance the capability of RNC protection.

  2. Biases in detection of apparent “weekend effect” on outcome with administrative coding data: population based study of stroke

    Science.gov (United States)

    Li, Linxin

    2016-01-01

    Objectives To determine the accuracy of coding of admissions for stroke on weekdays versus weekends and any impact on apparent outcome. Design Prospective population based stroke incidence study and a scoping review of previous studies of weekend effects in stroke. Setting Primary and secondary care of all individuals registered with nine general practices in Oxfordshire, United Kingdom (OXVASC, the Oxford Vascular Study). Participants All patients with clinically confirmed acute stroke in OXVASC identified with multiple overlapping methods of ascertainment in 2002-14 versus all acute stroke admissions identified by hospital diagnostic and mortality coding alone during the same period. Main outcomes measures Accuracy of administrative coding data for all patients with confirmed stroke admitted to hospital in OXVASC. Difference between rates of “false positive” or “false negative” coding for weekday and weekend admissions. Impact of inaccurate coding on apparent case fatality at 30 days in weekday versus weekend admissions. Weekend effects on outcomes in patients with confirmed stroke admitted to hospital in OXVASC and impacts of other potential biases compared with those in the scoping review. Results Among 92 728 study population, 2373 episodes of acute stroke were ascertained in OXVASC, of which 826 (34.8%) mainly minor events were managed without hospital admission, 60 (2.5%) occurred out of the area or abroad, and 195 (8.2%) occurred in hospital during an admission for a different reason. Of 1292 local hospital admissions for acute stroke, 973 (75.3%) were correctly identified by administrative coding. There was no bias in distribution of weekend versus weekday admission of the 319 strokes missed by coding. Of 1693 admissions for stroke identified by coding, 1055 (62.3%) were confirmed to be acute strokes after case adjudication. Among the 638 false positive coded cases, patients were more likely to be admitted on weekdays than at weekends (536

  3. Sequence-based heuristics for faster annotation of non-coding RNA families.

    Science.gov (United States)

    Weinberg, Zasha; Ruzzo, Walter L

    2006-01-01

    Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance models (CMs) are a useful statistical tool for finding new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that, unlike family-specific solutions, can scale to hundreds of ncRNA families. The source code is available under the GNU Public License at the supplementary web site.

  4. Dataset for petroleum based stock markets and GAUSS codes for SAMEM

    Directory of Open Access Journals (Sweden)

    Ahmed A.A. Khalifa

    2017-02-01

    Full Text Available This article includes a unique data set of a balanced daily (Monday, Tuesday and Wednesday) sample for oil and natural gas volatility and the oil-rich economies' stock markets for Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain and Oman, using daily data over the period spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with application to Petroleum-Based Stock Market. The data, the model and the codes have many applications in business and social science.

  5. Dataset for petroleum based stock markets and GAUSS codes for SAMEM.

    Science.gov (United States)

    Khalifa, Ahmed A A; Bertuccelli, Pietro; Otranto, Edoardo

    2017-02-01

    This article includes a unique data set of a balanced daily (Monday, Tuesday and Wednesday) for oil and natural gas volatility and the oil rich economies' stock markets for Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain and Oman, using daily data over the period spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with application to Petroleum-Based Stock Market. The data, the model and the codes have many applications in business and social science.

  6. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra I, the 'large event tree/small fault tree' approach was adopted for the analysis of the plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations which had to be made in order to attain a correct description of the safety system performances according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program') developed for constructing and quantifying event trees is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly contributes to decreasing the time spent on event tree analyses, making the practical application of the analysis approach referred to above viable. (author)

  7. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.0-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
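
    For reference, the two headline performance measures reduce to simple ratios over the adjudicated gold standard. In the hypothetical re-computation below, only the 44 true events and the 25% sensitivity come from the abstract; the false-positive count is invented to complete the example.

        def sens_spec(tp: int, fn: int, tn: int, fp: int):
            sensitivity = tp / (tp + fn)     # coded positive among true overdoses
            specificity = tn / (tn + fp)     # coded negative among non-overdoses
            return sensitivity, specificity

        # Illustrative counts only (44 true overdose visits, as in the abstract):
        tp, fn = 11, 33                      # 11/44 = 25% sensitivity
        tn, fp = 3150, 9                     # fp is a made-up placeholder
        se, sp = sens_spec(tp, fn, tn, fp)
        print(f"sensitivity = {se:.1%}, specificity = {sp:.1%}")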

  8. Energy-Efficient Cluster Based Routing Protocol in Mobile Ad Hoc Networks Using Network Coding

    OpenAIRE

    Srinivas Kanakala; Venugopal Reddy Ananthula; Prashanthi Vempaty

    2014-01-01

    In mobile ad hoc networks, all nodes are energy constrained. In such situations, it is important to reduce energy consumption. In this paper, we consider the issues of energy efficient communication in MANETs using network coding. Network coding is an effective method to improve the performance of wireless networks. COPE protocol implements network coding concept to reduce number of transmissions by mixing the packets at intermediate nodes. We incorporate COPE into cluster based routing proto...

  9. The design of the CMOS wireless bar code scanner applying optical system based on ZigBee

    Science.gov (United States)

    Chen, Yuelin; Peng, Jian

    2008-03-01

    The traditional bar code scanner is limited by the length of its data line, while the farthest range of the wireless bar code scanners generally available on the market is between 30 m and 100 m. By rebuilding the traditional CCD optical bar code scanner, a CMOS code scanner based on ZigBee is designed to meet market demands. The scanning system consists of a CMOS image sensor and the embedded chip S3C2401X. When a two-dimensional bar code is read, inaccurate and wrong bar code results can occur, caused by image defilement, interference, poor reading conditions, signal interference, and unstable system voltage. We therefore put forward a method that uses matrix evaluation and the Reed-Solomon algorithm to solve these problems. In order to construct the whole wireless optical bar code system and to ensure its ability to transmit bar code image signals digitally over long distances, ZigBee is used to transmit data to the base station; this module is designed on top of the image acquisition system, and finally the circuit diagram of the wireless transmitting/receiving CC2430 module is established. By porting the embedded RTOS Linux to the MCU, an applied wireless CMOS optical bar code scanner and multi-task system is constructed. Finally, communication performance is tested with the evaluation software Smart RF. In open space, every ZigBee node can achieve 50 m transmission with high reliability. When more ZigBee nodes are added, the transmission distance can reach several thousand meters.
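
    The scanner firmware is not shown in this record; as a stand-in, the sketch below uses the third-party Python reedsolo package (an assumption, not the authors' implementation) to demonstrate the Reed-Solomon step: ten parity bytes allow up to five corrupted payload bytes to be repaired.

        from reedsolo import RSCodec  # third-party; API per reedsolo >= 1.5

        rs = RSCodec(10)              # 10 ECC bytes -> corrects up to 5 errors
        payload = b"two-dimensional bar code payload"
        coded = rs.encode(payload)

        corrupted = bytearray(coded)  # simulate read noise on two bytes
        corrupted[3] ^= 0xFF
        corrupted[17] ^= 0x5A

        decoded, _, _ = rs.decode(bytes(corrupted))  # (message, codeword, errata)
        print(decoded == payload)     # True: both errors repaired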

  10. On Objects and Events

    DEFF Research Database (Denmark)

    Eugster, Patrick Thomas; Guerraoui, Rachid; Damm, Christian Heide

    2001-01-01

    This paper presents linguistic primitives for publish/subscribe programming using events and objects. We integrate our primitives into a strongly typed object-oriented language through four mechanisms: (1) serialization, (2) multiple subtyping, (3) closures, and (4) deferred code evaluation. We...

  11. Optical information encryption based on incoherent superposition with the help of the QR code

    Science.gov (United States)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. The method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is encrypted into two phase-only masks analytically by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over previous interference-based methods, such as a higher security level, better robustness against noise attack, and a more relaxed working condition. Numerical simulation results and actual smartphone-collected results are shown to validate our proposal.

  12. Automated Testing with Targeted Event Sequence Generation

    DEFF Research Database (Denmark)

    Jensen, Casper Svenning; Prasad, Mukul R.; Møller, Anders

    2013-01-01

    Automated software testing aims to detect errors by producing test inputs that cover as much of the application source code as possible. Applications for mobile devices are typically event-driven, which raises the challenge of automatically producing event sequences that result in high coverage...

  13. Introduction into scientific work methods-a necessity when performance-based codes are introduced

    DEFF Research Database (Denmark)

    Dederichs, Anne; Sørensen, Lars Schiøtt

    The introduction of performance-based codes in Denmark in 2004 requires new competences from people working with different aspects of fire safety in industry and the public sector. This abstract presents an attempt at reducing problems with handling and analysing the mathematical methods...... and CFD models when applying performance-based codes. This is done within the educational program "Master of Fire Safety Engineering" at the Department of Civil Engineering at the Technical University of Denmark. It was found that the students had general problems with academic methods. Therefore, a new...

  14. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components in a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes: for example, SETS and FTAP to determine cut sets; IMPORTANCE for importance calculations; and SAMPLE, CONINT, and MOCUP for uncertainty analysis. Problems arise when these codes are run one after another with their inputs and outputs not linked, which can introduce errors when preparing the input for each code. The MODULE code was developed to carry out the above calculations in a single run without passing inputs and outputs between codes. MODULE can also prepare input for SETS for the case of a large fault tree that cannot be handled by MODULE itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected and the results and computation times compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater system (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over the other codes, and that it can be used on personal computers.
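
    The cut-set step that MODULE automates can be illustrated with a toy fault tree. The sketch below (a hypothetical gate layout, not the AFWS trees analysed above) expands a top gate through AND/OR gates into cut sets and then discards non-minimal ones:

    ```python
    from itertools import product

    # Toy fault tree: gates map to ('AND'|'OR', [children]); leaves are basic
    # events. Hypothetical example, not the KNU-1/2 AFWS trees from the paper.
    gates = {
        "TOP": ("OR",  ["G1", "E3"]),
        "G1":  ("AND", ["E1", "G2"]),
        "G2":  ("OR",  ["E2", "E3"]),
    }

    def cut_sets(node):
        """Expand a gate into a list of cut sets (frozensets of basic events)."""
        if node not in gates:                      # basic event
            return [frozenset([node])]
        op, children = gates[node]
        child_sets = [cut_sets(c) for c in children]
        if op == "OR":                             # union of the children's cut sets
            return [cs for sets in child_sets for cs in sets]
        # AND: every combination of one cut set per child
        return [frozenset().union(*combo) for combo in product(*child_sets)]

    def minimal(sets):
        """Keep only cut sets that contain no strictly smaller cut set."""
        return [s for s in sets if not any(t < s for t in sets)]

    mcs = minimal(cut_sets("TOP"))
    print(sorted(map(sorted, mcs)))   # [['E1', 'E2'], ['E3']]
    ```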

  15. A Brain Computer Interface for Robust Wheelchair Control Application Based on Pseudorandom Code Modulated Visual Evoked Potential

    DEFF Research Database (Denmark)

    Mohebbi, Ali; Engelsholm, Signe K.D.; Puthusserypady, Sadasivan

    2015-01-01

    In this pilot study, a novel and minimalistic Brain Computer Interface (BCI) based wheelchair control application was developed. The system was based on pseudorandom code modulated Visual Evoked Potentials (c-VEPs). The visual stimuli in the scheme were generated based on the Gold code...
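
    Gold codes such as those behind c-VEP stimuli are built by XORing two maximal-length LFSR sequences (a preferred pair) at every relative shift. A small sketch under the assumption that tap sets [5, 2] and [5, 4, 3, 2] form a preferred pair of degree-5 m-sequences (a commonly cited pair, but verify before use; not the stimulus code from the study):

    ```python
    import numpy as np

    def m_sequence(taps, nbits, seed=1):
        """Fibonacci LFSR: one period (2**nbits - 1) of a binary m-sequence."""
        state = [(seed >> i) & 1 for i in range(nbits)]   # nonzero initial fill
        out = []
        for _ in range(2**nbits - 1):
            out.append(state[-1])
            fb = 0
            for t in taps:                                # XOR the tapped stages
                fb ^= state[t - 1]
            state = [fb] + state[:-1]
        return np.array(out, dtype=np.uint8)

    # Assumed preferred pair of degree 5 (x^5+x^2+1 and x^5+x^4+x^3+x^2+1).
    a = m_sequence([5, 2], 5)
    b = m_sequence([5, 4, 3, 2], 5)

    # Gold family: the two m-sequences plus their XORs at all 31 shifts.
    gold = [a, b] + [a ^ np.roll(b, k) for k in range(31)]
    print(len(gold), len(gold[0]))   # 33 codes of length 31
    ```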

  16. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the DBBE model to discover the constructs that measure a religious event as a business brand on the basis of devotees' perception. The SEM technique was applied to test the hypothesized model, with CFA used to analyze the model, and a theoretical model was constructed to assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information may be beneficial to event managers and sponsors in building brands and operating visitor destinations. More importantly, the brands of these religious events in Pakistan can be built into a strong tourism product.

  17. Weighted-Bit-Flipping-Based Sequential Scheduling Decoding Algorithms for LDPC Codes

    Directory of Open Access Journals (Sweden)

    Qing Zhu

    2013-01-01

    Full Text Available Low-density parity-check (LDPC) codes can be applied in many different scenarios, such as video broadcasting and satellite communications. LDPC codes are commonly decoded by an iterative algorithm called belief propagation (BP) over the corresponding Tanner graph. The original BP updates all the variable nodes simultaneously, followed by all the check nodes simultaneously as well. We propose a sequential scheduling algorithm based on the weighted bit-flipping (WBF) algorithm for the sake of improving the convergence speed. Notably, WBF is a simple, low-complexity algorithm. We combine it with BP to obtain the advantages of both algorithms: the flipping function used in WBF is borrowed to determine the scheduling priority. Simulation results show that the proposed algorithm provides a good tradeoff between FER performance and computational complexity for short-length LDPC codes.
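
    As a baseline for the weighted variant, plain hard-decision bit-flipping illustrates the flipping metric that WBF reweights: each iteration flips the bit involved in the most unsatisfied parity checks. A sketch with a toy parity-check matrix (hypothetical H, not a code from the paper):

    ```python
    import numpy as np

    def bit_flip_decode(H, y, max_iter=50):
        """Hard-decision bit-flipping decoding (the baseline WBF refines).

        H : (m, n) binary parity-check matrix
        y : length-n hard-decision received word
        """
        x = y.copy()
        for _ in range(max_iter):
            syndrome = H @ x % 2
            if not syndrome.any():          # all parity checks satisfied
                return x
            # For each bit, count the unsatisfied checks it participates in.
            counts = H.T @ syndrome
            x[np.argmax(counts)] ^= 1       # flip the most suspicious bit
        return x

    # Toy (7,4) Hamming-style parity-check matrix, for illustration only.
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])
    codeword = np.zeros(7, dtype=int)              # all-zero word is a codeword
    received = codeword.copy(); received[2] ^= 1   # inject one bit error
    print(bit_flip_decode(H, received))            # recovers the all-zero word
    ```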

  18. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050, including buildings, transportation, and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL), provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development: limiting trade-offs in which long-lived building envelope components less stringent than the prescriptive code envelope requirements are offset by better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost, and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade off envelope components against shorter-lived building components is not unique to Seattle, and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  19. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  20. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  1. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function...... for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding...... strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function....
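
    For orientation, the classical rate-distortion version of the Blahut-Arimoto iteration (without the action and cost terms the paper adds) alternates between the conditional distribution and the output marginal for a fixed Lagrange slope s < 0. A minimal numpy sketch of that baseline, not the paper's action-dependent algorithm:

    ```python
    import numpy as np

    def blahut_arimoto_rd(p_x, d, s, n_iter=200):
        """Classical Blahut-Arimoto for the rate-distortion function R(D).

        p_x : source distribution over |X| symbols
        d   : (|X|, |Xhat|) distortion matrix
        s   : Lagrange slope (s < 0 traces out the R(D) curve)
        Returns (D, R) for the chosen slope.
        """
        q = np.full(d.shape[1], 1.0 / d.shape[1])     # output marginal q(xhat)
        A = np.exp(s * d)
        for _ in range(n_iter):
            w = q * A                                  # unnormalised p(xhat|x)
            w /= w.sum(axis=1, keepdims=True)
            q = p_x @ w                                # re-estimate the marginal
        D = np.sum(p_x[:, None] * w * d)
        R = np.sum(p_x[:, None] * w * np.log2(w / q))  # mutual information in bits
        return D, R

    # Binary source with Hamming distortion: R(D) should match 1 - h(D).
    p_x = np.array([0.5, 0.5])
    d = 1.0 - np.eye(2)
    for s in (-2.0, -5.0, -10.0):
        D, R = blahut_arimoto_rd(p_x, d, s)
        print(f"s={s:5.1f}  D={D:.3f}  R={R:.3f} bits")
    ```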

  2. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.

  3. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  4. Compact Hilbert Curve Index Algorithm Based on Gray Code

    Directory of Open Access Journals (Sweden)

    CAO Xuefeng

    2016-12-01

    Full Text Available The Hilbert curve has the best clustering among the various kinds of space-filling curves, and it has been used as an important tool in the design of discrete global grid spatial indexes. However, there is a great deal of redundancy in the standard Hilbert curve index when the data set has large differences between dimensions. In this paper, the construction features of the Hilbert curve are analyzed based on the Gray code, and a compact Hilbert curve index algorithm is put forward, in which the redundancy problem is avoided while the clustering of the Hilbert curve is preserved. Finally, experimental results show that the compact Hilbert curve index outperforms the standard Hilbert index: their computational complexity is nearly equivalent, but tests on real data sets show that coding time and storage space decrease by 40%, and the speedup ratio for sorting is nearly 4.3.
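
    The Gray-code machinery underlying the Hilbert index is the standard reflected binary code, in which consecutive integers differ in exactly one bit. A quick sketch of the two conversions (the standard identities only, not the compact-index algorithm itself):

    ```python
    def gray_encode(b: int) -> int:
        """Reflected binary Gray code of b: consecutive values differ in one bit."""
        return b ^ (b >> 1)

    def gray_decode(g: int) -> int:
        """Invert the Gray code by folding the shifted bits back down."""
        b = 0
        while g:
            b ^= g
            g >>= 1
        return b

    for i in range(8):
        g = gray_encode(i)
        assert gray_decode(g) == i
        print(f"{i}: {g:03b}")   # 000 001 011 010 110 111 101 100
    ```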

  5. Safety in nuclear power plant siting. A code of practice

    International Nuclear Information System (INIS)

    1978-01-01

    This publication is brought out within the framework of establishing Codes of Practice and Safety Guides for nuclear power plants: NUSS programme. The scope of the document encompasses site and site-plant interaction factors related to operational states and accident conditions. The purpose of the Code is to give criteria and procedures to be applied as appropriate to operational states and accident conditions, including those which could lead to emergency situations. This Code is mainly concerned with severe events of low probability which relate to the siting of nuclear power plants and have to be considered in designing a particular nuclear power plant. Annex: Examples of natural and man-made events relevant for design basis evaluation

  6. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at similar compressed bit rates as HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  7. What Are They Talking About? Analyzing Code Reviews in Pull-Based Development Model

    Institute of Scientific and Technical Information of China (English)

    Zhi-Xing Li; Yue Yu; Gang Yin; Tao Wang; Huai-Min Wang

    2017-01-01

    Code reviews in the pull-based model are open to community users on GitHub. Various participants take part in the review discussions, and the review topics concern not only the improvement of code contributions but also project evolution and social interaction. A comprehensive understanding of the review topics in the pull-based model would be useful to better organize the code review process and optimize review tasks such as reviewer recommendation and pull-request prioritization. In this paper, we first conduct a qualitative study on three popular open-source software projects hosted on GitHub and construct a fine-grained two-level taxonomy covering four level-1 categories (code correctness, pull-request decision-making, project management, and social interaction) and 11 level-2 subcategories (e.g., defect detecting, reviewer assigning, contribution encouraging). Second, we conduct a preliminary quantitative analysis on a large set of review comments labeled by TSHC (a two-stage hybrid classification algorithm), which is able to classify review comments automatically by combining rule-based and machine-learning techniques. Through the quantitative study, we explore the typical review patterns. We find that the three projects present a similar distribution of comments across the subcategories. Pull-requests submitted by inexperienced contributors tend to contain potential issues even though they have passed the tests. Furthermore, external contributors are more likely to break project conventions in their early contributions.

  8. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    Full Text Available This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has some advantages over traditional ones with continuous feedback. Considering the effect of time-varying transmission delays, the criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.

  9. Fire-safety engineering and performance-based codes

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    Fire-safety Engineering is written as a textbook for engineering students at universities and other institutions of higher education that teach in the area of fire. The book can also be used as a work of reference for consulting engineers, building product manufacturers, contractors, building project administrators, etc. The book deals with the following topics: • Historical presentation on the subject of fire • Legislation and building project administration • European fire standardization • Passive and active fire protection • Performance-based Codes • Fire-safety Engineering • Fundamental thermodynamics • Heat exchange during the fire process • Skin burns • Burning rate, energy release rate and design fires • Proposal for Risk-based design fires • Proposal for a Fire scale • Material ignition and flame spread • Fire dynamics in buildings • Combustion products and toxic gases • Smoke inhalation...

  10. [QR-Code based patient tracking: a cost-effective option to improve patient safety].

    Science.gov (United States)

    Fischer, M; Rybitskiy, D; Strauß, G; Dietz, A; Dressler, C R

    2013-03-01

    Hospitals are implementing risk management systems to avoid patient or surgery mix-ups, and the trend is to use preoperative checklists. This work deals specifically with a type of patient identification that is realized by storing patient data on a medium fixed to the patient. In 127 ENT surgeries, data relevant for patient identification were encrypted in a 2D QR code. The code, as a separate document accompanying the patient chart or as a patient wristband, was decrypted in the OR and the patient data were presented visibly to all persons. The decoding time, the correctness of the patient data, and the duration of patient identification were compared with traditional patient identification by inspection of the patient chart. A total of 125 QR codes were read. The time for decrypting the QR code was 5.6 s, the time for the screen view for patient identification was 7.9 s, and for a comparison group of 75 operations traditional patient identification took 27.3 s. Overall, there were 6 relevant information errors in the two parts of the experiment, a ratio of 0.6% over the 8 relevant classes per encrypted QR code. This work demonstrates a cost-effective way to technically support patient identification based on electronic patient data, and shows that use in the clinical routine is possible. The disadvantage is potential misinformation from incorrect or missing information in the HIS, or due to changes to the data after the code was created. QR-code-based patient tracking is seen as a useful complement to the already widely used identification wristband. © Georg Thieme Verlag KG Stuttgart · New York.
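
    The encoding side of such a workflow is straightforward with an off-the-shelf library. A sketch using the third-party Python `qrcode` package (an assumption for illustration, not the software used in the study), with obviously fictitious patient fields:

    ```python
    # Sketch of the encoding step, assuming the third-party `qrcode` package
    # (pip install qrcode[pil]); illustrative only, not the study's software.
    import json
    import qrcode

    # Fictitious identification record, analogous to wristband fields.
    record = {
        "patient_id": "0000000",
        "name": "DOE, JANE",
        "dob": "1970-01-01",
        "procedure": "ENT-EXAMPLE",
    }

    # High error correction tolerates partial damage to the printed label.
    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
    qr.add_data(json.dumps(record))
    qr.make(fit=True)
    qr.make_image().save("patient_label.png")
    ```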

  11. A novel QC-LDPC code based on the finite field multiplicative group for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen

    2013-09-01

    A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group, offering easier construction, more flexible adjustment of code length and code rate, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code achieves better error-correction performance over the additive white Gaussian noise (AWGN) channel with iterative sum-product algorithm (SPA) decoding. At a bit error rate (BER) of 10^-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB, and 0.2 dB more than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1, and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. It is therefore more suitable for optical communication systems.
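
    Whatever the algebraic construction, a QC-LDPC parity-check matrix is assembled the same way: each entry of the exponent matrix becomes a cyclically shifted identity block, with -1 marking an all-zero block. A sketch of that expansion step with a toy exponent matrix (not the QC-LDPC(5334,4962) construction):

    ```python
    import numpy as np

    def expand_qc_ldpc(exponents, Z):
        """Expand an exponent matrix into a binary QC-LDPC parity-check matrix.

        Each entry e >= 0 becomes the Z x Z identity cyclically shifted by e;
        -1 marks an all-zero block.
        """
        I = np.eye(Z, dtype=int)
        rows = []
        for row in exponents:
            blocks = [np.zeros((Z, Z), dtype=int) if e < 0
                      else np.roll(I, e, axis=1)
                      for e in row]
            rows.append(np.hstack(blocks))
        return np.vstack(rows)

    # Toy 2x4 exponent matrix with circulant size Z = 4 (illustrative only).
    E = [[0, 1, -1, 2],
         [3, -1, 2, 0]]
    H = expand_qc_ldpc(E, 4)
    print(H.shape)          # (8, 16)
    print(H.sum(axis=0))    # column weights reveal the -1 (zero) blocks
    ```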

  12. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources, including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, and Electric Power Research Institute Fire Loss Data, and then collated into one database developed in the personal computer database management system dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are

  13. The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose

    Science.gov (United States)

    Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.

    2018-01-01

    The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.

  14. Depth Measurement Based on Infrared Coded Structured Light

    Directory of Open Access Journals (Sweden)

    Tong Jia

    2014-01-01

    Full Text Available Depth measurement is a challenging problem in computer vision research. In this study, we first design a new grid pattern and develop a sequence coding and decoding algorithm to process the pattern. Second, we propose a linear fitting algorithm to derive the linear relationship between object depth and pixel shift. Third, we obtain depth information on an object based on this linear relationship. Moreover, 3D reconstruction is implemented based on the Delaunay triangulation algorithm. Finally, we utilize the regularity of the error curves to correct systematic errors and improve the measurement accuracy. The experimental results show that the accuracy of depth measurement is related to the step length of the moving object.
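
    The calibration step in such a system reduces to fitting depth against pixel shift from a few known positions and inverting the fit at measurement time. A minimal numpy sketch with made-up calibration pairs (the actual relationship and constants depend on the rig; the linear model follows the paper's description):

    ```python
    import numpy as np

    # Hypothetical calibration data: pixel shift of a grid feature (px)
    # measured with the target at known depths (mm); values illustrative only.
    shifts = np.array([12.0, 19.5, 27.2, 34.8, 42.1])
    depths = np.array([400.0, 500.0, 600.0, 700.0, 800.0])

    # Fit the linear model depth = a * shift + b, as in the paper's second step.
    a, b = np.polyfit(shifts, depths, deg=1)

    def depth_from_shift(shift_px: float) -> float:
        """Map an observed pixel shift to object depth using the fitted line."""
        return a * shift_px + b

    print(f"depth = {a:.2f} * shift + {b:.1f}")
    print(depth_from_shift(30.0))   # depth estimate for a 30 px shift (mm)
    ```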

  15. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  16. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large data base generated with COSIMA. The data base is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached, and that extremely high rod internal pressure is necessary for clad rupture. Additional important results were found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments. This could, however, be improved if the phase separation models in the codes were updated. (orig./HP) [de

  17. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.

  18. Code Lavender: Cultivating Intentional Acts of Kindness in Response to Stressful Work Situations.

    Science.gov (United States)

    Davidson, Judy E; Graham, Patricia; Montross-Thomas, Lori; Norcross, William; Zerbi, Giovanna

    Providing healthcare can be stressful. Gone unchecked, clinicians may experience decreased compassion and increased burnout or secondary traumatic stress. Code Lavender is designed to increase acts of kindness after stressful workplace events occur. The aim was to test the feasibility of providing Code Lavender, with the hypotheses that after stressful events in the workplace staff would provide, receive, and recommend Code Lavender to others, and that its provision would improve Professional Quality of Life Scale (ProQoL) scores, general job satisfaction, and feeling cared for in the workplace. A pilot program was tested and evaluated. Staff and physicians on four hospital units were informed of the availability of the Code Lavender kit, which includes words of comfort, chocolate, lavender essential oil, and employee health referral information. Feasibility data and ProQoL scores were collected at baseline and three months. At baseline, 48% (n = 164) reported a stressful event at work in the last three months. Post-intervention, 51% reported experiencing a stressful workplace event, with 32% receiving a Code Lavender kit from their co-workers as a result (n = 83). Of those who received the Code Lavender intervention, 100% found it helpful and 84% would recommend it to others. No significant changes were demonstrated before and after the intervention in ProQoL scores or job satisfaction; however, the emotion of feeling cared for improved. The results warrant continuation and further dissemination of Code Lavender, and the investigators have received requests to expand the program, implying positive reception of the intervention. Additional interventions are needed to overcome workplace stressors, and a more intense peer support program is being tested. Copyright © 2017. Published by Elsevier Inc.

  19. A Radiation Chemistry Code Based on the Green's Function of the Diffusion Equation

    Science.gov (United States)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Stochastic radiation track structure codes are of great interest for space radiation studies and hadron therapy in medicine. These codes are used for many purposes, notably for microdosimetry and DNA damage studies. In the last two decades, they have also been used with the Independent Reaction Times (IRT) method in the simulation of chemical reactions, to calculate the yield of various radiolytic species produced during the radiolysis of water and in chemical dosimeters. Recently, we developed a Green's function based code to simulate reversible chemical reactions with an intermediate state, which yielded results in excellent agreement with those obtained by using the IRT method. This code was also used to simulate the interaction of particles with membrane receptors. We are in the process of including this program in the Monte-Carlo track structure code Relativistic Ion Tracks (RITRACKS). This recent addition should greatly expand the capabilities of RITRACKS, notably to simulate DNA damage by both the direct and indirect effect.
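
    The building block of such a code is the free-diffusion Green's function, the Gaussian kernel giving the probability density of finding a particle at distance r after time t. A short sketch evaluating the standard 3D formula (the diffusion coefficient below is an illustrative value, not a parameter from the paper):

    ```python
    import numpy as np

    def diffusion_green(r, t, D):
        """Free 3D diffusion Green's function:
        p(r, t) = (4*pi*D*t)**(-3/2) * exp(-r**2 / (4*D*t)).

        r in metres, t in seconds, D in m^2/s; returns a density (m^-3).
        """
        return (4.0 * np.pi * D * t) ** -1.5 * np.exp(-r**2 / (4.0 * D * t))

    # Example: a hydrated-electron-like diffusion coefficient (illustrative).
    D = 5e-9                                  # m^2/s
    r = np.linspace(0.0, 5e-9, 6)             # distances up to 5 nm
    print(diffusion_green(r, t=1e-12, D=D))   # density profile 1 ps after creation
    ```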

  20. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  1. Improvement of Level-1 PSA computer code package -A study for nuclear safety improvement-

    International Nuclear Information System (INIS)

    Park, Chang Kyu; Kim, Tae Woon; Ha, Jae Joo; Han, Sang Hoon; Cho, Yeong Kyun; Jeong, Won Dae; Jang, Seung Cheol; Choi, Young; Seong, Tae Yong; Kang, Dae Il; Hwang, Mi Jeong; Choi, Seon Yeong; An, Kwang Il

    1994-07-01

    This year is the second year of the Government-sponsored Mid- and Long-Term Nuclear Power Technology Development Project. The scope of this subproject, titled 'The Improvement of Level-1 PSA Computer Codes', is divided into three main activities: (1) methodology development in under-developed fields such as risk assessment technology for plant shutdown and external events, (2) computer code package development for Level-1 PSA, and (3) application of new technologies to reactor safety assessment. First, in the area of PSA methodology development, foreign PSA reports on shutdown and external events have been reviewed and various PSA methodologies compared. The Level-1 PSA code KIRAP and the CCF analysis code COCOA have been converted from DOS to Windows. A human reliability database has also been established this year. In the area of new technology applications, fuzzy set theory and entropy theory are used to estimate component life and to develop a new measure of uncertainty importance. Finally, in the field of applying PSA techniques to reactor regulation, a strategic study to develop the dynamic risk management tool PEPSI and the determination of inspection and test priorities of motor-operated valves based on risk importance worth have been carried out. (Author)

  2. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    Energy Technology Data Exchange (ETDEWEB)

    Giovedi, Claudia; Martins, Marcelo Ramos, E-mail: claudia.giovedi@labrisco.usp.br, E-mail: mrmartin@usp.br [Laboratorio de Analise, Avaliacao e Gerenciamento de Risco (LabRisco/POLI/USP), São Paulo, SP (Brazil); Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e, E-mail: ayabe@ipen.br, E-mail: dsgomes@ipen.br, E-mail: teixiera@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new cladding materials, and iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. Initially, the properties related to the thermal and mechanical behavior of iron-based alloys were obtained from the literature, appropriately adapted, and introduced into the fuel performance code subroutines. The adopted approach was step-by-step modification, with different versions of the code created. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding, and that burst occurs at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material for use as ATF cladding. (author)

  3. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    International Nuclear Information System (INIS)

    Giovedi, Claudia; Martins, Marcelo Ramos; Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e

    2017-01-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new cladding materials, and iron-based alloys appear as possible candidates. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. Initially, the properties related to the thermal and mechanical behavior of iron-based alloys were obtained from the literature, appropriately adapted, and introduced into the fuel performance code subroutines. The adopted approach was step-by-step modification, with different versions of the code created. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding, and that burst occurs at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material for use as ATF cladding. (author)

  4. Using WIRED to study Simulated Linear Collider Detector Events

    Energy Technology Data Exchange (ETDEWEB)

    George, A

    2004-02-05

    The purpose of this project is to enhance the LCD WIRED event display. By extending the functionality of the display, physicists will be able to view events in more detail and interpret data faster. Poor display characteristics can severely affect the way we understand events, but by bringing attention to specific attributes we open doors to new ideas. Events displayed inside the LCD have many different properties; this is why scientists need to be able to distinguish data using a plethora of symbols and other graphics. This paper explains how we can view events differently using clustering, and how results can be displayed with track finding. Different source codes extracted from HEP libraries are analyzed and tested to see which display the information needed. It is clear that through these changes certain aspects of WIRED will be used more effectively, allowing good event displays that lead to better physics results.

  5. Tuning iteration space slicing based tiled multi-core code implementing Nussinov's RNA folding.

    Science.gov (United States)

    Palkowski, Marek; Bielecki, Wlodzimierz

    2018-01-15

    RNA folding is an ongoing compute-intensive task of bioinformatics, and parallelization and improving code locality for this kind of algorithm is one of the most relevant areas in computational biology. Fortunately, RNA secondary structure approaches, such as Nussinov's recurrence, involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. This allows us to apply powerful polyhedral compilation techniques based on the transitive closure of dependence graphs to generate parallel tiled code implementing Nussinov's RNA folding. Such techniques are within the iteration space slicing framework: the transitive dependences are applied to the statement instances of interest to produce valid tiles. The main problem in generating parallel tiled code is defining a proper tile size and tile dimension, which impact the degree of parallelism and code locality. To choose the best tile size and tile dimension, we first construct parallel parametric tiled code (the parameters are variables defining the tile size). For this purpose, we first generate two non-parametric tiled codes with different fixed tile sizes but the same code structure, and then derive a general affine model describing all the integer factors available in the expressions of those codes. Using this model and the known integer factors present in those expressions (they define the left-hand side of the model), we find the unknown integers in the model for each integer factor appearing at the same position in the fixed tiled code, and replace the expressions including integer factors with expressions including parameters. We then use this parallel parametric tiled code to implement the well-known tile size selection (TSS) technique, which allows us to discover, within a given search space, the tile size and tile dimension maximizing target code performance. For a given search space, the presented approach allows us to choose the best tile size and tile dimension in
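
    The recurrence being tiled is Nussinov's maximum base-pairing dynamic program over affine loops. A plain, untiled Python reference of that recurrence, with a minimum hairpin-loop gap of 1 for brevity (real folding codes typically enforce a larger gap):

    ```python
    def nussinov(seq, min_gap=1):
        """Nussinov DP: maximum number of nested base pairs in an RNA sequence."""
        pairs = {("A", "U"), ("U", "A"), ("G", "C"),
                 ("C", "G"), ("G", "U"), ("U", "G")}
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_gap + 1, n):      # the affine loop nest that gets tiled
            for i in range(n - span):
                j = i + span
                best = max(dp[i + 1][j], dp[i][j - 1])
                if (seq[i], seq[j]) in pairs:   # pair the two ends
                    best = max(best, dp[i + 1][j - 1] + 1)
                for k in range(i + 1, j):       # bifurcation term
                    best = max(best, dp[i][k] + dp[k + 1][j])
                dp[i][j] = best
        return dp[0][n - 1]

    print(nussinov("GGGAAAUCC"))   # 3 nested pairs (wobble G-U allowed)
    ```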

  6. GRay: A MASSIVELY PARALLEL GPU-BASED CODE FOR RAY TRACING IN RELATIVISTIC SPACETIMES

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal [Department of Astronomy, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States)

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOPS (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the centers of the Milky Way and M87 with the Event Horizon Telescope.

  7. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best efforts of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team, including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in the percentage of sessions adhering to ventilation rate and chest compression rate, and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion, and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  8. Materials and design bases issues in ASME Code Case N-47

    International Nuclear Information System (INIS)

    Huddleston, R.L.; Swindeman, R.W.

    1993-04-01

    A preliminary evaluation of the design bases (principally ASME Code Case N-47) was conducted for design and operation of reactors at elevated temperatures where the time-dependent effects of creep, creep-fatigue, and creep ratcheting are significant. Areas where Code rules or regulatory guides may be lacking or inadequate to ensure the operation over the expected life cycles for the next-generation advanced high-temperature reactor systems, with designs to be certified by the US Nuclear Regulatory Commission, have been identified as unresolved issues. Twenty-two unresolved issues were identified and brief scoping plans developed for resolving these issues

  9. Wavelet transform and Huffman coding based electrocardiogram compression algorithm: Application to telecardiology

    International Nuclear Information System (INIS)

    Chouakri, S A; Djaafri, O; Taleb-Ahmed, A

    2013-01-01

    We present in this work an algorithm for electrocardiogram (ECG) signal compression aimed at its transmission via a telecommunication channel. The proposed ECG compression algorithm is built on the wavelet transform, leading to low/high frequency component separation; high-order-statistics-based thresholding, using a level-adjusted kurtosis value, to denoise the ECG signal; and a linear predictive coding filter applied to the wavelet coefficients, producing a lower-variance signal. The latter is coded using Huffman encoding, yielding an optimal coding length in terms of the average number of bits per sample. At the receiver end, under the assumption of an ideal communication channel, the inverse processes are carried out, namely Huffman decoding, inverse linear predictive filtering, and the inverse discrete wavelet transform, leading to the estimated version of the ECG signal. The proposed ECG compression algorithm is tested on a set of ECG records extracted from the MIT-BIH Arrhythmia Database, including different cardiac anomalies as well as normal ECG signals. The obtained results are evaluated in terms of compression ratio and mean square error, which are around 1:8 and 7%, respectively. Besides the numerical evaluation, visual inspection demonstrates the high quality of the ECG signal restitution, with the different ECG waves recovered correctly.
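
    The entropy-coding stage can be illustrated in a few lines: build the Huffman tree over symbol frequencies with a heap, then read the codes off the tree. A generic sketch with toy frequencies (not the wavelet-coefficient statistics of the paper):

    ```python
    import heapq
    from collections import Counter

    def huffman_code(symbols):
        """Build a Huffman code (symbol -> bitstring) from an iterable of symbols."""
        freq = Counter(symbols)
        # Heap of (weight, tiebreak, {symbol: code}) entries.
        heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        count = len(heap)
        if count == 1:                      # degenerate single-symbol source
            return {next(iter(freq)): "0"}
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + code for s, code in c1.items()}
            merged.update({s: "1" + code for s, code in c2.items()})
            heapq.heappush(heap, (w1 + w2, count, merged))
            count += 1
        return heap[0][2]

    data = "aaaabbbccd"                     # toy quantised-coefficient stream
    code = huffman_code(data)
    bits = "".join(code[s] for s in data)
    print(code, len(bits), "bits")          # shorter codes for frequent symbols
    ```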

  10. Validation of the CATHARE2 code against experimental data from Brayton-cycle plants

    International Nuclear Information System (INIS)

    Bentivoglio, Fabrice; Tauveron, Nicolas; Geffraye, Genevieve; Gentner, Herve

    2008-01-01

    In recent years the Commissariat a l'Energie Atomique (CEA) has commissioned a wide range of feasibility studies of future advanced nuclear reactors, in particular gas-cooled reactors (GCR). The thermohydraulic behaviour of these systems is a key issue for, among other things, the design of the core, the assessment of thermal stresses, and the design of decay heat removal systems. These studies therefore require efficient and reliable simulation tools capable of modelling the whole reactor, including the core, the core vessel, piping, heat exchangers and turbo-machinery. CATHARE2 is a 1D thermal-hydraulic reference safety code developed and extensively validated for the French pressurized water reactors, which has recently been adapted to deal also with gas-cooled reactor applications. In order to validate CATHARE2 for these new applications, CEA has initiated an ambitious long-term experimental program. The foreseen experimental facilities range from small-scale loops for physical correlations to component technology and system demonstration loops. In the short term, CATHARE2 is being validated against existing experimental data, in particular from the German power plants Oberhausen I and II. These facilities were both operated by the German utility Energie Versorgung Oberhausen (E.V.O.) and their power conversion systems resemble high-temperature reactor concepts: Oberhausen I is a 13.75-MWe Brayton-cycle air turbine plant, and Oberhausen II is a 50-MWe Brayton-cycle helium turbine plant. The paper presents these two plants, the adopted CATHARE2 modelling, and a comparison between experimental data and code results for both steady-state and transient cases.

  11. Monte-Carlo code PARJET to simulate e+e- annihilation events via QCD jets

    International Nuclear Information System (INIS)

    Ritter, S.

    1983-01-01

    The Monte-Carlo code PARJET simulates exclusive hadronic final states produced in e+e- annihilation via a virtual photon in two steps: (i) the fragmentation of the original quark-antiquark pair into further partons using results of perturbative QCD in the leading logarithmic approximation (LLA), and (ii) the transition of these parton jets into hadrons on the basis of a chain decay model. A program summary and code description are given. (author)

  12. Low-Complexity Multiple Description Coding of Video Based on 3D Block Transforms

    Directory of Open Access Journals (Sweden)

    Andrey Norkin

    2007-02-01

    Full Text Available The paper presents a multiple description (MD) video coder based on three-dimensional (3D) transforms. Two balanced descriptions are created from a video sequence. In the encoder, the video sequence is represented in the form of a coarse sequence approximation (shaper), included in both descriptions, and a residual sequence (details), which is split between the two descriptions. The shaper is obtained by block-wise pruned 3D-DCT. The residual sequence is coded by 3D-DCT or a hybrid LOT+DCT 3D transform. The coding scheme is targeted at mobile devices: it has low computational complexity and improved robustness of transmission over unreliable networks, and the coder is able to work at very low redundancies. The coding scheme is simple, yet it outperforms some MD coders based on motion-compensated prediction, especially in the low-redundancy region, where the margin is up to 3 dB for reconstruction from one description.

  13. A lossless multichannel bio-signal compression based on low-complexity joint coding scheme for portable medical devices.

    Science.gov (United States)

    Kim, Dong-Sun; Kwon, Jin-San

    2014-09-18

    Research on real-time health systems has received great attention in recent years, and the need for high-quality personal multichannel medical signal compression in personal medical products is increasing. The international MPEG-4 audio lossless coding (ALS) standard supports a joint channel-coding scheme for improving the compression of multichannel signals, and it is a very efficient compression method for multichannel biosignals. However, the computational complexity of such a multichannel coding scheme is significantly greater than that of other lossless audio encoders. In this paper, we present a multichannel hardware encoder based on a low-complexity joint-coding technique and a shared-multiplier scheme for portable devices. A joint-coding decision method and a reference-channel selection scheme are modified for a low-complexity joint coder. The proposed joint-coding decision method determines the optimized joint-coding operation based on the relationship between the cross correlation of residual signals and the compression ratio. The reference-channel selection is designed to select a channel for the entropy coding of the joint coding. The hardware encoder operates at a 40 MHz clock frequency and supports two-channel parallel encoding for the multichannel monitoring system. Experimental results show that the compression ratio increases by 0.06%, whereas the computational complexity decreases by 20.72%, compared to the MPEG-4 ALS reference software encoder. In addition, the compression ratio increases by about 11.92% compared to a single-channel bio-signal lossless data compressor.
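
    A sketch of the joint-coding decision logic, assuming a first-order difference as a stand-in predictor and an illustrative 0.9 correlation threshold (the real encoder uses the ALS predictor and a threshold tuned against the compression ratio):

```python
import numpy as np

def residual(x):
    """First-order prediction residual, a stand-in for the ALS predictor."""
    return np.diff(x.astype(np.int64), prepend=x[0])

def choose_reference(channels):
    """Pick the channel whose residual correlates best with all the others."""
    res = [residual(c) for c in channels]
    scores = [sum(abs(np.corrcoef(r, s)[0, 1]) for s in res) for r in res]
    return int(np.argmax(scores)), res

def encode(channels, threshold=0.9):
    ref, res = choose_reference(channels)
    plan = []
    for i, r in enumerate(res):
        rho = abs(np.corrcoef(r, res[ref])[0, 1])
        joint = i != ref and rho > threshold       # joint-coding decision
        plan.append(("joint" if joint else "independent",
                     r - res[ref] if joint else r))
    return ref, plan

rng = np.random.default_rng(0)
t = np.arange(1000)
ch1 = (100 * np.sin(t / 10)).astype(int)
ch2 = ch1 + rng.integers(-2, 3, size=1000)     # strongly correlated channel
ch3 = rng.integers(-100, 100, size=1000)       # unrelated channel
ref, plan = encode([ch1, ch2, ch3])
print(ref, [mode for mode, _ in plan])         # correlated channel coded jointly
```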

  14. Improving the Critic Learning for Event-Based Nonlinear $H_{\infty}$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with event-based nonlinear H∞ state feedback control design. First, the H∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve the minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. An initial stabilizing control is no longer required during the implementation of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis of the lower bound on the minimal inter-sample time. Finally, applications to an aircraft dynamics model and a robot arm plant are carried out to verify the efficient performance of the proposed design method.
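
    A toy illustration of the event-based sampling idea only (the critic network and the H∞ game are beyond a sketch); the plant, gain and trigger threshold below are invented for the example:

```python
import numpy as np

A = np.array([[0.0, 1.0], [-1.0, -0.5]])   # invented example plant: dx/dt = Ax + Bu
B = np.array([[0.0], [1.0]])
K = np.array([[1.0, 1.5]])                 # assumed stabilizing state-feedback gain

def simulate(x0, dt=1e-3, T=10.0, trigger=0.05):
    x = np.array(x0, dtype=float)
    x_hat = x.copy()                       # last sampled state, held between events
    u = -K @ x_hat
    events = 0
    for _ in range(int(T / dt)):
        if np.linalg.norm(x - x_hat) > trigger:   # event-triggering condition
            x_hat = x.copy()
            u = -K @ x_hat                 # control is recomputed only at events
            events += 1
        x = x + dt * (A @ x + B @ u)       # Euler step of the plant
    return x, events

x_final, n_events = simulate([1.0, 0.0])
print(np.linalg.norm(x_final), n_events)   # state near the origin, far fewer than 10000 updates
```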

  15. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model-driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  16. A Framework-Based Environment for Object-Oriented Scientific Codes

    Directory of Open Access Journals (Sweden)

    Robert A. Ballance

    1993-01-01

    Frameworks are reusable object-oriented designs for domain-specific programs. In our estimation, frameworks are the key to productivity and reuse. However, frameworks require increased support from the programming environment. A framework-based environment must include design aides and project browsers that can mediate between the user and the framework. A framework-based approach also places new requirements on conventional tools such as compilers. This article explores the impact of object-oriented frameworks upon a programming environment, in the context of object-oriented finite element and finite difference codes. The role of tools such as design aides and project browsers is discussed, and the impact of a framework-based approach upon compilers is examined. Examples are drawn from our prototype C++ based environment.

  17. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons of the burden of managing computer-assisted surgery devices manually. To this end, a certain understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-grams to compute the probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
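
    A minimal sketch of the n-gram layer, assuming illustrative event labels (the paper's Description Logics ontology is omitted):

```python
from collections import Counter, defaultdict

def train_bigrams(surgeries):
    """surgeries: list of event-label sequences from recorded interventions."""
    counts = defaultdict(Counter)
    for seq in surgeries:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, current):
    """Most probable follow-up event and its estimated probability."""
    following = counts.get(current)
    if not following:
        return None, 0.0
    nxt, n = following.most_common(1)[0]
    return nxt, n / sum(following.values())

# Hypothetical labeled surgeries for illustration.
surgeries = [["incision", "dissection", "clipping", "cutting", "extraction"],
             ["incision", "dissection", "coagulation", "clipping", "cutting"]]
model = train_bigrams(surgeries)
print(predict_next(model, "dissection"))   # e.g. ('clipping', 0.5)
```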

  18. Adaptive bit plane quadtree-based block truncation coding for image compression

    Science.gov (United States)

    Li, Shenda; Wang, Jin; Zhu, Qing

    2018-04-01

    Block truncation coding (BTC) is a fast image compression technique applied in the spatial domain. Traditional BTC and its variants mainly focus on reducing computational complexity for low bit rate compression, at the cost of lower quality of decoded images, especially for images with rich texture. To solve this problem, this paper proposes a quadtree-based block truncation coding algorithm combined with adaptive bit-plane transmission. First, the direction of the edge in each block is detected using the Sobel operator. For blocks of minimal size, an adaptive bit plane is used to optimize the BTC, depending on the MSE loss when encoded by absolute moment block truncation coding (AMBTC). Extensive experimental results show that our method gains 0.85 dB PSNR on average compared with other state-of-the-art BTC variants, making it well suited to real-time image compression applications.
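
    A sketch of plain AMBTC on a single block, which is the primitive that the proposed quadtree splitting and adaptive bit-plane machinery builds on:

```python
import numpy as np

def ambtc_encode(block):
    """Reduce a block to a bit plane plus two reconstruction levels."""
    mean = block.mean()
    plane = block >= mean
    hi = block[plane].mean() if plane.any() else mean      # level for "1" pixels
    lo = block[~plane].mean() if (~plane).any() else mean  # level for "0" pixels
    return plane, lo, hi

def ambtc_decode(plane, lo, hi):
    return np.where(plane, hi, lo)

block = np.array([[2, 3, 250, 251],
                  [4, 2, 252, 250],
                  [3, 5, 249, 252],
                  [2, 4, 251, 250]], float)
plane, lo, hi = ambtc_encode(block)
print(ambtc_decode(plane, lo, hi))   # two-level approximation of the block
```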

  19. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. It is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time-series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the inter-event times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided mean intermitotic times of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population, and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
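
    A sketch of the underlying model, simulating a nonhomogeneous Poisson process with an exponentially increasing rate by Lewis thinning; the parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_nhpp(lam0, doubling_h, T):
    """Event times on [0, T] h for rate lambda(t) = lam0 * 2**(t / doubling_h)."""
    lam_max = lam0 * 2 ** (T / doubling_h)      # majorizing constant rate
    t, events = 0.0, []
    while True:
        t += rng.exponential(1 / lam_max)       # candidate from homogeneous process
        if t > T:
            return np.array(events)
        if rng.random() < lam0 * 2 ** (t / doubling_h) / lam_max:
            events.append(t)                    # accept with ratio lambda(t)/lam_max

events = simulate_nhpp(lam0=0.5, doubling_h=21.1, T=48.0)
inter = np.diff(events)                         # inter-event times to compare with data
print(len(events), inter.mean())
```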

  20. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events

    Directory of Open Access Journals (Sweden)

    Jeroen eStekelenburg

    2012-05-01

    In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have already reported distinct neural correlates of temporal (when) versus phonetic/semantic (which) content in audiovisual integration. Here we examined the effect of visual prediction of auditory location (where) in audiovisual biological motion stimuli by varying the spatial congruency between the auditory and visual parts of the audiovisual stimulus. Visual stimuli were presented centrally, whereas auditory stimuli were presented either centrally or at 90° azimuth. Typical subadditive amplitude reductions (AV − V < A) were found for the auditory N1 and P2 in both spatially congruent and incongruent conditions. The new finding is that the N1 suppression was larger for spatially congruent stimuli. A very early audiovisual interaction was also found at 30-50 ms in the spatially congruent condition, while no effect of congruency was found on the suppression of the P2. This indicates that visual prediction of auditory location can be coded very early in auditory processing.

  1. Maritime English as a code-tailored ESP: Genre-based curriculum development as a way out

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2018-04-01

    Maritime English (ME), as a type of English for Specific Purposes (ESP), is somewhat different in that its instruction and research are founded on specific international legal procedures. Thus, it is vital to determine an ESP framework that bridges code-tailored ME curriculum development with the communicative language teaching approach. This paper reports on the revision of the International Maritime Organization (IMO)’s Model Course 3.17, Maritime English, where an integrated genre-based ESP framework helps to achieve the balance between the “wide-angled” quality of language learning and ME’s legal consistency. It is argued that code-tailored ME competences find expression in maritime domain-specific genres; these are the typical sets of English communicative events that seafarers are involved in while achieving their maritime professional objectives. The curriculum can be designed to integrate linguistic systems, professional motivation and behaviors, communicative skills and cultural awareness into the teaching process, which entails learning Maritime English while taking maritime domain-specific action. Specifically, the principle of genre as social action informs the two-stage syllabus mapping, that is, General Maritime English (GME) and Specialized Maritime English (SME). In GME, the focus is placed on the linguistic content and how language tasks embedded in maritime contexts are fulfilled; in SME, the focus is placed on the professional content and how maritime workplace duties and identities are fulfilled through the English language. As such, syllabus mapping calculates the discursion-profession correlation and helps to ensure that code-tailored ME teaching is communicative performance-oriented. Thus, the multi-syllabus task design and content selection must consistently maintain the genre-based balance on the linguistic-communicative continuum. As a result, the English linguistic systems underlying the…

  2. Environmental conditions using thermal-hydraulics computer code GOTHIC for beyond design basis external events

    International Nuclear Information System (INIS)

    Pleskunas, R.J.

    2015-01-01

    In response to the Fukushima Dai-ichi beyond-design-basis accident in March 2011, the Nuclear Regulatory Commission (NRC) issued Order EA-12-049, 'Issuance of Order to Modify Licenses with Regard to Requirements for Mitigation Strategies for Beyond-Design-Basis External Events'. To outline the process to be used by individual licensees to define and implement site-specific diverse and flexible mitigation strategies (FLEX) that reduce the risks associated with beyond-design-basis conditions, Nuclear Energy Institute document NEI 12-06, 'Diverse and Flexible Coping Strategies (FLEX) Implementation Guide', was issued. A beyond-design-basis external event (BDBEE) is postulated to cause an extended loss of AC power (ELAP), which results in a loss of ventilation with the potential to impact room habitability and equipment operability. During the ELAP, portable FLEX equipment will be used to achieve and maintain safe shutdown, and only a minimal set of instruments and controls will be available. Given these circumstances, analysis is required to determine the environmental conditions in several vital areas of the nuclear power plant. The BDBEE mitigating strategies require certain room environments to be maintained such that they can support the occupancy of personnel and the functionality of the equipment located therein, which is required to support the strategies associated with compliance with NRC Order EA-12-049. Three thermal-hydraulic analyses of vital areas during an extended loss of AC power using the GOTHIC computer code are presented: 1) a safety-related pump and instrument room transient analysis; 2) a Control Room transient analysis; and 3) an Auxiliary/Control Building transient analysis. GOTHIC (Generation of Thermal-Hydraulic Information for Containments) is a general-purpose thermal-hydraulics software package for the analysis of nuclear power plant containments, confinement buildings, and system components; it represents a plant as a network of volumes connected by flow paths and heat sinks.
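
    For illustration only, a lumped-parameter heat-up of a sealed room after loss of ventilation (this is not GOTHIC; the heat load and wall data are invented):

```python
def room_heatup(q_equip_w=5_000.0, air_mass_kg=360.0, cp=1_005.0,
                ua_walls=150.0, t_wall_c=30.0, t0_c=30.0,
                hours=72.0, dt_s=60.0):
    """Air temperature history: equipment heat load minus conduction to walls."""
    t_air, history = t0_c, []
    for step in range(int(hours * 3600 / dt_s)):
        q_net = q_equip_w - ua_walls * (t_air - t_wall_c)   # W
        t_air += q_net * dt_s / (air_mass_kg * cp)          # explicit Euler step
        if step % int(3600 / dt_s) == 0:
            history.append((step * dt_s / 3600, t_air))
    return history

for hour, temp in room_heatup()[:6]:
    print(f"t = {hour:4.1f} h   T_air = {temp:5.1f} C")   # approaches ~63 C asymptote
```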

  3. Comparison of LIFE-4 and TEMECH code predictions with TREAT transient test data

    International Nuclear Information System (INIS)

    Gneiting, B.C.; Bard, F.E.; Hunter, C.W.

    1984-09-01

    Transient tests in the TREAT reactor were performed on FFTF reference-design mixed-oxide fuel pins, most of which had received prior steady-state irradiation in the EBR-II reactor. These transient test results provide a database for the calibration and verification of fuel performance codes and for the evaluation of processes that affect pin damage during transient events. This paper presents a comparison of the LIFE-4 and TEMECH fuel pin thermal/mechanical analysis codes with the results of 20 HEDL TREAT experiments, ten of which resulted in pin failure. Both the LIFE-4 and TEMECH codes provided an adequate representation of the thermal and mechanical data from the TREAT experiments. Also, a criterion for 50% probability of pin failure was developed for each code using an average cumulative damage fraction value calculated for the pins that failed. Both codes employ the two major cladding loading mechanisms demonstrated by the test results: differential thermal expansion and central cavity pressurization. However, a detailed evaluation of the code predictions shows that the two code systems weight the loading mechanisms differently to reach the same end points of the TREAT transient results.
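
    A sketch of a cumulative-damage-fraction failure criterion of the kind described; the time-to-rupture correlation and the transient below are purely illustrative:

```python
import math

def time_to_rupture_h(temp_k, stress_mpa):
    """Assumed rupture-life correlation, for illustration only."""
    return 1e12 * math.exp(-temp_k / 55.0) / stress_mpa

def cumulative_damage(history, dt_h=1e-4, threshold=1.0):
    """history: (temp_K, stress_MPa) samples; damage accrues as dt / t_rupture."""
    cdf = 0.0
    for i, (temp_k, stress) in enumerate(history):
        cdf += dt_h / time_to_rupture_h(temp_k, stress)
        if cdf >= threshold:
            return cdf, i * dt_h          # predicted failure time (h)
    return cdf, None                      # pin survives the transient

history = [(1250.0, 120.0)] * 30_000      # 3 h hold at temperature, dt = 1e-4 h
print(cumulative_damage(history))         # damage reaches 1.0 partway through
```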

  4. A restructuring proposal based on MELCOR for severe accident analysis code development

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sun Hee; Song, Y. M.; Kim, D. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    In order to develop a template based on the existing MELCOR code, the data saving and transferring methods currently used in MELCOR are addressed first. A naming convention for the restructured modules is then suggested, and an automatic program to convert old variables into new derived-type variables has been developed. Finally, a restructured module for the SPR package has been developed for application to MELCOR. The current MELCOR code reserves fixed-size storage for four different data types and manages variable-sized data within the storage limit by stacking the data of the packages, using pointers to identify the variables belonging to each package. This technique makes the meaning of the variables difficult to grasp and wastes memory. The new features of FORTRAN90, however, make it possible to allocate storage dynamically and to use user-defined data types, which led to the development of a restructured module for the SPR package. The developed module allows efficient memory handling and an easier understanding of the code. The template has been validated by comparing the results of the modified code with those from the existing code, and it is confirmed that the results are the same. The template for the SPR package suggested in this report points toward the extension of the template to the entire code. It is expected that the template will accelerate the code domestication thanks to a direct understanding of each variable and an easy implementation of modified or newly developed models. 3 refs., 15 figs., 16 tabs. (Author)
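
    The restructuring idea, sketched in Python for brevity (MELCOR itself is FORTRAN; all names are invented): flat pointer-addressed storage versus a self-describing derived type:

```python
from dataclasses import dataclass, field

# Legacy style: one flat storage array, meaning encoded in offsets.
storage = [0.0] * 1000
PTR_TEMP, N_CELLS = 40, 10            # "pointer" bookkeeping, easy to get wrong
storage[PTR_TEMP:PTR_TEMP + N_CELLS] = [300.0] * N_CELLS

# Restructured style: named, dynamically allocated, per-package data.
@dataclass
class SprPackageData:                 # analogue of a FORTRAN90 derived type
    n_cells: int
    temperature_k: list = field(default_factory=list)

spr = SprPackageData(n_cells=10, temperature_k=[300.0] * 10)
print(spr.temperature_k[3])           # intent is readable without offset tables
```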

  5. Lessons from the restructuring of the Danish planning system and its impact on the Greater Copenhagen Region

    DEFF Research Database (Denmark)

    Galland, Daniel

    2013-01-01

    This paper explores the rise and decay of regional planning policies and institutions in the Greater Copenhagen Region (GCR) since the postwar era. The paper develops an understanding based on spatial selectivity and spatial rescaling as regards the fluctuating planning context in the GCR through…

  6. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, was used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating the processing of features related to a previously formed event-based intention, and provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  7. Development of CAD-Based Geometry Processing Module for a Monte Carlo Particle Transport Analysis Code

    International Nuclear Information System (INIS)

    Choi, Sung Hoon; Kwark, Min Su; Shim, Hyung Jin

    2012-01-01

    The Monte Carlo (MC) particle transport analysis of a complex system such as a research reactor, accelerator, or fusion facility may require accurate modeling of complicated geometry. Manual modeling via the text interface of an MC code to define the geometrical objects is tedious, lengthy and error-prone. This problem can be overcome by taking advantage of the modeling capability of a computer-aided design (CAD) system. There have been two kinds of approaches to developing MC code systems that utilize CAD data: external format conversion and CAD-kernel-embedded MC simulation. The first approach includes several interfacing programs such as McCAD, MCAM and GEOMIT, which were developed to automatically convert CAD data into MCNP geometry input data. This approach makes the most of existing MC codes without any modifications, but implies latent data inconsistency due to the difference between the geometry modeling systems. In the second approach, an MC code uses the CAD data either for direct particle tracking or for conversion to an internal data structure of constructive solid geometry (CSG) and/or boundary representation (B-rep) modeling, with the help of a CAD kernel. MCNP-BRL and OiNC have demonstrated their capabilities for CAD-based MC simulation. Recently we have developed a CAD-based geometry processing module for MC particle simulation by using the OpenCASCADE (OCC) library. In the developed module, CAD data can be used for particle tracking through primitive CAD surfaces (hereafter, CAD-based tracking) or for internal conversion to the CSG data structure. In this paper, the performances of the text-based model, the CAD-based tracking, and the internal CSG conversion are compared by using an in-house MC code, McSIM, equipped with the developed CAD-based geometry processing module.
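
    A minimal sketch of the tracking primitive behind the CAD-based tracking approach: the distance from a particle to its next surface crossing, shown here for a single sphere (a real module delegates such geometry queries to the OCC kernel):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit direction to a sphere surface, or None if missed."""
    ox, oy, oz = (o - c for o, c in zip(origin, center))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2        # nearer crossing first
    if t > 1e-12:
        return t
    t = (-b + math.sqrt(disc)) / 2        # particle may start inside the surface
    return t if t > 1e-12 else None

# Track a particle from the origin along +x to the boundary of a unit sphere.
print(ray_sphere((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # 1.0
```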

  8. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report, the event-building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/s. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled, they are fed into the Level-3 trigger, which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as the operating system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ethernet links.
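
    A toy event builder following the description (16 sources as in the abstract; buffering, the ATM switch and error handling are abstracted away):

```python
from collections import defaultdict

N_SOURCES = 16
pending = defaultdict(dict)          # event_id -> {source_id: fragment}

def on_fragment(event_id, source_id, fragment, deliver):
    """Collect one fragment; release the event once all sources reported."""
    pending[event_id][source_id] = fragment
    if len(pending[event_id]) == N_SOURCES:        # event is complete
        deliver(event_id, pending.pop(event_id))

def level3_filter(event_id, fragments):
    size_kb = sum(len(f) for f in fragments.values()) / 1024
    print(f"event {event_id}: {size_kb:.0f} kB assembled, running filter")

for src in range(N_SOURCES):                        # simulate one event's fragments
    on_fragment(1, src, b"x" * 12 * 1024, level3_filter)
```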

  9. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    This paper discusses the application of adaptive modulation and adaptive-rate turbo coding to orthogonal frequency-division multiplexing (OFDM) to increase throughput on time- and frequency-selective channels. The adaptive turbo code scheme is based on a subband adaptive method and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed, together with two turbo code rates. The performances of both systems with high and low BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.
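
    A sketch of per-subband adaptive modulation; the SNR thresholds are invented for the example, whereas a real design would derive them from the BER target and the turbo code rate in use:

```python
# (name, bits/symbol, assumed required SNR in dB) for the paper's scheme set
SCHEMES = [("64QAM", 6, 24.0), ("16QAM", 4, 18.0), ("8AMPM", 3, 14.0),
           ("QPSK", 2, 9.0), ("BPSK", 1, 6.0)]

def allocate(subband_snr_db):
    """Pick the densest constellation each subband's estimated SNR supports."""
    plan = []
    for snr in subband_snr_db:
        for name, bits, need in SCHEMES:
            if snr >= need:
                plan.append((name, bits))
                break
        else:
            plan.append(("off", 0))       # subband too poor: transmit nothing
    return plan

plan = allocate([25.1, 17.2, 8.4, 3.0])
print(plan, "->", sum(b for _, b in plan), "bits per OFDM symbol across subbands")
```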

  10. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory

    Science.gov (United States)

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.

    2017-01-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  11. An in-depth study of sparse codes on abnormality detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2016-01-01

    Sparse representation has been applied successfully to abnormal event detection, in which the baseline is to learn a dictionary accompanied by sparse codes. While much emphasis is put on discriminative dictionary construction, there have been no comparative studies of sparse codes regarding abnormality detection. In this work, comparisons are carried out from various angles to better understand the applicability of sparse codes, including computation time, reconstruction error, sparsity, detection accuracy, and their performance in combination with various detection methods. The experimental results show that combining OMP codes with maximum coordinate…
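
    A sketch of how sparse codes yield an abnormality score: encode a sample with Orthogonal Matching Pursuit over a dictionary (random here; learned from normal events in practice) and score by reconstruction error:

```python
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy OMP over dictionary D (unit-norm columns); returns code and error."""
    residual, support = x.copy(), []
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(D.T @ residual))))   # best-matching atom
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef                      # re-fit on support
    code = np.zeros(D.shape[1])
    code[support] = coef
    return code, np.linalg.norm(residual)

rng = np.random.default_rng(1)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
x = D[:, 3] * 2.0 - D[:, 17]               # a "normal" sample lies in the span
code, err = omp(D, x, n_nonzero=2)
print(err)                                 # near 0 => normal; large => abnormal
```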

  12. A joint multi-view plus depth image coding scheme based on 3D-warping

    DEFF Research Database (Denmark)

    Zamarin, Marco; Zanuttigh, Pietro; Milani, Simone

    2011-01-01

    Free viewpoint video applications and autostereoscopic displays require the transmission of multiple views of a scene together with depth maps. Current compression and transmission solutions just handle these two data streams as separate entities. However, depth maps contain key information on the scene structure that can be effectively exploited to improve the performance of multi-view coding schemes. In this paper we introduce a novel coding architecture that replaces the inter-view motion prediction operation with a 3D warping approach based on depth information to improve the coding performance.
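
    A minimal sketch of the warping step that replaces inter-view motion search, assuming a pinhole model with invented intrinsics and a rectified 5 cm baseline:

```python
import numpy as np

K = np.array([[525.0, 0.0, 320.0],        # assumed shared camera intrinsics
              [0.0, 525.0, 240.0],
              [0.0, 0.0, 1.0]])
baseline = np.array([0.05, 0.0, 0.0])     # 5 cm horizontal camera offset

def warp_pixel(u, v, depth_m):
    """Project a reference-view pixel into the target view using its depth."""
    p_ref = np.linalg.inv(K) @ np.array([u, v, 1.0]) * depth_m   # back-project
    p_tgt = p_ref - baseline                                     # change of camera
    q = K @ p_tgt
    return q[0] / q[2], q[1] / q[2]                              # re-project

print(warp_pixel(320.0, 240.0, 2.0))   # nearer pixels shift more than far ones
```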

  13. An Efficient Code-Based Threshold Ring Signature Scheme with a Leader-Participant Model

    Directory of Open Access Journals (Sweden)

    Guomin Zhou

    2017-01-01

    Digital signature schemes with additional properties have broad applications, such as protecting the identity of signers by allowing a signer to sign a message anonymously within a group of signers (also known as a ring). Most such schemes rely on number-theoretic hardness assumptions; while these number-theoretic problems are still secure at the time of this research, the situation could change with advances in quantum computing. There is thus a pressing need to design PKC schemes that are secure against quantum attacks. In this paper, we propose a novel code-based threshold ring signature scheme with a leader-participant model. A leader is appointed who chooses some shared parameters for the other signers to participate in the signing process. This leader-participant model enhances performance because every participant, including the leader, can execute the decoding algorithm (as a part of the signing process) upon receiving the shared parameters from the leader. The time complexity of our scheme is close to that of Courtois et al.'s (2001) scheme, which is often used as a basis for constructing other types of code-based signature schemes. Moreover, as a threshold ring signature scheme, our scheme is as efficient as a normal code-based ring signature.

  14. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high-efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1], given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for flexible storage of acquisition data to enable sophisticated post-processing of the information. Analogously to what is done in heavy-ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code, yielding positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research, such as the heavy-ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems able to translate well into other fields of research.
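
    A sketch of the MLE-plus-metaheuristic idea on a single-isotope decay curve, using a shrinking random search; the model and all numbers are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
edges = np.linspace(0.0, 600.0, 61)                     # 10 s time bins
true_a, true_lam = 500.0, np.log(2) / 120.0             # rate at t=0, T1/2 = 120 s
expected = true_a / true_lam * (np.exp(-true_lam * edges[:-1])
                                - np.exp(-true_lam * edges[1:]))
counts = rng.poisson(expected)                          # synthetic gamma counts

def log_lik(a, lam):
    mu = a / lam * (np.exp(-lam * edges[:-1]) - np.exp(-lam * edges[1:]))
    return np.sum(counts * np.log(mu) - mu)             # Poisson ML, factorial dropped

best, best_ll = (true_a * 2, true_lam * 3), log_lik(true_a * 2, true_lam * 3)
for temp in np.geomspace(1.0, 1e-3, 4000):              # shrinking random search
    a = best[0] * (1 + temp * rng.normal())
    lam = best[1] * (1 + temp * rng.normal())
    if a <= 0 or lam <= 0:
        continue
    ll = log_lik(a, lam)
    if ll > best_ll:
        best, best_ll = (a, lam), ll

print(f"fitted half-life = {np.log(2) / best[1]:.1f} s (true 120 s)")
```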

  15. Efficient depth intraprediction method for H.264/AVC-based three-dimensional video coding

    Science.gov (United States)

    Oh, Kwan-Jung; Oh, Byung Tae

    2015-04-01

    We present an intra-coding method that is applicable to depth map coding in multiview-plus-depth systems. Our approach combines skip prediction and plane-segmentation-based prediction. The proposed depth intra-skip prediction uses the direction estimated at both the encoder and decoder, and does not need to encode residual data. Our plane-segmentation-based intra-prediction divides the current block into two regions and applies a different prediction scheme to each segmented region. This method avoids incorrect estimation across different regions, resulting in higher prediction accuracy. Simulation results demonstrate that the proposed scheme is superior to H.264/AVC intra-prediction and improves the subjective rendering quality.

  16. The structural analysis of protein sequences based on the quasi-amino acids code

    International Nuclear Information System (INIS)

    Ping, Zhu; Xu-Qing, Tang; Zhen-Yuan, Xu

    2009-01-01

    Proteomics is the study of proteins and their interactions in a cell. With the successful completion of the Human Genome Project comes the post-genome era, in which proteomics technology is emerging. This paper studies the protein molecule from the algebraic point of view. The algebraic system (Σ, +, *) is introduced, where Σ is the set of 64 codons. According to the characteristics of (Σ, +, *), a novel quasi-amino-acid code classification method is introduced, and the corresponding algebraic operation table over the set ZU of the 16 kinds of quasi-amino acids is established, revealing the internal relations among the quasi-amino acids. The results show that there exist very close correlations between the properties of the quasi-amino acids and the codons. All these correlations may have played an important part in establishing the logical relationship between codons and quasi-amino acids during the origination of life. According to Ma F et al (2003 J. Anhui Agricultural University 30 439), the corresponding relation and the excellent properties of the amino acid code are very difficult to observe. The present paper shows that (ZU, ⊕, ) is a field. Furthermore, the operational results display that the codon tga has a property different from the other stop codons. In fact, in the mitochondrial genetic code of human and ox, tga codes for tryptophan and is not a stop codon as in the standard genetic code, consistent with Chen W C et al (2002 Acta Biophysica Sinica 18(1) 87). The present theory avoids some inexplicable features of the 20-amino-acid code; in other words, it addresses the problem that 'the 64 codon assignments of mRNA to amino acids is probably completely wrong' proposed by Yang (2006 Progress in Modern Biomedicine 6 3). (cross-disciplinary physics and related areas of science and technology)

  17. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Syndrome coding using linear codes is a technique that allows improvement of the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code, while offering easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented, using a random linear [8, 2] code as the basis for the modification. An implementation of the proposed algorithm, along with a practical evaluation of the algorithm's parameters based on test images, was made. Keywords: steganography, random linear codes, RLC, LSB
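
    A sketch of syndrome coding with an [8, 2] code: 6 message bits ride on 8 cover LSBs, and the embedder flips a minimum-weight pattern so the stego LSBs carry the desired syndrome. For reproducibility the parity-check matrix below is fixed in systematic form, whereas the paper draws it at random:

```python
import itertools
import numpy as np

# Parity-check matrix of an [8, 2] binary code in systematic form [I6 | A].
H = np.hstack([np.eye(6, dtype=int),
               np.array([[1, 0], [1, 1], [0, 1], [1, 0], [0, 1], [1, 1]])])

def embed(cover_lsbs, message):
    """Return stego LSBs y with H @ y = message (mod 2), flipping few bits."""
    s = (message - H @ cover_lsbs) % 2          # syndrome still to be realized
    for weight in range(9):                     # try flip patterns, lightest first
        for idx in itertools.combinations(range(8), weight):
            e = np.zeros(8, dtype=int)
            e[list(idx)] = 1
            if np.array_equal(H @ e % 2, s):
                return (cover_lsbs + e) % 2

def extract(stego_lsbs):
    return H @ stego_lsbs % 2                   # receiver only needs H

rng = np.random.default_rng(3)
cover = rng.integers(0, 2, size=8)
msg = rng.integers(0, 2, size=6)
stego = embed(cover, msg)
print(int((stego != cover).sum()), "LSB flips;", np.array_equal(extract(stego), msg))
```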

  18. Analysis of the Power oscillations event in Laguna Verde Nuclear Power Plant. Preliminary Report

    International Nuclear Information System (INIS)

    Gonzalez M, V.M.; Amador G, R.; Castillo, R.; Hernandez, J.L.

    1995-01-01

    The event that occurred at Unit 1 of the Laguna Verde Nuclear Power Plant on January 24, 1995, is analyzed using the RAMONA-3B code. During this event, Unit 1 experienced power oscillations while operating just before the transfer of the recirculation pumps to high speed. The phenomenon was detected in time by the reactor operator, who shut the reactor down with a manual scram. The oscillations reached a maximum amplitude of 10.5% of nominal power peak to peak, with a frequency of 0.5 Hz. Preliminary evaluations show that the event did not endanger fuel integrity. The results of simulating the reactor core with the RAMONA-3B code show that the code is capable of modeling reactor oscillations. Nevertheless, a more detailed simulation of the event will be necessary to prove that the code can predict the onset of the oscillations. An additional analysis is also needed to identify the factors that influence reactor stability, in order to issue recommendations that prevent the recurrence of this kind of event. (Author)
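
    A sketch of a decay-ratio estimate, the usual stability metric for such BWR oscillation events, applied here to a synthetic 0.5 Hz signal rather than plant data:

```python
import numpy as np

def decay_ratio(signal):
    """Ratio of consecutive local-maximum amplitudes (DR < 1 means stable)."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    ratios = [signal[peaks[k + 1]] / signal[peaks[k]] for k in range(len(peaks) - 1)]
    return float(np.mean(ratios))

t = np.arange(0.0, 30.0, 0.05)                       # s; 0.5 Hz as in the event
signal = np.exp(-0.02 * t) * np.cos(2 * np.pi * 0.5 * t)
print(f"decay ratio ~ {decay_ratio(signal):.2f}")    # exp(-0.02 * 2) ~ 0.96
```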

  19. Password Authentication Based on Fractal Coding Scheme

    Directory of Open Access Journals (Sweden)

    Nadia M. G. Al-Saidi

    2012-01-01

    Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding to provide an authorized user secure access to the registration and login process. In the proposed scheme, a hashed password string is generated and encrypted to be captured together with the user identity using text-to-image mechanisms. The advantage of fractal image coding is that the compressed image data can be sent securely through a non-secured communication channel to the server. The verification of the client information against the database system is achieved at the server to authenticate the legal user. The encrypted hashed password in the decoded fractal image is recognized using optical character recognition. The authentication process is performed after a successful verification of the client identity, by comparing the decrypted hashed password with the one stored in the database system. The system is analyzed and discussed from the attacker's viewpoint. A security comparison is performed to show that the proposed scheme provides the essential security requirements, while its efficiency makes it easy to apply alone or in hybrid with other security methods. Computer simulations and statistical analysis are presented.

  20. Surface acoustic wave coding for orthogonal frequency coded devices

    Science.gov (United States)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal, with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices, each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.